Abstract
The problem of identifying a single-input, single-output linear discrete-time system is considered for the case where the output data are corrupted by additive noise. Use is made of the fact that a consistent estimate can be obtained by compensating the asymptotic bias of the least-squares estimator. The algorithm proposed in this paper is useful when the variance of the noise is unknown and on-line computation is required. It is shown that the estimate generated by this algorithm is asymptotically equivalent to the one that approximately minimizes the sum of squared output errors; the latter minimization problem can be reduced to an eigenvalue problem. Results of a digital simulation are presented to illustrate the usefulness of the approach and to verify the validity of the theoretical discussion.
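As an informal illustration of the bias-compensation idea summarized above, the sketch below identifies a first-order system whose output is observed in additive white noise and compares plain least squares with a bias-compensated solution. It assumes the noise variance is known, whereas the paper's on-line algorithm works without that knowledge; the system order, parameter values, and variable names are illustrative only and are not taken from the paper.

```python
import numpy as np

# Minimal sketch (assumed setup): y(k) = -a*y(k-1) + b*u(k-1),
# measured as z(k) = y(k) + w(k) with white output noise w.
# The paper's algorithm estimates the compensation term on line;
# here sigma_w is assumed known purely for illustration.

rng = np.random.default_rng(0)
N = 20000
a_true, b_true = 0.8, 1.0
sigma_w = 0.5                        # std. dev. of additive output noise

u = rng.standard_normal(N)           # persistently exciting input
y = np.zeros(N)
for k in range(1, N):
    y[k] = -a_true * y[k-1] + b_true * u[k-1]
z = y + sigma_w * rng.standard_normal(N)   # noisy output measurements

# Regression: z(k) ~ phi(k)^T theta, phi(k) = [-z(k-1), u(k-1)], theta = [a, b]
Phi = np.column_stack([-z[:-1], u[:-1]])
Z = z[1:]

R = Phi.T @ Phi / (N - 1)            # sample normal matrix
r = Phi.T @ Z / (N - 1)

theta_ls = np.linalg.solve(R, r)     # ordinary LS: asymptotically biased

# The output noise inflates only the output-regressor block of R,
# so subtracting sigma_w^2 there removes the asymptotic bias.
D = np.diag([1.0, 0.0])
theta_bc = np.linalg.solve(R - sigma_w**2 * D, r)

print("true      :", [a_true, b_true])
print("plain LS  :", theta_ls)
print("bias-comp :", theta_bc)
```

With a long data record the plain least-squares estimate of `a` is noticeably shrunk toward zero while the compensated estimate is close to the true value, which is the effect the abstract refers to; handling an unknown noise variance and the connection to the eigenvalue formulation are developed in the body of the paper.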