Abstract
A small sampling period is preferable in system identification. However, the least-squares (LS) estimate of an impulse response tends to have a large mean squared error (MSE) when the input-output signals are sampled at too fast a rate. In this paper, we decimate the input-output signals sampled at the fast rate to improve the condition number of the input correlation matrix, and then compute the LS estimate from the decimated data; this estimate is then interpolated back to the original fast sampling rate, so that the MSE of the resulting impulse response estimate can be derived. An MSE criterion is given in terms of the decimation rate, and it depends on the power spectral density of the input signal, the frequency response of the system, the noise variance, and the data length. It is thus shown that there exists an optimal decimation rate that minimizes the MSE. We also show that the optimal decimation rate can be obtained using only the input-output data.
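The following is a minimal sketch of the decimate, LS-estimate, and interpolate pipeline outlined above, not the paper's implementation. The decimation rate D, FIR model order n, white-noise input, the use of scipy.signal, and the 1/D amplitude rescaling after interpolation are all illustrative assumptions.

```python
# Sketch: decimate fast-rate data, solve LS at the slow rate, interpolate back.
# Assumed names (D, n, h_true) and scipy.signal usage are illustrative only.
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)

# Fast-rate input-output data from an assumed FIR system, observed with noise.
N, n, D = 4000, 32, 4                 # data length, FIR order, decimation rate
h_true = signal.windows.hann(n) * np.exp(-0.2 * np.arange(n))
u = rng.standard_normal(N)            # input (white here; its spectrum matters)
y = signal.lfilter(h_true, [1.0], u) + 0.05 * rng.standard_normal(N)

# Decimate input and output by D (scipy applies anti-alias filtering).
u_d = signal.decimate(u, D, ftype="fir")
y_d = signal.decimate(y, D, ftype="fir")

# LS estimate at the slow rate: build the Toeplitz regressor matrix and
# solve min ||y_d - Phi h||^2; decimation improves Phi'Phi's conditioning.
m = n // D                            # model order at the decimated rate
Phi = np.column_stack([np.concatenate([np.zeros(k), u_d[: len(u_d) - k]])
                       for k in range(m)])
h_slow, *_ = np.linalg.lstsq(Phi, y_d, rcond=None)

# Interpolate the slow-rate estimate back to the original fast rate.
# The 1/D factor is an approximate amplitude correction (samples of a
# continuous response scale with the sampling period).
h_fast = signal.resample(h_slow, m * D) / D

print("empirical MSE vs true response:",
      np.mean((h_fast - h_true[: m * D]) ** 2))
```

Sweeping D in such a simulation and plotting the empirical MSE would illustrate the trade-off the abstract describes: too little decimation leaves the correlation matrix ill-conditioned, while too much discards information, so the MSE is minimized at an intermediate rate.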