Abstract
In this paper, we propose a novel method of model reduction with a time delay for single-input, single-output continuous-time systems, based on a separable least-squares (LS) approach. The reduced-order model is determined by minimizing the integral of the squared magnitude of the transfer-function error. The denominator parameters and the time delay of the reduced-order model are encoded as the positions of the food sources of the employed bees and are searched for by the artificial bee colony (ABC) algorithm, while the numerator parameters are estimated by the linear LS method for each candidate set of denominator parameters and time delay. The best parameters and time delay of the reduced-order model are obtained through the search performed by the employed, onlooker, and scout bees. Simulation results show that the accuracy of the proposed method is superior to that of a genetic algorithm (GA)-based model reduction algorithm.
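To illustrate the separable structure described above, the following Python sketch shows how the inner LS step of one candidate food source (fixed denominator and delay) might be evaluated. This is only an illustrative sketch, not the paper's implementation: the function and variable names (fitness_of_food_source, den, tau, G_data, w) are hypothetical, the error integral is approximated here by a sum over a uniform frequency grid, and the surrounding ABC iteration over employed, onlooker, and scout bees is omitted.

    import numpy as np

    def fitness_of_food_source(den, tau, G_data, w):
        """Separable LS evaluation of one candidate (denominator, delay) pair.

        den    : reduced-model denominator coefficients [1, a1, ..., an]
        tau    : candidate time delay
        G_data : frequency response of the full-order system on the grid w
        w      : uniform frequency grid approximating the error integral
        Returns (cost, num), where num are the LS-optimal numerator coefficients.
        """
        s = 1j * w
        D = np.polyval(den, s)          # denominator evaluated on the grid
        delay = np.exp(-s * tau)        # e^{-s*tau} term of the reduced model
        m = len(den) - 2                # numerator degree (strictly proper model assumed)
        # Regressor columns: s^k * e^{-s*tau} / D(s), k = m, ..., 0, so that
        # Phi @ num is the reduced-model frequency response for numerator num.
        Phi = np.column_stack([s**k * delay / D for k in range(m, -1, -1)])
        # Complex LS problem solved by stacking real and imaginary parts.
        A = np.vstack([Phi.real, Phi.imag])
        b = np.concatenate([G_data.real, G_data.imag])
        num, *_ = np.linalg.lstsq(A, b, rcond=None)
        err = G_data - Phi @ num
        dw = w[1] - w[0]                # uniform grid spacing
        cost = float(np.sum(np.abs(err) ** 2) * dw)   # quadrature of the squared error
        return cost, num

    # Hypothetical usage: fit a 2nd-order-plus-delay model to a 4th-order response.
    w = np.linspace(0.01, 10.0, 400)
    G_full = np.polyval([1.0, 3.0], 1j * w) / np.polyval([1.0, 4.0, 8.0, 6.0, 2.0], 1j * w)
    cost, num = fitness_of_food_source([1.0, 1.2, 0.8], 0.3, G_full, w)

In the method summarized by the abstract, a cost of this kind would be evaluated for every food source, so that the ABC search only has to explore the denominator parameters and the time delay, while the numerator parameters are always obtained in closed form.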