Abstract
This paper proposes a new and computationally efficient approach to hidden-layer size reduction for multilayer neural networks. Attention is focused on minimizing hidden-layer redundancy using the orthogonal least-squares (OLS) method, aided by the Gram-Schmidt orthogonal transformation. A neural network with a large hidden layer is first trained with a standard training rule. The OLS method is then applied to identify and eliminate redundant neurons, yielding a simpler network. Specifically, OLS is employed as a forward regression procedure that selects a suitable subset of neurons from the large set of preliminarily trained hidden neurons, so that the input to the output layer is reconstructed with fewer hidden neurons. At each regression step, the selected neuron maximizes the increment to the energy of the weighted sum of the hidden-neuron outputs (the input to the output layer), and the weights of the links between the selected hidden neurons and the output layer are determined automatically by the regression procedure. Neurons that contribute only trivially to the input of the output layer can therefore be eliminated without substantially distorting the network output. Simulation results demonstrate the efficiency of the proposed method.
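As a rough illustration of the selection step summarized above, the following Python sketch implements a generic forward OLS regression with classical Gram-Schmidt orthogonalization. It is not the paper's exact formulation: the names `H` (matrix of hidden-neuron outputs), `d` (the target signal, here the net input to the output layer produced by the full pre-trained hidden layer), and the stopping tolerance `tol` are assumptions introduced for illustration only.

```python
import numpy as np

def ols_prune(H, d, tol=1e-3):
    """Forward OLS selection of hidden neurons (illustrative sketch).

    H   : (N, M) hidden-neuron outputs over the training set (assumed)
    d   : (N,)   target signal, e.g. the input to the output layer of
                 the full pre-trained network (assumed)
    tol : stop once the selected subset explains (1 - tol) of the
          energy of d (assumed stopping rule)
    """
    N, M = H.shape
    selected = []
    Q = []                       # Gram-Schmidt orthogonal basis built so far
    d_energy = d @ d
    explained = 0.0
    for _ in range(M):
        best_err, best_i, best_q = 0.0, None, None
        for i in range(M):
            if i in selected:
                continue
            # Orthogonalize the candidate column against the current basis
            q = H[:, i].astype(float).copy()
            for qj in Q:
                q -= (qj @ q) / (qj @ qj) * qj
            qq = q @ q
            if qq < 1e-12:       # numerically redundant neuron; skip it
                continue
            # Error-reduction ratio: share of the energy of d captured by q
            err = (q @ d) ** 2 / (qq * d_energy)
            if err > best_err:
                best_err, best_i, best_q = err, i, q
        if best_i is None:
            break
        selected.append(best_i)
        Q.append(best_q)
        explained += best_err
        if 1.0 - explained < tol:
            break
    # Output-layer weights for the reduced network via least squares
    w, *_ = np.linalg.lstsq(H[:, selected], d, rcond=None)
    return selected, w
```

In this sketch the greedy choice at each step maximizes the error-reduction ratio, i.e. the increment to the energy of the reconstructed output-layer input, and the reduced output weights fall out of the final least-squares solve, mirroring the automatic weight determination described in the abstract.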