Transactions of the Society of Instrument and Control Engineers (計測自動制御学会論文集)
Online ISSN : 1883-8189
Print ISSN : 0453-4654
ISSN-L : 0453-4654
Reducing the Number of Hidden-Layer Neurons in Multilayer Neural Networks by the Orthogonal Least-Squares Method
楊 子江

1997, Vol. 33, No. 3, pp. 216-223

Abstract
This paper proposes a new and computationally efficient approach to reducing the hidden-layer size of multilayer neural networks. Attention is focused on minimizing hidden-layer redundancy using the orthogonal least-squares (OLS) method with the aid of the Gram-Schmidt orthogonal transformation. A neural network with a large hidden layer is first trained with a standard training rule. The OLS method is then applied to identify and eliminate redundant neurons, yielding a simpler network. Specifically, the OLS method is employed as a forward regression procedure that selects a suitable subset from the large set of preliminarily trained hidden neurons, so that the input to the output layer is reconstructed with fewer hidden neurons. At each regression step, the neuron that maximizes the increment to the energy of the weighted sum of the hidden-neuron outputs (the input to the output layer) is selected, and the weights of the links between the selected hidden neurons and the output layer are determined automatically by the regression procedure. Neurons that contribute only trivially to the input of the output layer can therefore be eliminated without much distortion of the network output. Simulation results are included to demonstrate the efficiency of the proposed method.
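The forward regression step described in the abstract can be sketched in NumPy as follows. This is a minimal illustration, not the paper's implementation: the function name `ols_select`, the toy data, and the use of plain (non-modified) Gram-Schmidt are assumptions. Given the matrix of trained hidden-neuron outputs and the target signal (the input to the output layer), each step orthogonalizes the remaining candidate columns against the already-selected ones, picks the column with the largest error-reduction ratio, and finally recovers the output-layer weights by back-substitution.

```python
import numpy as np

def ols_select(Phi, d, n_select):
    """Forward OLS selection of hidden neurons (illustrative sketch).

    Phi      : (N, M) outputs of M trained hidden neurons over N samples
    d        : (N,) target signal, i.e. the original input to the output layer
    n_select : number of hidden neurons to keep
    Returns the indices of the selected neurons and their output-layer weights.
    """
    N, M = Phi.shape
    selected, Q, alphas = [], [], []
    for _ in range(n_select):
        best = (-1.0, None, None, None)            # (err, index, q, GS coefficients)
        for k in range(M):
            if k in selected:
                continue
            q = Phi[:, k].astype(float).copy()
            a = []
            for qj in Q:                           # Gram-Schmidt against chosen columns
                c = (qj @ Phi[:, k]) / (qj @ qj)
                a.append(c)
                q = q - c * qj
            # Error-reduction ratio: fraction of the energy of d explained by q
            err = (q @ d) ** 2 / ((q @ q) * (d @ d))
            if err > best[0]:
                best = (err, k, q, a)
        _, k, q, a = best
        selected.append(k)
        Q.append(q)
        alphas.append(a)
    # Weights in the orthogonal basis, then back-substitute (A w = g) to get
    # the weights of the selected neurons in the original basis.
    g = np.array([(q @ d) / (q @ q) for q in Q])
    n = len(selected)
    A = np.eye(n)
    for i in range(n):
        for j in range(i):
            A[j, i] = alphas[i][j]
    w = np.linalg.solve(A, g)
    return selected, w
```

In this greedy scheme, neurons whose orthogonalized contribution to the energy of `d` is negligible are simply never selected, which is how the redundant hidden neurons drop out without distorting the network output much.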
© The Society of Instrument and Control Engineers