Transactions of the Institute of Systems, Control and Information Engineers
Online ISSN : 2185-811X
Print ISSN : 1342-5668
ISSN-L : 1342-5668
The Back Propagation Method Using the Least Mean-Square Method for the Output Recurrent Neural Network
Shigenobu YAMAWAKI, Masashi FUJINO, Syozo IMAO

1999 Volume 12 Issue 4 Pages 225-233

Abstract
The back propagation method, based on the gradient method, is often used as the learning rule for neural networks. This paper proposes a back propagation method that uses the least mean-square method for an output recurrent neural network. The approach consists of determining the input vector of each layer and estimating the parameters of each layer. The input vector of the output layer is corrected so as to decrease the output error, according to the learning rate and the learned values of the other layers. The parameters are then calculated by the least-squares method from the obtained input and output of each layer.
Identification results for a linear oscillation system demonstrate the effectiveness of the proposed algorithm, which is not based on the gradient method. The proposed algorithm is shown to yield better estimates than the classical back propagation method.
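Below is a minimal, hypothetical sketch of the idea summarized in the abstract: rather than updating weights by gradient descent, each layer's parameters are re-estimated by least squares from that layer's input and a corrected desired output, with the output layer's input corrected in proportion to the output error and a learning rate. The network structure (one hidden layer with the output fed back as an extra input), the specific correction rule, and all names below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Hypothetical sketch of layer-wise least-squares learning for an
# output-recurrent network; not the authors' exact formulation.

rng = np.random.default_rng(0)

def lms_bp_step(W_hid, W_out, U, Y_prev, T, lr=0.5):
    """One batch update.  U: external inputs (N, du); Y_prev: fed-back
    previous outputs (N, dy); T: desired outputs (N, dy)."""
    X = np.hstack([U, Y_prev])            # output-recurrent input vector
    H = np.tanh(X @ W_hid.T)              # hidden-layer outputs
    Y = H @ W_out.T                       # linear output layer

    # Correct the output layer's input so the output error decreases,
    # scaled by the learning rate (a stand-in for the paper's rule).
    E = T - Y
    H_corr = H + lr * E @ W_out

    # Least-squares estimate of the output-layer weights from (H_corr, T).
    W_out_new = np.linalg.lstsq(H_corr, T, rcond=None)[0].T

    # Desired hidden pre-activations, then least-squares estimate of the
    # hidden-layer weights from (X, desired pre-activations).
    V = np.arctanh(np.clip(H_corr, -0.999, 0.999))
    W_hid_new = np.linalg.lstsq(X, V, rcond=None)[0].T
    return W_hid_new, W_out_new, float(np.mean(E ** 2))

# Toy usage on a surrogate relation y[k] = 0.9*y[k-1] + 0.3*u[k]
# (standing in for an oscillation system to be identified).
du, dh, dy, N = 1, 8, 1, 200
U = rng.standard_normal((N, du))
Y_prev = rng.standard_normal((N, dy))
T = 0.9 * Y_prev + 0.3 * U
W_hid = 0.1 * rng.standard_normal((dh, du + dy))
W_out = 0.1 * rng.standard_normal((dy, dh))
for _ in range(20):
    W_hid, W_out, mse = lms_bp_step(W_hid, W_out, U, Y_prev, T)
print("final mean-squared error:", mse)
```

In this sketch each weight matrix is obtained in closed form from its layer's input/output pair, which is the distinction the abstract draws from the classical gradient-based back propagation update.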
© The Institute of Systems, Control and Information Engineers