IEEJ Transactions on Electronics, Information and Systems
Online ISSN : 1348-8155
Print ISSN : 0385-4221
ISSN-L : 0385-4221
A New Learning Algorithm by Reducing the Oscillation of Weights for Feedforward Neural Network
Kick Out Algorithm
Keihiro Ochiai, Naohiro Toda, Shiro Usui

1993 Volume 113 Issue 12 Pages 1154-1162

Abstract

The backpropagation (BP) algorithm with only a gradient term converges very slowly because the weights oscillate in regions where the error surface forms a ravine. The momentum term was introduced to reduce this oscillation, but it does not work well: the gradient still contains a component across the ravine, which causes further oscillation. To overcome this problem, we focus on the region near the bottom of a ravine, where the steepest-descent direction coincides with the downward direction along the ravine. We describe a method that corrects the position of the weights near the bottom of the ravine and, combining it with Jacobs' algorithm, propose a new accelerated learning algorithm. The proposed algorithm quickly reduces the oscillation of weights and converges about 18 times faster than standard BP on a sine-function approximation problem.
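The ravine behaviour the abstract describes can be illustrated with a minimal sketch: gradient descent on an ill-conditioned quadratic, whose steep axis plays the role of the direction "across the ravine". The quadratic, step size, and momentum coefficient below are illustrative assumptions, not taken from the paper, and the Kick Out correction itself is not specified in the abstract, so only the baseline gradient and momentum updates are shown.

```python
import numpy as np

# A "ravine" modeled as an ill-conditioned quadratic
# f(w) = 0.5 * (100 * w0^2 + 1 * w1^2):
# the steep w0 axis drives the across-ravine oscillation,
# the shallow w1 axis is the slow direction along the ravine.
def grad(w):
    return np.array([100.0 * w[0], 1.0 * w[1]])

def descend(lr, momentum, steps=200):
    """Gradient descent with an optional momentum term (momentum=0 gives plain BP-style updates)."""
    w = np.array([1.0, 1.0])
    v = np.zeros(2)
    for _ in range(steps):
        v = momentum * v - lr * grad(w)  # momentum accumulates the update direction
        w = w + v
    return w

# Hypothetical hyperparameters chosen so both runs are stable.
w_plain = descend(lr=0.015, momentum=0.0)
w_mom = descend(lr=0.015, momentum=0.8)
```

With these settings the plain run crawls along the shallow axis while the momentum run makes much more progress toward the minimum at the origin, which is the effect the momentum term was introduced to achieve; the paper's contribution is the additional position correction applied near the ravine bottom.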

© The Institute of Electrical Engineers of Japan