1993 Volume 113 Issue 12 Pages 1154-1162
The backpropagation (BP) algorithm with only a gradient term converges very slowly, because the weights oscillate in regions where the error surface forms a ravine. A momentum term was introduced to reduce this oscillation, but it has not worked well: the gradient still contains a component across the ravine, which causes further oscillation. To overcome this problem, we focus on the region near the bottom of a ravine, where the steepest-descent direction coincides with the downward direction along the ravine. We describe a method that corrects the position of the weights toward the bottom of the ravine and, combining it with Jacobs' algorithm, propose a new accelerated learning algorithm. The proposed algorithm quickly suppresses the oscillation of the weights and converges about 18 times faster than standard BP on a sine-function approximation problem.
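The ravine behaviour the abstract describes can be illustrated with a minimal sketch (not the paper's method): plain gradient descent on a quadratic error surface whose curvature is much steeper across the ravine than along it, compared with the same descent plus a classical momentum term. The surface, step size, and momentum coefficient below are illustrative assumptions, not values from the paper.

```python
# Hypothetical ravine surface: f(w1, w2) = 0.5*(100*w1**2 + w2**2).
# w1 is the steep "across the ravine" direction, w2 the shallow
# "along the ravine" direction the abstract refers to.

def grad(w):
    return [100.0 * w[0], 1.0 * w[1]]

def train(lr=0.015, momentum=0.0, steps=200):
    w = [1.0, 1.0]      # start on the ravine wall
    v = [0.0, 0.0]      # previous update (momentum buffer)
    for _ in range(steps):
        g = grad(w)
        for i in range(2):
            v[i] = momentum * v[i] - lr * g[i]  # momentum term + gradient term
            w[i] += v[i]
    return w

plain = train(momentum=0.0)   # gradient term only: slow along w2
mom = train(momentum=0.9)     # momentum accelerates progress along the ravine
```

With these (assumed) settings, plain descent oscillates across the steep `w1` axis while crawling along `w2`; momentum makes larger effective steps along the ravine, which is exactly the slow-convergence gap the paper's corrected algorithm targets further.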
The Transactions of the Institute of Electrical Engineers of Japan, Series C