2003, Vol. 123, No. 5, pp. 999-1003
Backpropagation, one of the most popular learning algorithms for multi-layered feedforward neural networks, suffers from slow convergence. Several modifications have been proposed to accelerate the learning process using different techniques. In this paper, a new cost function, expressed as the exponential of the sum-squared error or of the log-likelihood, is proposed. With this modification, the weight update varies the learning-rate parameter dynamically during training, as opposed to the constant learning-rate parameter used in standard Backpropagation. Simulation results on different problems demonstrate a significant improvement in the learning speed of the Backpropagation algorithm.
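A minimal sketch of the idea behind the exponential cost: if the cost is the exponential of the sum-squared error, its gradient equals the ordinary sum-squared-error gradient multiplied by an error-dependent factor, which behaves like a dynamically varying learning rate. The exact functional form below, including the scale constant `lam`, is an assumption for illustration; the abstract does not specify it.

```python
import numpy as np

def sse(targets, outputs):
    """Ordinary sum-squared error."""
    return np.sum((targets - outputs) ** 2)

def exp_sse_gradient(targets, outputs, lam=0.5):
    """Gradient of the hypothetical cost C = exp(lam * SSE)
    with respect to the network outputs.

    dC/do = lam * exp(lam * SSE) * d(SSE)/do
          = lam * exp(lam * SSE) * (-2 * (t - o))

    The scalar factor lam * exp(lam * SSE) multiplies the usual
    SSE gradient; it grows when the error is large and shrinks
    toward lam as the error approaches zero, acting like a
    dynamically adjusted learning rate during training.
    """
    e = sse(targets, outputs)
    scale = lam * np.exp(lam * e)
    grad = scale * (-2.0 * (targets - outputs))
    return grad, scale

# Example: two-output network, one training pattern.
t = np.array([1.0, 0.0])
o = np.array([0.8, 0.3])
grad, scale = exp_sse_gradient(t, o)
```

Backpropagating this gradient through the network layers proceeds exactly as in standard Backpropagation; only the output-layer error term is rescaled.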