Nonlinear Theory and Its Applications, IEICE
Online ISSN: 2185-4106
ISSN-L: 2185-4106
Special Section on Nonlinear Circuits and Systems
A novel quasi-Newton-based optimization for neural network training incorporating Nesterov's accelerated gradient
Hiroshi Ninomiya

2017, Volume 8, Issue 4, pp. 289-301

Abstract
This paper describes a novel quasi-Newton (QN) based acceleration technique for training neural networks. Recently, Nesterov's accelerated gradient method has been utilized to accelerate gradient-based training. In this paper, the QN training algorithm is accelerated by forming the quadratic approximation of the error function at a point shifted by the momentum term, as in Nesterov's method. It is shown that the proposed algorithm has a convergence property similar to that of the QN method. Neural network training experiments on function approximation and microwave circuit modeling problems demonstrate the proposed algorithm, which drastically improves the convergence speed of the conventional QN algorithm.
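The idea described in the abstract can be sketched as follows: a BFGS-style quasi-Newton step whose gradient and secant information are taken at the Nesterov lookahead point w + μv rather than at w itself. This is a minimal sketch only; the function name, the fixed step size standing in for a line search, the momentum coefficient, and the toy quadratic objective are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def naq_step_train(grad, w0, mu=0.5, lr=0.1, iters=200):
    """Sketch: BFGS-style quasi-Newton update evaluated at the
    Nesterov lookahead point w + mu*v (parameter values are illustrative)."""
    n = w0.size
    H = np.eye(n)                    # inverse-Hessian approximation
    w, v = w0.astype(float), np.zeros(n)
    for _ in range(iters):
        look = w + mu * v            # lookahead point from the momentum term
        g = grad(look)               # gradient at the lookahead, not at w
        d = -H @ g                   # quasi-Newton search direction
        v = mu * v + lr * d          # momentum update; lr stands in for a line search
        w_new = w + v
        s = w_new - look             # step measured from the lookahead point
        y = grad(w_new) - g          # corresponding gradient change
        sy = s @ y
        if sy > 1e-12:               # curvature condition guard before BFGS update
            rho = 1.0 / sy
            I = np.eye(n)
            H = ((I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s))
                 + rho * np.outer(s, s))
        w = w_new
    return w

# Toy usage: minimize E(w) = 0.5 * w^T A w, a stand-in for a training error.
A = np.diag([1.0, 10.0])
w_min = naq_step_train(lambda w: A @ w, np.array([5.0, 5.0]))
print(w_min)  # approaches the minimizer [0, 0]
```

Evaluating both the gradient and the secant pair (s, y) at the lookahead point is what distinguishes this from simply attaching a momentum term to BFGS; with mu = 0 the sketch reduces to a standard quasi-Newton iteration.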
© 2017 The Institute of Electronics, Information and Communication Engineers