Nonlinear Theory and Its Applications, IEICE
Online ISSN : 2185-4106
ISSN-L : 2185-4106
Special Section on Nonlinear Circuits and Systems
A novel quasi-Newton-based optimization for neural network training incorporating Nesterov's accelerated gradient
Hiroshi Ninomiya

2017 Volume 8 Issue 4 Pages 289-301

Abstract
This paper describes a novel quasi-Newton (QN) based accelerated technique for training neural networks. Recently, Nesterov's accelerated gradient method has been used to accelerate gradient-based training. In this paper, the QN training algorithm is accelerated through a quadratic approximation of the error function that incorporates the momentum term, as in Nesterov's method. It is shown that the proposed algorithm has a convergence property similar to that of the QN method. Neural network training on function approximation and microwave circuit modeling problems is presented to demonstrate the proposed algorithm. The proposed method drastically improves the convergence speed of the conventional QN algorithm.
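The paper's full algorithm is not reproduced on this page; as a rough sketch of the idea summarized in the abstract, the Python code below combines a Nesterov-style look-ahead/momentum step with a BFGS inverse-Hessian update, evaluating gradients at the look-ahead point. The function name naq_train, the hyperparameters mu (momentum) and eta (step size), and the exact placement of the momentum term are illustrative assumptions, not the paper's notation; no line search or other safeguards are included.

import numpy as np

def naq_train(grad, w, n_iters=100, mu=0.9, eta=0.1):
    # Sketch of a Nesterov-accelerated quasi-Newton update (assumed form,
    # not the paper's exact algorithm). grad(w) returns the gradient of
    # the training error; mu and eta are assumed hyperparameters.
    n = w.size
    H = np.eye(n)                  # BFGS inverse-Hessian approximation
    v = np.zeros(n)                # momentum (velocity) term
    z = w + mu * v                 # Nesterov look-ahead point
    g = grad(z)                    # gradient at the look-ahead point
    for _ in range(n_iters):
        v = mu * v - eta * (H @ g)     # momentum plus QN search direction
        w = w + v                      # parameter update
        z_new = w + mu * v             # next look-ahead point
        g_new = grad(z_new)
        s = z_new - z                  # secant pair measured at the
        y = g_new - g                  # look-ahead points
        sy = s @ y
        if sy > 1e-10:                 # curvature condition; else skip update
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        z, g = z_new, g_new
    return w

# Example: minimize a small quadratic as a stand-in training error.
A = np.diag([1.0, 4.0])
w_min = naq_train(lambda w: A @ w, np.array([5.0, 5.0]))

On a quadratic like the example, the secant pair satisfies y = A s, so the BFGS update drives H toward the inverse Hessian while the momentum term accelerates the remaining iterations, which is the qualitative behavior the abstract claims for the proposed method.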
© 2017 The Institute of Electronics, Information and Communication Engineers