2022 Volume 13 Issue 2 Pages 271-276
Quasi-Newton (QN) methods have been shown to be effective in training neural networks. However, the computation and storage of the approximated Hessian remain problematic in large-scale applications. The memory-less QN (MLQN) method was introduced to avoid storing this matrix. This paper demonstrates the effectiveness of a momentum term for accelerating the MLQN method through computer simulations on function approximation and classification problems.
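As an illustration of the general idea, the following is a minimal sketch of one memory-less quasi-Newton step with a momentum term. It assumes the search direction is obtained from the BFGS update applied to the identity matrix using only the most recent curvature pair (s, y), so no matrix is ever stored, and that the momentum update takes the common form v_{k+1} = mu * v_k + eta * d_k with w_{k+1} = w_k + v_{k+1}. The function name, default coefficients, and safeguard threshold are hypothetical and not taken from the paper.

```python
import numpy as np

def mlqn_momentum_step(grad, grad_prev, step_prev, v_prev, eta=0.01, mu=0.9):
    """Sketch of a memory-less BFGS direction combined with momentum.

    grad, grad_prev : current and previous gradients (flattened vectors)
    step_prev       : previous parameter update s_k = w_k - w_{k-1}
    v_prev          : previous momentum vector
    eta, mu         : step size and momentum coefficient (illustrative values)
    Returns the update v so that w_{k+1} = w_k + v.
    """
    s = step_prev
    y = grad - grad_prev
    sy = s @ y
    if sy <= 1e-10:
        # Curvature condition violated: fall back to steepest descent.
        d = -grad
    else:
        # Memory-less BFGS: apply the BFGS formula with H = I, using only
        # the vector pair (s, y); the approximate Hessian is never formed.
        rho = 1.0 / sy
        Hg = (grad
              - rho * (s @ grad) * y
              - rho * (y @ grad) * s
              + (rho**2 * (y @ y) + rho) * (s @ grad) * s)
        d = -Hg
    # Momentum-accelerated update direction.
    return mu * v_prev + eta * d
```

In this sketch only two vectors per iteration (s and y) are needed to build the search direction, which is what makes the method "memory-less"; the momentum term then reuses the previous update to accelerate convergence.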