Proceedings of the Annual Conference of JSAI, 36th (2022)
Online ISSN: 2758-7347
Session ID: 1D1-GS-2-02

Scheduling of Damping in Natural Gradient Method
*Hiroki NAGANUMA, Gaku FUJIMORI, Mari TAKEUCHI, Jumpei NAGASE
Abstract

In recent years, second-order optimization, which enjoys a fast convergence rate, has been applied to deep learning owing to fast approximation methods for the natural gradient method. Second-order optimization requires inverting the information matrix, which is generally degenerate (singular) in deep learning problems. As a heuristic, a damping method therefore adds the identity matrix multiplied by a constant before the inversion. This study proposes a method for scheduling the damping, motivated by the Levenberg-Marquardt rule for determining the damping parameter, and investigates its effectiveness.
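As a rough illustration of the idea described in the abstract, the Python sketch below combines a damped natural-gradient step with a Levenberg-Marquardt-style gain-ratio schedule for the damping constant. This is not the authors' implementation: the function names, the growth/shrink factors (2.0, 0.5), and the gain-ratio thresholds (0.25, 0.75) are illustrative assumptions.

    import numpy as np

    def damped_ngd_step(grad, fisher, lam):
        """Damped natural-gradient direction:
        delta = -(F + lam * I)^{-1} g."""
        n = fisher.shape[0]
        return -np.linalg.solve(fisher + lam * np.eye(n), grad)

    def lm_schedule(lam, rho, grow=2.0, shrink=0.5, low=0.25, high=0.75):
        """Levenberg-Marquardt-style damping update based on the gain
        ratio rho = (actual loss decrease) / (model-predicted decrease).
        Poor agreement -> trust the curvature less, raise the damping;
        good agreement -> lower the damping toward an undamped step.
        All factors and thresholds here are illustrative assumptions."""
        if rho < low:
            return lam * grow
        if rho > high:
            return lam * shrink
        return lam

    def train_step(params, loss_fn, grad, fisher, lam):
        """One optimization step with LM-scheduled damping (sketch)."""
        delta = damped_ngd_step(grad, fisher, lam)
        # Decrease predicted by the local quadratic model; for the
        # damped step this is 0.5 * delta^T (lam * delta - grad),
        # the standard Levenberg-Marquardt identity.
        predicted = 0.5 * delta @ (lam * delta - grad)
        actual = loss_fn(params) - loss_fn(params + delta)
        rho = actual / max(predicted, 1e-12)
        lam = lm_schedule(lam, rho)
        if rho > 0:  # accept the step only if the loss actually decreased
            params = params + delta
        return params, lam

The accept/reject logic mirrors classical Levenberg-Marquardt; in deep learning, approximate second-order methods such as K-FAC typically adapt the damping only every few iterations to amortize the extra loss evaluations.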

© 2022 The Japanese Society for Artificial Intelligence