Host: The Japanese Society for Artificial Intelligence
Name : The 36th Annual Conference of the Japanese Society for Artificial Intelligence
Number : 36
Location : [in Japanese]
Date : June 14, 2022 - June 17, 2022
In recent years, second-order optimization, which enjoys a fast convergence rate, has been used in deep learning thanks to fast approximation methods for the natural gradient method. Second-order optimization requires inverting the information matrix, which is generally singular in deep learning problems. Therefore, as a heuristic, a damping method adds the identity matrix multiplied by a constant to the information matrix before inversion. This study proposes a method for scheduling the damping parameter, motivated by the Levenberg-Marquardt method's rule for adjusting damping, and investigates its effectiveness.
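The Levenberg-Marquardt rule referred to above can be illustrated with a minimal sketch: the damping parameter is shrunk when the local quadratic model predicts the observed loss decrease well, and grown otherwise. The toy problem, the threshold values (3/4 and 1/4), and the adjustment factor `omega` below are illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np

# Toy nonlinear least squares: fit y = w0 * exp(w1 * x).
# J^T J plays the role of the (approximate) information matrix; it can be
# ill-conditioned, which is why the damping term lam * I is added.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * x)

def residuals(w):
    return w[0] * np.exp(w[1] * x) - y

def jacobian(w):
    e = np.exp(w[1] * x)
    return np.stack([e, w[0] * x * e], axis=1)

def loss(w):
    r = residuals(w)
    return 0.5 * r @ r

w = np.array([1.0, 0.0])
lam, omega = 1.0, 0.5  # initial damping and adjustment factor (assumed values)

for _ in range(200):
    r, J = residuals(w), jacobian(w)
    g = J.T @ r                  # gradient
    F = J.T @ J                  # Gauss-Newton curvature (information matrix)
    # Damped second-order step: solve (F + lam * I) d = -g
    d = np.linalg.solve(F + lam * np.eye(2), -g)
    # Reduction ratio rho = actual decrease / decrease predicted by the model
    pred = -(g @ d + 0.5 * d @ F @ d)
    actual = loss(w) - loss(w + d)
    rho = actual / pred if pred > 0 else -1.0
    # Levenberg-Marquardt schedule: shrink damping when the quadratic model
    # is trustworthy (rho large), grow it when it is not (rho small)
    if rho > 0.75:
        lam *= omega
    elif rho < 0.25:
        lam /= omega
    if rho > 0:      # accept the step only if the loss actually decreased
        w = w + d

# w should approach the generating parameters [2.0, -1.5]
```

In second-order deep learning optimizers, the same ratio-based rule is applied with the (approximate) Fisher information in place of the Gauss-Newton matrix, rather than keeping the damping constant throughout training.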