Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
38th (2024)
Session ID : 2L5-OS-19a-04

Robustness comparison of different learning rates of optimization algorithms in CNN models
Yuto YOKOYAMA*, Kozo OKANO, Shinpei OGATA, Shin NAKAJIMA
Abstract

SGD and Adam are optimization algorithms commonly used for training DNN models. Although Adam is favored over SGD in many applications, its robustness has not been thoroughly studied. In particular, different learning rates may yield different robustness even when generalization performance is almost identical. In this paper, we investigate the robustness of each optimization algorithm using indicators based on the active neurons within the model. We train models with SGD and Adam at four learning rates, apply noise to the test inputs, and compare the models using three metrics. The results of our proposed method show that SGD exhibits lower robustness than Adam. Additionally, models with lower active-neuron rates exhibit lower robustness. These findings have the potential to establish benchmarks for robustness and to aid the development of future optimization algorithms.
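The abstract's evaluation procedure (an active-neuron indicator plus accuracy under input noise) can be illustrated with a minimal sketch. The paper's actual three metrics and noise model are not specified here, so the following PyTorch code is only an assumed, simplified version: `active_neuron_rate` counts the fraction of positive ReLU outputs, and `accuracy_under_noise` adds Gaussian noise with an assumed standard deviation `sigma` to the test inputs.

```python
# Minimal sketch (not the authors' code): an illustrative active-neuron rate
# and a noise-robustness probe for a CNN, assuming a PyTorch model with ReLU layers.
import torch
import torch.nn as nn


def active_neuron_rate(model, inputs):
    """Fraction of ReLU outputs that are positive ("active") for a batch."""
    counts = {"active": 0, "total": 0}

    def hook(_module, _inp, out):
        counts["active"] += (out > 0).sum().item()
        counts["total"] += out.numel()

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.ReLU)]
    with torch.no_grad():
        model(inputs)
    for h in handles:
        h.remove()
    return counts["active"] / max(counts["total"], 1)


def accuracy_under_noise(model, inputs, labels, sigma=0.1):
    """Accuracy after adding Gaussian noise to test inputs (one robustness probe)."""
    with torch.no_grad():
        noisy = inputs + sigma * torch.randn_like(inputs)
        preds = model(noisy).argmax(dim=1)
    return (preds == labels).float().mean().item()
```

Under this reading, models trained with SGD and with Adam at each of the four learning rates would be passed through both functions on the same test batches, and the resulting rates and noisy-input accuracies compared.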

© 2024 The Japanese Society for Artificial Intelligence