Host: The Japanese Society for Artificial Intelligence
Name: The 38th Annual Conference of the Japanese Society for Artificial Intelligence
Number: 38
Location: [in Japanese]
Date: May 28, 2024 - May 31, 2024
Even in tasks where deep learning excels, such as image classification, when the noise strength in the training data is uneven, the conventional approach of minimizing the average loss produces heavily imbalanced performance, classifying data at some noise levels well and at others poorly. SharpDRO, which simultaneously accounts for the flatness of the objective function and for distribution shift, has been proposed as a method that preserves the generalization ability of deep learning while training on such diverse data, but its practicality is limited by its dependence on prior knowledge and its increased computational cost. In this work, "flooding" of the loss is applied to the DRO objective in place of the average loss, achieving the same effect as SharpDRO at the same computational cost as ordinary gradient descent.
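Flooding (Ishida et al., 2020) replaces the training loss L with |L - b| + b for a small flood level b, so that gradient descent flips to gradient ascent whenever the loss drops below b; this ascent phase is what connects flooding to flatness-seeking behavior. Below is a minimal sketch, assuming a PyTorch setup, of how flooding could be combined with a distributionally robust objective. The flood level b, the per-group batching, and the use of the worst-group maximum are illustrative assumptions, not the exact objective of this paper.

```python
import torch

def flooded_loss(loss: torch.Tensor, b: float) -> torch.Tensor:
    # Flooding: |L - b| + b. Below the flood level b the gradient of this
    # quantity is -dL/dtheta, i.e. the update becomes gradient ascent.
    return (loss - b).abs() + b

def dro_flooding_step(model, optimizer, group_batches, b=0.05):
    """One update on a flooded worst-group (DRO) objective.

    `group_batches` is a list of (inputs, targets) pairs, one per
    distribution / noise level; `b` is a hypothetical flood level.
    This is a sketch under assumed groupings, not the authors' method.
    """
    criterion = torch.nn.CrossEntropyLoss()
    group_losses = torch.stack(
        [criterion(model(x), y) for x, y in group_batches]
    )
    worst = group_losses.max()        # worst-case group loss (DRO)
    loss = flooded_loss(worst, b)     # flooding applied to the DRO objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Note that |L - b| + b needs only one forward and one backward pass and its gradient has the same magnitude as that of L, which is consistent with the claim that the per-step cost matches ordinary gradient descent.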