Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
38th (2024)
Session ID : 2D4-GS-2-01

Proposed learning method assuming distribution shift due to corrupted data
*Toma HAMADA, Matthew J HOLLAND
Abstract

Even in tasks where deep learning excels, such as image classification, if the strength of the noise in the training data is uneven, minimizing the average loss as conventional methods do produces classification performance that is strongly biased, good on some portions of the data and poor on others. SharpDRO, which jointly accounts for the flatness of the objective function and distribution shift, has been proposed as a method that preserves the generalization ability of deep learning while covering diverse data, but its practicality is limited by its reliance on prior knowledge and its increased computational cost. In this proposal, "flooding" of the loss is applied to the DRO objective in place of the average loss, achieving the same effect as SharpDRO at the same computational cost as ordinary gradient descent.
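The abstract does not specify how flooding is combined with the DRO objective. As a minimal illustrative sketch only, the following assumes the well-known flooding regularizer of Ishida et al. (2020), which replaces a loss L with |L - b| + b for a flood level b, applied to a worst-group (group-DRO-style) objective; the paper's actual formulation may differ.

```python
def flooding(loss: float, b: float) -> float:
    # Flooding regularizer (Ishida et al., 2020): keeps the training
    # loss near the flood level b by reflecting it about b, so the
    # gradient direction flips once the loss drops below b.
    return abs(loss - b) + b


def flooded_dro_objective(group_losses: list[float], b: float) -> float:
    # Hypothetical combination (not taken from the paper): apply
    # flooding to each group's loss, then take the worst-case group,
    # as in group-DRO. This costs no more than one ordinary forward/
    # backward pass per batch, matching plain gradient descent.
    return max(flooding(loss, b) for loss in group_losses)


# Example: with flood level b = 0.2, a group loss already below b
# is pushed back up, so the worst group can change.
per_group = [0.5, 0.1]
objective = flooded_dro_objective(per_group, b=0.2)
```

The point of the sketch is only that flooding is a pointwise transform of the loss, so adding it to a DRO objective does not change the asymptotic cost of each gradient step.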

© 2024 The Japanese Society for Artificial Intelligence