IEICE Transactions on Information and Systems
Online ISSN : 1745-1361
Print ISSN : 0916-8532
Regular Section
Learning from Noisy Complementary Labels with Robust Loss Functions
Hiroki ISHIGURO, Takashi ISHIDA, Masashi SUGIYAMA

2022, Volume E105.D, Issue 2, pp. 364-376

Abstract

It has been demonstrated that large-scale labeled datasets facilitate the success of machine learning. However, collecting labeled data is often very costly and error-prone in practice. To cope with this problem, previous studies have considered the use of a complementary label, which specifies a class that an instance does not belong to and can be collected more easily than ordinary labels. However, complementary labels can also be error-prone, so mitigating the influence of label noise is an important challenge for making complementary-label learning more useful in practice. In this paper, we derive conditions on the loss function under which the learning algorithm is not affected by noise in complementary labels. Experiments on benchmark datasets with noisy complementary labels demonstrate that loss functions satisfying our conditions significantly improve classification performance.
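To make the notion of a complementary label concrete, the following is a minimal sketch of one common way to learn from such labels (a "negative-learning" style loss that pushes down the predicted probability of the class the instance is known *not* to belong to). This is an illustrative example only, not the loss functions or conditions derived in the paper; the function names and the 3-class setup are hypothetical.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a logit vector.
    e = np.exp(z - z.max())
    return e / e.sum()

def complementary_loss(logits, comp_label):
    """Illustrative complementary-label loss: penalize probability
    mass on the class the instance does NOT belong to."""
    p = softmax(logits)
    return -np.log(1.0 - p[comp_label])

# Example: 3-class logits; the complementary label says "not class 2".
logits = np.array([2.0, 1.0, 0.1])
loss = complementary_loss(logits, comp_label=2)
```

Driving down the logit of the complementary class reduces this loss, which indirectly concentrates probability on the remaining candidate classes; the paper's contribution concerns which loss functions of this general kind remain robust when the complementary label itself is noisy.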

© 2022 The Institute of Electronics, Information and Communication Engineers