Proceedings of the 35th Annual Conference of JSAI (2021)
Online ISSN: 2758-7347
Session ID: 4G2-GS-2k-03

Refined Consistency for Semi-Supervised Learning with Knowledge Distillation
*Yoshitaka MURAMOTO, Naoki OKAMOTO, Tubasa HIRAKAWA, Takayoshi YAMASHITA, Hironobu FUJIYOSHI

Abstract

Semi-supervised learning is a method that uses both labeled and unlabeled data to train a model. Dual Student (DS), which transfers knowledge between two networks, and Multiple Student (MS), which extends DS to four or more networks, have been proposed as semi-supervised learning methods. MS achieves higher accuracy than DS, but its training is inefficient because knowledge is not transferred between all networks at once. In this paper, we propose refined consistency, which transfers knowledge between all networks simultaneously, to improve accuracy through efficient knowledge transfer. Experiments on the CIFAR-100 dataset show that the proposed method achieves higher accuracy than MS.
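The abstract does not specify the exact loss used by refined consistency, but the core idea of exchanging knowledge among all student networks in a single update can be illustrated with a minimal sketch. The sketch below assumes PyTorch; the function name `all_pairs_consistency`, the pairwise MSE consistency loss, the four-student setup, and the additive-noise perturbations are all illustrative assumptions, not the authors' formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def all_pairs_consistency(logits_list):
    """Mean squared difference between the softmax outputs of every
    pair of student networks, so knowledge is exchanged among all
    networks in one update (illustrative sketch, not the paper's loss)."""
    probs = [F.softmax(logits, dim=1) for logits in logits_list]
    loss = 0.0
    n = len(probs)
    for i in range(n):
        for j in range(i + 1, n):
            loss = loss + F.mse_loss(probs[i], probs[j])
    return loss / (n * (n - 1) / 2)  # average over all pairs

# Hypothetical usage: four students, as in Multiple Student (MS),
# each seeing its own perturbed view of the same unlabeled batch.
students = [nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 100))
            for _ in range(4)]
x = torch.randn(8, 3, 32, 32)                              # unlabeled CIFAR-sized batch
views = [x + 0.1 * torch.randn_like(x) for _ in students]  # per-student noise
logits = [net(v) for net, v in zip(students, views)]
loss = all_pairs_consistency(logits)
loss.backward()
```

Under these assumptions, a single backward pass propagates the consistency signal to every network at once, in contrast to MS-style training where knowledge transfer happens between subsets of networks at a time.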

© 2021 The Japanese Society for Artificial Intelligence