2022 Volume 40 Issue 5 Pages 233-240
In classification problems, Cross-Entropy Loss can be used to separate features in the feature space. Contrastive Learning, on the other hand, can obtain useful representations by learning features so that features of the same class are close together and features of different classes are far apart. In this paper, we focus on Supervised Contrastive Learning (SCL), which uses label information to embed features more appropriately within the framework of supervised learning, and apply it to the task of classifying opacities in chest CT images. We found that classification accuracy improved by 8-18% across the four validation patterns, covering adaptation to each of two different domains (Hospital 1 cases and Hospital 2 cases) as well as adaptation across domains. Furthermore, by visualizing the learned features with t-SNE, we confirmed that SCL forms clearer class clusters than the method using Cross-Entropy Loss.
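As a rough illustration of the idea behind SCL (not the authors' implementation; function name, batch layout, and the temperature value are assumptions), the supervised contrastive loss treats all same-class samples in a batch as positives for each anchor and all other samples as negatives, following the standard formulation by Khosla et al.:

```python
import numpy as np

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Sketch of the supervised contrastive loss.

    features: (N, D) embedding vectors (L2-normalized inside)
    labels:   (N,) integer class labels
    """
    # L2-normalize so the dot product is a cosine similarity
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / temperature
    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    # Exclude each sample's similarity to itself from the softmax
    sim = np.where(self_mask, -np.inf, sim)
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # Positives: same label, excluding the anchor itself
    pos = (labels[:, None] == labels[None, :]) & ~self_mask
    # Average log-probability over positives for each anchor
    per_anchor = -np.where(pos, log_prob, 0.0).sum(axis=1) / np.maximum(pos.sum(axis=1), 1)
    return per_anchor.mean()
```

Minimizing this loss pulls same-class embeddings together and pushes different-class embeddings apart, which is consistent with the tighter class clusters observed in the t-SNE visualization.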