Proceedings of the 37th Annual Conference of JSAI (2023)
Online ISSN: 2758-7347
Session ID: 3D5-GS-2-05

Contrastive Distillation Learning for Neural Topic Models
*Kohei WATANABE, Koji EGUCHI
Abstract

Topic modeling is a technique for text data analysis that aims to estimate the latent topics underlying the data. In deep learning, knowledge distillation has attracted attention as a means of transferring knowledge from a large teacher model to a small student model. Contrastive learning has likewise gained attention in self-supervised representation learning, and its effectiveness has been widely reported. Against this background, this study focuses on transferring structural knowledge from a teacher model to a student model via knowledge distillation within a contrastive learning framework for training a neural topic model. We demonstrate experimentally that the proposed method improves topic coherence over previous neural topic models by using a contrastive loss to learn the student model's latent representations while preserving the topic relationships in each document representation produced by the teacher model.

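The following is a minimal sketch of the contrastive distillation idea described in the abstract, not the authors' implementation. It assumes a PyTorch setup in which a frozen teacher encoder and a smaller trainable student encoder each map a bag-of-words vector to a document-topic representation, and an InfoNCE-style contrastive loss treats the teacher's and student's representations of the same document as a positive pair and other documents in the batch as negatives. The encoder architecture, hyperparameters, and the `contrastive_distillation_loss` helper are illustrative assumptions; in practice this loss would be combined with the topic model's own reconstruction (ELBO) objective.

```python
# Hypothetical sketch of contrastive distillation for a neural topic model.
# Assumptions (not from the paper): encoder architecture, temperature value,
# and the way the teacher is frozen are all illustrative choices.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopicEncoder(nn.Module):
    """Encode a bag-of-words vector into a document-topic proportion vector."""

    def __init__(self, vocab_size: int, num_topics: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(vocab_size, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_topics),
        )

    def forward(self, bow: torch.Tensor) -> torch.Tensor:
        # Softmax yields a topic-proportion vector per document.
        return torch.softmax(self.net(bow), dim=-1)


def contrastive_distillation_loss(student_z: torch.Tensor,
                                  teacher_z: torch.Tensor,
                                  temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: the i-th student representation should be most
    similar to the i-th teacher representation within the batch."""
    s = F.normalize(student_z, dim=-1)
    t = F.normalize(teacher_z, dim=-1)
    logits = s @ t.T / temperature               # (batch, batch) similarity matrix
    targets = torch.arange(s.size(0), device=s.device)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    vocab_size, num_topics, batch = 2000, 50, 32
    teacher = TopicEncoder(vocab_size, num_topics, hidden=512)  # e.g. a larger, pretrained model
    student = TopicEncoder(vocab_size, num_topics, hidden=128)  # smaller model being trained
    teacher.eval()

    bow = torch.rand(batch, vocab_size)          # stand-in for a bag-of-words batch
    with torch.no_grad():
        teacher_z = teacher(bow)                 # teacher representations are kept fixed
    student_z = student(bow)

    loss = contrastive_distillation_loss(student_z, teacher_z)
    loss.backward()                              # in practice, add the topic model's ELBO term
    print(float(loss))
```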
© 2023 The Japanese Society for Artificial Intelligence