JSAI2023

Presentation information

General Session

General Session » GS-2 Machine learning

[3D5-GS-2] Machine learning

Thu. Jun 8, 2023 3:30 PM - 5:10 PM Room D (A1)

Chair: Daiki Miyagawa (NEC) [Online]

4:50 PM - 5:10 PM

[3D5-GS-2-05] Contrastive Distillation Learning for Neural Topic Models

〇Kohei Watanabe¹, Koji Eguchi¹ (1. Hiroshima University)

Keywords: Topic modeling, Knowledge distillation, Contrastive learning, Deep learning

Topic modeling is a technique for text data analysis that aims to estimate the latent topics underlying the data. Knowledge distillation has attracted attention in deep learning as a means of transferring knowledge from a large teacher model to a small student model. Contrastive learning has also gained attention in self-supervised representation learning, and its effectiveness has been widely reported. Against this background, this study focuses on transferring structural knowledge from a teacher model to a student model via knowledge distillation within a contrastive learning framework for training a neural topic model. We demonstrate through experiments that the proposed method improves topic coherence compared to previous neural topic models by using a contrastive loss to learn the latent representations of the student model while preserving the topic relationships in each document representation produced by the teacher model.
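The abstract describes the objective only at a high level. As a rough illustration, the sketch below shows one common way such a contrastive distillation term can be written in PyTorch: an InfoNCE-style loss in which each document's student latent representation forms a positive pair with the same document's teacher representation, and the other documents in the batch serve as negatives. The function name, tensor shapes, temperature, and the in-batch negative scheme are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F


def contrastive_distillation_loss(student_z: torch.Tensor,
                                   teacher_z: torch.Tensor,
                                   temperature: float = 0.5) -> torch.Tensor:
    """InfoNCE-style contrastive distillation (illustrative sketch).

    student_z, teacher_z: (batch, latent_dim) document representations from
    the student and teacher topic models for the same batch of documents.
    """
    # Normalize so the dot product is cosine similarity.
    s = F.normalize(student_z, dim=-1)
    t = F.normalize(teacher_z, dim=-1)
    # (batch, batch) similarity matrix between student and teacher representations.
    logits = s @ t.T / temperature
    # Diagonal entries are the positive pairs (same document in both models).
    targets = torch.arange(s.size(0), device=s.device)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    # Toy usage with random "document-topic" representations (illustrative only).
    student = torch.randn(8, 50)   # e.g. 8 documents, 50-dimensional latent space
    teacher = torch.randn(8, 50)
    print(contrastive_distillation_loss(student, teacher).item())
```

Minimizing this loss pulls each student representation toward the teacher's representation of the same document while pushing it away from other documents in the batch, which is one way a student topic model could retain the teacher's document-level topic relationships.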
