Proceedings of the Annual Conference of JSAI
Online ISSN: 2758-7347
34th Annual Conference (2020)
Session ID: 1E3-GS-9-02

Construction of Domain Specific DistilBERT Model by Using Fine-Tuning
*Hiroyuki SHINNOU, Jing BAI, Rui CAO, Wen MA
Abstract

In this paper, we point out that BERT is domain dependent, and we propose constructing a domain-specific pre-trained model by fine-tuning. Specifically, the parameters of a DistilBERT model are initialized from a trained BERT model and then tuned on a corpus of the target domain. As a result, a domain-specific DistilBERT model can be constructed efficiently. In the experiments, we build a test set for each domain in which the task is to predict a masked word in a sentence. Using these test sets, we evaluate the domain-specific DistilBERT model against a general BERT model and show the superiority of the proposed model.
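The abstract initializes a DistilBERT model from a trained BERT model before domain tuning. Below is a minimal sketch of that initialization step, assuming the HuggingFace Transformers implementations of both models; the checkpoint name "bert-base-uncased" and the output path are illustrative placeholders, and copying every other BERT layer follows the original DistilBERT work rather than anything stated in this abstract.

```python
from transformers import BertForMaskedLM, DistilBertConfig, DistilBertForMaskedLM

# Trained general-domain BERT (placeholder checkpoint name).
bert = BertForMaskedLM.from_pretrained("bert-base-uncased")

# 6-layer DistilBERT with dimensions matching bert-base.
distil = DistilBertForMaskedLM(
    DistilBertConfig(n_layers=6, dim=768, hidden_dim=3072, n_heads=12)
)

src, dst = bert.state_dict(), distil.state_dict()

# Embeddings (DistilBERT has no token-type embeddings, so those are skipped).
for name in ("word_embeddings.weight", "position_embeddings.weight",
             "LayerNorm.weight", "LayerNorm.bias"):
    dst[f"distilbert.embeddings.{name}"] = src[f"bert.embeddings.{name}"]

# Copy every other BERT layer (0, 2, ..., 10) into DistilBERT's 6 layers.
pairs = [("attention.q_lin", "attention.self.query"),
         ("attention.k_lin", "attention.self.key"),
         ("attention.v_lin", "attention.self.value"),
         ("attention.out_lin", "attention.output.dense"),
         ("sa_layer_norm", "attention.output.LayerNorm"),
         ("ffn.lin1", "intermediate.dense"),
         ("ffn.lin2", "output.dense"),
         ("output_layer_norm", "output.LayerNorm")]
for i, j in enumerate(range(0, 12, 2)):
    for d_name, b_name in pairs:
        for p in ("weight", "bias"):
            dst[f"distilbert.transformer.layer.{i}.{d_name}.{p}"] = \
                src[f"bert.encoder.layer.{j}.{b_name}.{p}"]

# Masked-LM head.
for d_name, b_name in [("vocab_transform", "cls.predictions.transform.dense"),
                       ("vocab_layer_norm", "cls.predictions.transform.LayerNorm")]:
    for p in ("weight", "bias"):
        dst[f"{d_name}.{p}"] = src[f"{b_name}.{p}"]
dst["vocab_projector.weight"] = src["cls.predictions.decoder.weight"]
dst["vocab_projector.bias"] = src["cls.predictions.bias"]

distil.load_state_dict(dst)
distil.save_pretrained("distilbert-init-from-bert")
```

Because the copied weights already encode general language knowledge, the subsequent domain tuning only has to adapt them, which is what makes the construction efficient.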
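The initialized model is then tuned on the domain corpus with the masked-language-modeling objective. A sketch under the same assumptions, where "domain_corpus.txt" and the hyperparameters are hypothetical, using the standard HuggingFace MLM training setup rather than the authors' exact script:

```python
from datasets import load_dataset
from transformers import (BertTokenizerFast, DistilBertForMaskedLM,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Reuse BERT's tokenizer, since the copied embeddings keep BERT's vocabulary.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = DistilBertForMaskedLM.from_pretrained("distilbert-init-from-bert")

# Tokenize the domain corpus (hypothetical file, one sentence per line).
corpus = load_dataset("text", data_files={"train": "domain_corpus.txt"})
corpus = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

# Randomly mask 15% of the tokens and train on the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="domain-distilbert",
                           num_train_epochs=3,
                           per_device_train_batch_size=16),
    data_collator=collator,
    train_dataset=corpus["train"])
trainer.train()
trainer.save_model("domain-distilbert")
```

The per-domain evaluation described in the abstract, predicting a masked word in held-out domain sentences, can reuse the same masking setup to compare this model against a general BERT model.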

© 2020 The Japanese Society for Artificial Intelligence