1:40 PM - 2:00 PM
[1E3-GS-9-02] Construction of Domain Specific DistilBERT Model by Using Fine-Tuning
Keywords: BERT, Fine-Tuning, Domain Dependency
In this paper, we point out that BERT is domain-dependent and propose constructing a domain-specific pre-trained model by fine-tuning. Specifically, the parameters of a DistilBERT model are initialized from a trained BERT model and then tuned on a domain-specific corpus, so that a domain-specific DistilBERT model can be constructed efficiently. In the experiments, we build a test set for each domain, consisting of sentences in which a masked word must be estimated. Using these test sets, we evaluate the domain-specific DistilBERT model against the general BERT model and show the superiority of the proposed model.
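As a minimal sketch of the initialization step described above (an assumption about the procedure, not the authors' exact code): since DistilBERT has half as many Transformer layers as BERT, one common scheme is to copy every other teacher layer into the student, keeping shared weights (e.g. embeddings) as-is, and then continue masked-LM training on the domain corpus. The state-dict layout and parameter names below are hypothetical, chosen only to illustrate the layer-mapping idea.

```python
def init_student_from_teacher(teacher_state, n_teacher_layers=12, n_student_layers=6):
    """Initialize a shallow student model from a deeper teacher.

    Hypothetical state-dict format: layer parameters are named
    "layer.<idx>.<param>"; everything else (embeddings, etc.) is shared.
    Teacher layer i*step maps to student layer i.
    """
    step = n_teacher_layers // n_student_layers
    student_state = {}
    for name, weights in teacher_state.items():
        if name.startswith("layer."):
            _, idx, rest = name.split(".", 2)
            idx = int(idx)
            if idx % step == 0:
                # keep every `step`-th teacher layer, renumbered for the student
                student_state[f"layer.{idx // step}.{rest}"] = weights
        else:
            # non-layer weights (embeddings, pooler, ...) are copied directly
            student_state[name] = weights
    return student_state

# Toy 12-layer "teacher" state dict; real weights would be tensors.
teacher = {f"layer.{i}.attn.weight": [float(i)] for i in range(12)}
teacher["embeddings.word.weight"] = [0.5]

student = init_student_from_teacher(teacher)
# student now has 6 layers (from teacher layers 0, 2, 4, 6, 8, 10)
# plus the shared embedding weights; it would then be fine-tuned with
# the masked-LM objective on the domain-specific corpus.
```

After this initialization, the student is trained only on the domain corpus, which is what makes constructing the domain-specific model efficient compared with pre-training from scratch.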