2:00 PM - 2:20 PM
[2B4-GS-6-03] An Approach to Building a General-Purpose Language Model for Understanding Temporal Common Sense
Keywords: Temporal Common Sense, General-Purpose Language Model, Masked Language Modeling, Multi-Step Fine-Tuning
The ability to capture common-sense temporal relationships among time-related events expressed in text is central to natural language understanding. However, pre-trained language models such as BERT, despite their recent success across a wide range of natural language processing tasks, are still considered to perform poorly at temporal reasoning. In this paper, we focus on developing language models for temporal common sense inference. Our model combines multi-step fine-tuning on multiple corpora with masked language modeling that predicts masked temporal indicators, which are crucial for temporal common sense reasoning. Our experimental results show a significant improvement in accuracy over standard fine-tuning on temporal common sense inference.
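As an illustration only (the abstract includes no code), the sketch below shows how masked language modeling over temporal indicators might look with a BERT model from the HuggingFace transformers library. The indicator set, example sentence, and model checkpoint are assumptions, not the paper's actual setup, and the multi-step fine-tuning over multiple corpora is reduced here to a single gradient step.

import torch
from transformers import BertForMaskedLM, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical set of temporal indicators; the paper's actual indicator
# list is not given in the abstract.
TEMPORAL_INDICATORS = {"before", "after", "during", "minutes", "hours", "days", "years"}

def mask_temporal_indicators(sentence):
    # Replace temporal-indicator tokens with [MASK]. Labels keep the original
    # ids at masked positions and -100 (ignored by the MLM loss) elsewhere,
    # so the loss is computed only over the temporal indicators.
    enc = tokenizer(sentence, return_tensors="pt")
    input_ids = enc["input_ids"].clone()
    labels = torch.full_like(input_ids, -100)
    for i, tok in enumerate(tokenizer.convert_ids_to_tokens(input_ids[0].tolist())):
        if tok in TEMPORAL_INDICATORS:
            labels[0, i] = input_ids[0, i]
            input_ids[0, i] = tokenizer.mask_token_id
    return {"input_ids": input_ids, "attention_mask": enc["attention_mask"]}, labels

inputs, labels = mask_temporal_indicators("The meeting lasted two hours and ended before noon.")
loss = model(**inputs, labels=labels).loss  # MLM loss at masked indicators only
loss.backward()  # an optimizer step would follow; repeating this per corpus
                 # in sequence would approximate multi-step fine-tuning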