Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
36th (2022)
Session ID : 2B4-GS-6-03

An Approach to Building a General-Purpose Language Model for Understanding Temporal Common Sense
*Mayuko KIMURA, Lis Kanashiro PEREIRA, Masayuki ASAHARA, Fei CHENG, Ayako OCHI, Ichiro KOBAYASHI
Abstract

Capturing common-sense temporal relationships among time-related events expressed in text is an important ability for natural language understanding. However, pre-trained language models such as BERT, which have recently achieved great success across a wide range of natural language processing tasks, are still considered to perform poorly at temporal reasoning. In this paper, we focus on developing language models for temporal common-sense inference. Our model relies on multi-step fine-tuning over multiple corpora and on masked language modeling that predicts masked temporal indicators, which are crucial for temporal common-sense reasoning. Our experimental results show a significant improvement in accuracy over standard fine-tuning on temporal common-sense inference.
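The masked-indicator objective described above can be sketched as a preprocessing step that turns ordinary sentences into masked-LM training examples. The sketch below is illustrative only: the indicator lexicon and the `mask_temporal_indicators` helper are hypothetical, since the paper does not publish its actual indicator list or tokenization details.

```python
import re

# Hypothetical lexicon of temporal indicators; the paper's actual list is not given.
TEMPORAL_INDICATORS = {
    "before", "after", "during", "while", "until",
    "seconds", "minutes", "hours", "days", "weeks", "months", "years",
}

def mask_temporal_indicators(sentence, mask_token="[MASK]"):
    """Replace each temporal-indicator token with the mask token, producing a
    masked-LM example that forces the model to predict time-related words
    from their surrounding context."""
    # Simple word/punctuation tokenization for illustration.
    tokens = re.findall(r"\w+|[^\w\s]", sentence)
    masked = [mask_token if t.lower() in TEMPORAL_INDICATORS else t
              for t in tokens]
    return " ".join(masked)

print(mask_temporal_indicators("He boiled the eggs for ten minutes before serving."))
# -> He boiled the eggs for ten [MASK] [MASK] serving .
```

In an actual fine-tuning pipeline, the masked positions (rather than random ones, as in standard BERT pre-training) would supply the prediction targets, concentrating the training signal on temporal vocabulary.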

© 2022 The Japanese Society for Artificial Intelligence