Host : The Japanese Society for Artificial Intelligence
Name : The 35th Annual Conference of the Japanese Society for Artificial Intelligence
Number : 35
Location : [in Japanese]
Date : June 08, 2021 - June 11, 2021
In this study, we tackle abstractive summarization of Japanese news articles using BERT, which has become widely used in natural language processing in recent years. Specifically, we use BertSum, a summarization method that extends BERT. We trained BertSum with three types of BERT, and the experiments showed that the Japanese pre-trained models performed better than the multilingual model. There was no significant difference in performance between the model pre-trained on Japanese news articles and the one pre-trained on Japanese Wikipedia. We also discuss tokenizers and unknown words, which are important issues when handling Japanese news articles.
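To illustrate the tokenizer and unknown-word issue raised above, the following is a minimal sketch (not the authors' exact setup) that compares how a Japanese pre-trained BERT tokenizer and the multilingual BERT tokenizer split a news-style sentence, and counts how many tokens fall back to [UNK]. The checkpoint names are assumptions chosen for illustration, since the paper does not specify which pre-trained models were used.

```python
# Sketch: compare tokenization of a Japanese news-style sentence.
# Assumed checkpoints (not necessarily those used in the paper):
#   - "cl-tohoku/bert-base-japanese": BERT pre-trained on Japanese Wikipedia
#   - "bert-base-multilingual-cased": multilingual BERT
from transformers import AutoTokenizer

text = "政府は新型コロナウイルス対策として緊急事態宣言を延長すると発表した。"

for name in ["cl-tohoku/bert-base-japanese", "bert-base-multilingual-cased"]:
    tokenizer = AutoTokenizer.from_pretrained(name)
    tokens = tokenizer.tokenize(text)
    n_unk = tokens.count(tokenizer.unk_token)  # tokens the vocabulary cannot cover
    print(f"{name}: {len(tokens)} tokens, {n_unk} unknown")
    print(tokens)
```

In general, a tokenizer whose vocabulary matches the target domain produces fewer subword fragments and fewer unknown tokens, which is one reason the choice of pre-trained model and tokenizer matters for Japanese news summarization.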