Host: The Japanese Society for Artificial Intelligence
Name: The 33rd Annual Conference of the Japanese Society for Artificial Intelligence, 2019
Number: 33
Location: [in Japanese]
Date: June 04, 2019 - June 07, 2019
BERT, a new language representation model whose name stands for Bidirectional Encoder Representations from Transformers, obtains new state-of-the-art results on eleven English natural language processing tasks. We build a Japanese version of the BERT model from Japanese Wikipedia data and use it for sentiment analysis of the Japan Economic Watcher Survey data. We confirm that the sentiment analysis results obtained with the Japanese BERT model are better than those obtained without it.
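
As an illustration of the fine-tuning step described above, the following is a minimal sketch, not the authors' implementation: it adapts a pretrained Japanese BERT model for binary sentiment classification, assuming the Hugging Face transformers library, PyTorch, and the publicly available cl-tohoku/bert-base-japanese checkpoint (which needs the fugashi/ipadic tokenizer dependencies). The checkpoint name, labels, and example sentences are illustrative assumptions and are not the model or data built in the paper.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed public Japanese BERT checkpoint; the paper instead pretrains its own
# model from Japanese Wikipedia.
model_name = "cl-tohoku/bert-base-japanese"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Toy survey-style comments with hypothetical sentiment labels (1 = positive, 0 = negative):
# "The economy is recovering." / "Sales are slumping."
texts = ["景気は回復している。", "売り上げが落ち込んでいる。"]
labels = torch.tensor([1, 0])

inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# A few illustrative fine-tuning steps on the toy batch.
model.train()
for _ in range(3):
    optimizer.zero_grad()
    outputs = model(**inputs, labels=labels)
    outputs.loss.backward()
    optimizer.step()

# Predict a sentiment class for each comment.
model.eval()
with torch.no_grad():
    predictions = model(**inputs).logits.argmax(dim=-1)
print(predictions.tolist())  # e.g. [1, 0]

In practice the classification head would be trained on the labeled survey comments and evaluated on a held-out split; the toy batch above only shows the mechanics of tokenization, fine-tuning, and prediction.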