Host: The Japanese Society for Artificial Intelligence
Name : The 36th Annual Conference of the Japanese Society for Artificial Intelligence
Number : 36
Location : [in Japanese]
Date : June 14, 2022 - June 17, 2022
The Transformer model, introduced in 2017, was initially used in natural language processing but has since been applied widely in fields such as image processing and network science. Transformer models pretrained on large datasets can be published and then fine-tuned on new data for individual downstream tasks. The scientific literature contains a wide variety of data, including text, citations, and images of figures and tables. However, classification and regression studies have mostly processed each type of data individually and then combined the extracted features, so interactions between the data types have not been fully considered. This paper proposes an end-to-end method for fusing the linguistic and citation information of academic literature using the Transformer model. The proposed method improves the F-measure by 2.6 to 6.0 points over using each type of information alone. It demonstrates that diverse academic literature data can be fused end to end, suggesting an efficient way to improve the accuracy of a variety of classification and prediction tasks.
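The fusion idea described above can be illustrated with a minimal sketch. This is not the authors' actual architecture; it only shows one common way such fusion could work: text-token embeddings and a citation-graph node embedding (both hypothetical inputs here) are concatenated into a single sequence, so a self-attention layer can mix the two modalities before a classification head reads off a fused vector.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # single-head scaled dot-product attention over the fused sequence
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

d = 16
# hypothetical inputs: embeddings for the document's text tokens and
# one node embedding for its position in the citation graph
text_tokens = rng.normal(size=(10, d))   # 10 text-token embeddings
citation_tok = rng.normal(size=(1, d))   # 1 citation-graph embedding

# early fusion: one sequence lets attention model cross-modal interactions
X = np.concatenate([citation_tok, text_tokens], axis=0)

Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
fused = self_attention(X, Wq, Wk, Wv)

# the fused citation position could feed a downstream classification head
cls_vec = fused[0]
print(cls_vec.shape)  # (16,)
```

Because attention is computed over the combined sequence, the text and citation representations influence each other during training rather than being combined only after separate feature extraction.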