Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
36th (2022)
Session ID : 3M4-GS-4-04

Fusion of Linguistic and Citation Information in Scientific Literature using Transformer Model
*Masanao Ochi, Masanori Shiro, Jun'ichiro Mori, Ichiro Sakata
Abstract

The Transformer model, released in 2017, was initially used in natural language processing but has since spread to a variety of fields such as image processing and network science. With the Transformer, models pre-trained on large data sets can be published and then fine-tuned on new data for individual tasks. The scientific literature contains a wide variety of data, including language, citations, and images of figures and tables. However, classification and regression studies have mainly processed each data type individually and combined the extracted features, so the interactions between data types have not been fully considered. This paper proposes an end-to-end method for fusing the linguistic and citation information of academic literature data using the Transformer model. The proposed method improves the F-measure by 2.6 to 6.0 points compared to using only individual information. The method makes it possible to fuse various data from the academic literature end-to-end and shows the potential to efficiently improve the accuracy of various classification and prediction tasks.
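The core idea of end-to-end fusion can be illustrated with a minimal NumPy sketch (this is an illustration under assumed shapes and names, not the authors' implementation): word-token embeddings and cited-paper embeddings are placed into one joint sequence, distinguished by a modality embedding, and passed through self-attention so that linguistic and citation positions attend to each other directly instead of being combined only after separate feature extraction.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 16                       # shared embedding dimension (assumed)
text_vocab, cite_vocab = 100, 50

# Hypothetical lookup tables for word tokens and cited-paper IDs.
E_text = rng.normal(size=(text_vocab, d))
E_cite = rng.normal(size=(cite_vocab, d))
# Modality (segment) embeddings distinguish the two input types.
E_type = rng.normal(size=(2, d))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Single-head scaled dot-product self-attention (one Transformer sub-layer)."""
    Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(d))   # every position attends to every other,
    return A @ V                        # so text and citation tokens interact directly

def fuse(text_ids, cite_ids):
    # One joint sequence: word embeddings followed by cited-paper embeddings.
    X = np.vstack([E_text[text_ids] + E_type[0],
                   E_cite[cite_ids] + E_type[1]])
    H = self_attention(X)
    return H.mean(axis=0)               # pooled document representation

vec = fuse(text_ids=[3, 14, 15], cite_ids=[9, 2])
print(vec.shape)  # (16,)
```

In practice the pooled vector would feed a classification or regression head, and the whole stack would be fine-tuned jointly, which is what allows the fused representation to outperform features extracted from each modality in isolation.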

© 2022 The Japanese Society for Artificial Intelligence