Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
37th (2023)
Session ID : 1E3-GS-6-05

Investigation of Expert Knowledge Extraction Using Pre-trained Language Models
*Seiya ASANO, Masaru ISONUMA, Kimitaka ASATANI, Misuzu NOMURA, Junichiro MORI, Ichiro SAKATA
Abstract

In recent years, considerable research has explored using language models in place of knowledge bases. Compared with structured knowledge bases, language models offer several advantages: they do not require manual definition of attributes and relations, and they allow larger amounts of data to be queried more flexibly and efficiently. However, their performance is still maturing, and many hurdles remain, such as the inability to predict compound nouns. This study focused on the knowledge of specialized compound nouns in chemistry and investigated how accurately knowledge in a specific field could be extracted. Specifically, we used SciFive, a T5 model further pretrained on biomedical papers, and performed additional training on abstract data contained in Scopus, aiming to improve the accuracy of extracting specialized chemical knowledge. The results show how accuracy changes with the amount of data used for additional training: accuracy decreased with less data and improved with relatively more data. These results demonstrate further potential for attempts to extract knowledge from language models.
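The extraction setup described above can be sketched as cloze-style probing in the T5 span-infilling format that SciFive inherits: a compound noun is masked with a sentinel token, and the model's decoded span is taken as its prediction. The probe sentence below is a hypothetical illustration, not from the paper, and the fine-tuning on Scopus abstracts is not reproduced here; this only shows the prompt construction and output parsing, assuming T5's `<extra_id_n>` sentinel convention.

```python
# Hedged sketch of cloze-style knowledge probing in the T5 format.
# The probe sentence and target are illustrative assumptions.

def make_cloze(sentence: str, target: str) -> str:
    """Replace the target compound noun with T5's first sentinel token."""
    assert target in sentence, "target must appear in the sentence"
    return sentence.replace(target, "<extra_id_0>", 1)

def parse_t5_fill(decoded: str) -> str:
    """Extract the span predicted for <extra_id_0> from T5 decoder output,
    which looks like '<extra_id_0> predicted span <extra_id_1> ...'."""
    after = decoded.split("<extra_id_0>", 1)[-1]
    return after.split("<extra_id_1>", 1)[0].strip()

# Hypothetical chemistry probe:
prompt = make_cloze(
    "The reaction of HCl with NaOH yields sodium chloride and water.",
    "sodium chloride",
)
print(prompt)
# -> The reaction of HCl with NaOH yields <extra_id_0> and water.

# Feeding `prompt` to a SciFive checkpoint (e.g. via transformers'
# T5ForConditionalGeneration.generate) would yield decoder text such as
# "<extra_id_0> sodium chloride <extra_id_1>", parsed back to the span:
print(parse_t5_fill("<extra_id_0> sodium chloride <extra_id_1>"))
# -> sodium chloride
```

Evaluation then reduces to comparing the parsed span against the masked compound noun, which is where multi-token chemical terms become the difficulty the abstract notes.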

© 2023 The Japanese Society for Artificial Intelligence