Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
38th (2024)
Session ID : 4Xin2-45

Conceptual Knowledge in the Pretrained Japanese BERT Model
*Ryuichi WATANABE
Keywords: AI, NLP, transformer

Abstract

In this paper, we conduct experiments to identify neurons in the Feed-Forward Network (FFN) layers of a Japanese BERT model that hold knowledge about concepts. Specifically, we identify weight parameters (neurons) within the pre-trained Japanese BERT model that are important for masked language modeling (MLM) tasks, and we confirm their locations and significance through comparative experiments. We further compare our results with prior studies on a pre-trained English BERT model and show that the experimental outcomes differ under specific conditions.
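As a rough illustration of how FFN neurons can be scored for a fill-mask prompt, the sketch below uses gradient-times-activation attribution over the intermediate (post-GELU) FFN activations of a Japanese BERT. This is a minimal sketch of one common attribution approach, not the authors' exact method; the checkpoint name, prompt, and target token are assumptions for the example.

# Minimal sketch (assumed setup, not the paper's method): score FFN
# intermediate neurons by gradient x activation for one fill-mask prompt.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "cl-tohoku/bert-base-japanese-whole-word-masking"  # assumed checkpoint
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()

prompt = "日本の首都は[MASK]です。"   # "The capital of Japan is [MASK]."
target = "東京"                      # expected filler: "Tokyo" (assumed in-vocab)

inputs = tok(prompt, return_tensors="pt")
mask_pos = (inputs.input_ids[0] == tok.mask_token_id).nonzero().item()
target_id = tok.convert_tokens_to_ids(target)

# Capture each layer's FFN intermediate activation (output of BertIntermediate).
acts = {}
hooks = []
for i, layer in enumerate(model.bert.encoder.layer):
    def make_hook(idx):
        def hook(module, inp, out):
            out.retain_grad()      # keep gradients for this non-leaf tensor
            acts[idx] = out
        return hook
    hooks.append(layer.intermediate.register_forward_hook(make_hook(i)))

# Forward pass and backprop from the target token's logit at the [MASK] position.
logits = model(**inputs).logits
logits[0, mask_pos, target_id].backward()

# Per-neuron attribution at the [MASK] position: activation * gradient.
for i in sorted(acts):
    attr = (acts[i][0, mask_pos] * acts[i].grad[0, mask_pos]).detach()
    top = torch.topk(attr.abs(), k=3)
    print(f"layer {i:2d}: top neurons {top.indices.tolist()}")

for h in hooks:
    h.remove()

Neurons that receive high attribution across many prompts about the same concept are candidate "knowledge neurons"; the paper's comparative experiments then test their locations and significance.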

© 2024 The Japanese Society for Artificial Intelligence