Cognitive Studies: Bulletin of the Japanese Cognitive Science Society
Online ISSN : 1881-5995
Print ISSN : 1341-7924
ISSN-L : 1341-7924
Feature: Cognitive science on language: What are the bases of language?
Understanding of utterance meaning by Subjective-BERT: An analysis of internal representation with particular reference to self-attention chain
Shuhei Mandokoro, Natsuki Oka, Akane Matsushima, Chie Fukada, Yuko Yoshimura, Koji Kawahara, Kazuaki Tanaka

2024 Volume 31 Issue 1 Pages 205-224

Abstract

Japanese sentence-final particles such as “yo” and “ne” reflect our cognitive mechanism for manipulating and integrating linguistic and pragmatic/sociolinguistic information. However, despite the importance of this mechanism in cognitive science, few constructive approaches to it have been presented so far. We therefore developed the Subjective-BERT model, which acquires the meanings of the two sentence-final particles “yo” and “ne,” as well as those of content words. The model was pre-trained on a dataset containing not only linguistic information but also sensory and mental information. In a previous study (Mandokoro et al., 2022a), we demonstrated that the model acquires the meanings of the particles by learning relations not only between words but also between words and sensory/mental information. This paper further explores how the meaning of each word is represented and processed in the model by analyzing its internal representations. The main results are as follows: (i) an analysis of the attention chain between three tokens (a particle, a content word, and its referent) revealed that the information of the particle is first conveyed to the content word, which in turn controls the subsequent information flow from the content word to its referent; and (ii) a contrastive analysis of the transformation processes that embeddings undergo when each sentence-final particle is used identified the mechanism that generates the attention chains, suggesting that residual connections and feedforward neural network layers, as well as self-attention, contribute to the understanding of utterance meaning.
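As a rough illustration of the attention-chain analysis described in result (i), the following Python sketch traces two attention links in a generic BERT-style encoder via the Hugging Face transformers library. It is not the authors' code: Subjective-BERT is not assumed to be publicly available, so the model name, the input sentence, and the three token indices are placeholder assumptions.

```python
# A sketch (not the authors' released code) of tracing an attention chain
# between three token positions in a BERT-style encoder. Subjective-BERT
# pairs utterances with sensory/mental tokens; here "bert-base-uncased"
# and the token indices stand in as placeholders.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL = "bert-base-uncased"  # assumption: any BERT-style encoder
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModel.from_pretrained(MODEL, output_attentions=True)
model.eval()

inputs = tokenizer("the apple is red", return_tensors="pt")

with torch.no_grad():
    out = model(**inputs)

# out.attentions is a tuple with one tensor per layer, each of shape
# (batch, num_heads, seq_len, seq_len); weight [i, j] is how strongly
# position i attends to position j, i.e. information flows from j to i.
attn = torch.stack(out.attentions)  # (num_layers, 1, num_heads, seq, seq)

# Placeholder positions for the three tokens of interest; in Subjective-BERT
# these would be the particle, the content word, and its (sensory) referent.
particle_idx, content_idx, referent_idx = 4, 2, 3

# First link of the chain: the content word attending to the particle.
link1 = attn[:, 0, :, content_idx, particle_idx].mean(dim=-1)
# Second link: the referent attending to the content word.
link2 = attn[:, 0, :, referent_idx, content_idx].mean(dim=-1)

for layer, (a, b) in enumerate(zip(link1.tolist(), link2.tolist())):
    print(f"layer {layer:2d}: content<-particle {a:.3f}  referent<-content {b:.3f}")
```

The same forward pass, run with output_hidden_states=True, additionally exposes out.hidden_states, the per-layer embeddings; comparing these layer by layer for a “yo” utterance versus a “ne” utterance would correspond, in outline, to the contrastive embedding analysis of result (ii).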

© 2024 Japanese Cognitive Science Society