Japanese sentence-final particles such as “yo” and “ne” reflect the cognitive mechanism by which humans manipulate and integrate linguistic and pragmatic/sociolinguistic information. Despite the importance of this mechanism in cognitive science, however, few constructive approaches to it have been presented. We therefore developed the Subjective-BERT model, which acquires the meanings of the two sentence-final particles “yo” and “ne,” as well as those of content words. The model was pre-trained on a dataset containing not only linguistic information but also sensory and mental information. In a previous study (Mandokoro et al., 2022a), we demonstrated that the model acquires the meanings of the particles by learning relations not only among words but also between words and sensory/mental information. This paper further explores how the meaning of each word is represented and processed in the model by analyzing its internal representations. The main results are as follows: (i) an analysis of the attention chain among three tokens (a particle, a content word, and its referent) revealed that information from the particle is first conveyed to the content word, which in turn controls the subsequent flow of information from the content word to its referent; and (ii) a contrastive analysis of how embeddings are transformed when each sentence-final particle is used identified the mechanism that generates these attention chains, suggesting that residual connections and feedforward layers, as well as self-attention, contribute to the understanding of utterance meaning.
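To make the attention-chain analysis concrete, the following is a minimal sketch of how such attention weights can be read out of a BERT-style encoder with the HuggingFace transformers library. It is an illustration only: it substitutes a generic pre-trained BERT for Subjective-BERT (whose weights and tokenizer are not assumed here), uses an English stand-in utterance, and the three token positions (particle, content word, referent) are hypothetical placeholders rather than values from the paper.

```python
# Hedged sketch: inspecting an "attention chain" among three token positions
# (particle -> content word -> referent) in a BERT-style encoder.
# Assumptions: a generic HuggingFace BERT stands in for Subjective-BERT,
# and the token indices below are hypothetical placeholders.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

sentence = "the apple is red yo"  # stand-in utterance; the real input is Japanese
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: one tensor per layer, each (batch, heads, seq_len, seq_len)
attentions = torch.stack(outputs.attentions)  # (layers, batch, heads, query, key)

# Hypothetical positions of the three tokens of interest in the tokenized input.
particle_pos, content_pos, referent_pos = 5, 2, 1

# Average over heads; attn[layer, q, k] is how strongly the query token q
# attends to the key token k (i.e., information flows from k to q).
attn = attentions.mean(dim=2)[:, 0]  # (layers, query, key)

for layer in range(attn.size(0)):
    p_to_c = attn[layer, content_pos, particle_pos].item()  # content word attends to particle
    c_to_r = attn[layer, referent_pos, content_pos].item()  # referent attends to content word
    print(f"layer {layer:2d}: content<-particle {p_to_c:.3f}, referent<-content {c_to_r:.3f}")
```

Tracing these two weights layer by layer is one simple way to check the ordering claimed in result (i): a particle-to-content-word link in earlier layers, followed by a content-word-to-referent link in later ones.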