Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
38th (2024)
Session ID : 4Xin2-104

Ensemble Transfer Learning for Multilingual Learning via BrainLM: A Multi-modal Model
*Ying LUO, Ichiro KOBAYASHI
Abstract

Recently, BrainLM, a multi-modal model, was introduced and has shown excellent efficacy in text-brain encoding and decoding tasks. Building on this groundwork, this study extends BrainLM to a new dataset and previously unexplored language systems. By employing transfer learning, the study seeks to broaden the model's potential in multilingual learning tasks and improve its generalization capability. During fine-tuning on the binary classification task, BrainLM achieved a best accuracy of 51.75%. By comparing the model before and after transfer learning on the brain prediction task, the study obtained an improvement of roughly 3%-15% in correlation coefficients. Furthermore, BrainLM showed the highest correlation on the brain prediction task over the whole cerebral cortex compared with other models. This study not only broadens the application scope of BrainLM but also sheds light on the intricate interactions between brain functional regions and large language models across diverse linguistic environments.
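The abstract gives no implementation details, so the following is only a rough sketch of the two steps it alludes to: attaching a new binary-classification head to a pretrained encoder for transfer learning, and scoring a brain prediction with voxel-wise Pearson correlation. All names are hypothetical, and a toy PyTorch module stands in for the actual BrainLM backbone.

import torch
import torch.nn as nn

class FineTunedClassifier(nn.Module):
    """A pretrained encoder (stand-in for a BrainLM-style backbone) with a new binary head."""
    def __init__(self, encoder: nn.Module, hidden_dim: int):
        super().__init__()
        self.encoder = encoder                 # weights transferred from pretraining
        self.head = nn.Linear(hidden_dim, 2)   # task-specific layer trained during fine-tuning

    def forward(self, x):
        features = self.encoder(x)             # shared representation from the pretrained model
        return self.head(features)             # binary-classification logits

def pearson_per_voxel(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Voxel-wise Pearson correlation between predicted and measured responses.
    pred, target: tensors of shape (time, voxels)."""
    pred = pred - pred.mean(dim=0, keepdim=True)
    target = target - target.mean(dim=0, keepdim=True)
    num = (pred * target).sum(dim=0)
    den = pred.norm(dim=0) * target.norm(dim=0) + 1e-8
    return num / den

# Toy usage: a tiny MLP stands in for the pretrained encoder.
encoder = nn.Sequential(nn.Linear(128, 64), nn.ReLU())
model = FineTunedClassifier(encoder, hidden_dim=64)
logits = model(torch.randn(8, 128))                      # 8 stimuli, 128-dim features
corr = pearson_per_voxel(torch.randn(100, 50), torch.randn(100, 50))
print(logits.shape, corr.mean().item())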

© 2024 The Japanese Society for Artificial Intelligence