Journal of Natural Language Processing
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
General Paper (Peer-Reviewed)
Second Language Acquisition of Neural Language Models
Miyu Oba, Tatsuki Kuribayashi, Hiroki Ouchi, Taro Watanabe

2024 Volume 31 Issue 2 Pages 433-455

Abstract

The success of neural language models (LMs) has considerably increased attention to their language acquisition. This work focuses on the second language (L2) acquisition of LMs, whereas previous studies have typically explored their first language (L1) acquisition. Specifically, we trained bilingual LMs in a scenario similar to human L2 acquisition and analyzed their cross-lingual transfer from linguistic perspectives. Our exploratory experiments demonstrated that L1 pre-training accelerated linguistic generalization in L2, and that language transfer configurations (for example, the choice of L1 and the presence of parallel texts) substantially affected their generalization. These findings clarify the (non-)humanlike aspects of L2 acquisition in LMs.
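To make the training scenario concrete, below is a minimal sketch (not the authors' code) of the two-stage L1-then-L2 setup the abstract describes: a small causal LM is first trained on an L1 token stream, then training continues on L2, and its L2 linguistic generalization is probed by comparing log-probabilities of a grammatical/ungrammatical minimal pair. The toy corpora, model architecture and size, and the minimal pair are all illustrative assumptions, not the paper's actual setup.

# Minimal sketch (assumptions throughout): L1 pre-training, then L2 exposure,
# then a minimal-pair acceptability probe of L2 generalization.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy corpora standing in for real L1/L2 training data (assumed).
l1_text = "der hund schläft . die katze schläft . " * 50   # L1 stream
l2_text = "the dog sleeps . the cat sleeps . " * 50        # L2 stream
vocab = sorted(set((l1_text + l2_text).split()))
stoi = {w: i for i, w in enumerate(vocab)}

def encode(text):
    return torch.tensor([stoi[w] for w in text.split()], dtype=torch.long)

class TinyLM(nn.Module):
    """A small LSTM language model standing in for the neural LMs studied."""
    def __init__(self, vocab_size, dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)
        self.rnn = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)
    def forward(self, x):
        h, _ = self.rnn(self.emb(x))
        return self.out(h)

def train(model, ids, steps, seq_len=16, lr=1e-3):
    """Next-token prediction training on a single token stream."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        i = torch.randint(0, len(ids) - seq_len - 1, (1,)).item()
        x = ids[i:i + seq_len].unsqueeze(0)
        y = ids[i + 1:i + seq_len + 1].unsqueeze(0)
        loss = loss_fn(model(x).transpose(1, 2), y)
        opt.zero_grad(); loss.backward(); opt.step()

def sentence_logprob(model, sent):
    """Sum of next-token log-probabilities, for minimal-pair comparison."""
    ids = encode(sent).unsqueeze(0)
    with torch.no_grad():
        logp = model(ids[:, :-1]).log_softmax(-1)
    return logp.gather(-1, ids[:, 1:].unsqueeze(-1)).sum().item()

model = TinyLM(len(vocab))
train(model, encode(l1_text), steps=300)   # stage 1: L1 pre-training
train(model, encode(l2_text), steps=300)   # stage 2: L2 exposure

# Probe L2 generalization with an acceptability contrast (illustrative pair).
good, bad = "the dog sleeps .", "the dog cat ."
print(sentence_logprob(model, good) > sentence_logprob(model, bad))

Varying which corpus is used at stage 1, or mixing parallel L1-L2 text into stage 2, would correspond to the language transfer configurations (L1 choice, presence of parallel texts) that the abstract reports as substantially affecting generalization.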

© 2024 The Association for Natural Language Processing