IEICE Transactions on Information and Systems
Online ISSN : 1745-1361
Print ISSN : 0916-8532
Pre-trained BERT Model Retrieval: Inference-Based No-Learning Approach using k-Nearest Neighbour Algorithm
Huu-Long PHAM, Ryota MIBAYASHI, Takehiro YAMAMOTO, Makoto P. KATO, Yusuke YAMAMOTO, Yoshiyuki SHOJI, Hiroaki OHSHIMA
Journal: open access, advance online publication

Article ID: 2024DAT0003

Abstract

In this study, we propose a method to efficiently retrieve BERT pre-trained models that perform well on a specific document classification task. In natural language processing, the common practice is to fine-tune an existing pre-trained model rather than build a new one from scratch, because pre-training demands extensive time and computational resources. The challenge, however, lies in identifying the most suitable model among a large number of available pre-trained models. To address this problem, our proposed method uses the k-nearest neighbour algorithm to retrieve appropriate BERT pre-trained models without requiring fine-tuning. We evaluated the method on a benchmark dataset we constructed with 28 document classification tasks and 20 BERT models.
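The core idea, scoring each candidate model by how well a k-NN classifier separates the task's labelled documents using the model's frozen embeddings, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the embedding step is mocked with synthetic vectors (in practice each candidate BERT model would produce them at inference time), and the leave-one-out evaluation, `k` value, and model names are assumptions.

```python
import numpy as np

def knn_accuracy(embeddings, labels, k=3):
    """Leave-one-out k-NN accuracy over frozen document embeddings.

    A higher score suggests the model's embedding space already
    separates the task's classes, so no fine-tuning is needed to rank it.
    """
    n = len(labels)
    # Pairwise Euclidean distances between all document embeddings
    d = np.linalg.norm(embeddings[:, None, :] - embeddings[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # exclude each query document from its own neighbours
    correct = 0
    for i in range(n):
        nn = np.argsort(d[i])[:k]          # indices of the k nearest neighbours
        votes = np.bincount(labels[nn])    # majority vote among their labels
        if votes.argmax() == labels[i]:
            correct += 1
    return correct / n

rng = np.random.default_rng(0)
# Synthetic stand-ins for embeddings of 40 labelled documents under two
# hypothetical candidate models (names are illustrative only)
labels = np.array([0] * 20 + [1] * 20)
emb_a = rng.normal(0.0, 1.0, (40, 8))
emb_a[labels == 1] += 4.0                  # model A: classes well separated
emb_b = rng.normal(0.0, 1.0, (40, 8))      # model B: uninformative embeddings

scores = {"model_A": knn_accuracy(emb_a, labels),
          "model_B": knn_accuracy(emb_b, labels)}
best = max(scores, key=scores.get)         # retrieve the highest-scoring model
```

Because the candidate models are only run in inference mode to produce embeddings, the ranking step itself involves no gradient updates, which is what makes this "no-learning" retrieval cheap compared with fine-tuning every candidate.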

© 2025 The Institute of Electronics, Information and Communication Engineers