Journal of Natural Language Processing
Online ISSN: 2185-8314
Print ISSN: 1340-7619
ISSN-L: 1340-7619
General Paper
Probing Simple Factoid Question Answering Based on Linguistic Knowledge
Namgi Han, Hiroshi Noji, Katsuhiko Hayashi, Hiroya Takamura, Yusuke Miyao

2021 Volume 28 Issue 4 Pages 938-964

Abstract

Recent studies have indicated that existing systems for simple factoid question answering over a knowledge base are not robust across different datasets. We evaluated the ability of a pretrained language model, BERT, to perform this task on four datasets: Free917, FreebaseQA, SimpleQuestions, and WebQSP. We found that, like other existing systems, the existing BERT-based system cannot solve them robustly. To investigate the reason for this problem, we employ a statistical method, partial least squares path modeling (PLSPM), with 24 BERT models and two sets of probing tasks, SentEval and GLUE. Our results reveal that the existing BERT-based system tends to depend on the surface and syntactic features of each dataset, which undermines the generality and robustness of its performance. We also discuss the reasons for this phenomenon by considering the features of each dataset and the method used to evaluate the simple factoid question answering task.
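To give a concrete sense of the probing methodology named in the abstract, the following is a minimal sketch, not the authors' code, of a SentEval-style probe: a pretrained BERT encoder is kept frozen, sentence embeddings are extracted, and a simple linear classifier is trained on top. The model name, the toy questions, and the probe task (whether a question asks about a person) are illustrative assumptions.

```python
# Minimal SentEval-style probing sketch (illustrative only, not the paper's setup).
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
encoder.eval()  # encoder stays frozen; only the linear probe is trained


def embed(sentences):
    """Return mean-pooled last-layer BERT embeddings for a list of sentences."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state       # (batch, tokens, hidden)
    mask = batch["attention_mask"].unsqueeze(-1)           # (batch, tokens, 1)
    return ((hidden * mask).sum(1) / mask.sum(1)).numpy()  # (batch, hidden)


# Hypothetical probe task: does the question ask about a person?
train_x = ["who founded Microsoft", "where is the Eiffel Tower",
           "who wrote Hamlet", "when did World War II end"]
train_y = [1, 0, 1, 0]

probe = LogisticRegression(max_iter=1000).fit(embed(train_x), train_y)
print(probe.predict(embed(["who painted the Mona Lisa"])))
```

The probe's accuracy is then read as a measure of how much of that linguistic property is linearly recoverable from the frozen representations; the paper relates such probing scores to downstream QA performance via PLSPM.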

© 2021 The Association for Natural Language Processing