自然言語処理 (Journal of Natural Language Processing)
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
Regular Paper
Analyzing Methods for Generating Feedback Comments for Language Learners
Kazuaki Hanawa, Ryo Nagata, Kentaro Inui
Free access

2022, Volume 29, Issue 3, pp. 901-924

Abstract

Feedback comment generation is the task of generating explanatory notes for language learners. Although various generation techniques are available, little is known about which methods are appropriate for this task. Nagata (2019) demonstrates the effectiveness of neural-retrieval-based methods in generating feedback comments for preposition use. Retrieval-based methods have a limitation in that they can only output feedback comments that already exist in the given training data. Moreover, feedback comments can be made on grammatical and writing items other than preposition use, which has not yet been addressed. To shed light on these points, we investigate a wider range of methods for generating various types of feedback comments in this study. Our close analysis of the features of the task leads us to investigate three different architectures for comment generation: (i) a neural-retrieval-based method as a baseline, (ii) a pointer-generator-based generation method as a neural seq2seq method, and (iii) a retrieve-and-edit method, a hybrid of (i) and (ii). Intuitively, the pointer-generator should outperform neural retrieval, and retrieve-and-edit should perform best. In our experiments, however, this expectation is completely overturned. We closely analyze the results to reveal the major causes of these counter-intuitive results and report on our findings from the experiments, which will lead to further development of feedback comment generation.
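The retrieval baseline (i) can be illustrated with a minimal sketch: embed the learner's sentence, find the most similar training sentence, and return its attached comment. The encoder, similarity function, and all data below are hypothetical stand-ins (a toy bag-of-words embedding in place of a learned neural encoder), not the paper's actual system.

```python
# Hypothetical sketch of a retrieval-based feedback comment generator:
# return the comment attached to the nearest training sentence.
# A real neural-retrieval system would use a learned sentence encoder;
# the bag-of-words "embedding" here is purely illustrative.
import math
from collections import Counter


def embed(text: str) -> Counter:
    """Toy bag-of-words vector standing in for a neural encoder."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve_comment(sentence: str, training_data: list) -> str:
    """Return the comment of the most similar training sentence.

    Note the limitation stated in the abstract: the output can only
    be a comment that already exists in the training data.
    """
    query = embed(sentence)
    best = max(training_data, key=lambda ex: cosine(query, embed(ex[0])))
    return best[1]


# Hypothetical (learner sentence, feedback comment) training pairs.
train = [
    ("I arrived to the station",
     "Use 'at' with 'arrive' for a point location."),
    ("She is good in math",
     "Use 'at' after 'good' to describe a skill."),
]

print(retrieve_comment("He arrived to the airport", train))
```

A seq2seq method such as the pointer-generator (ii) would instead generate the comment token by token, and retrieve-and-edit (iii) would first retrieve a comment like this and then rewrite it with a generator.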

© 2022 The Association for Natural Language Processing