自然言語処理 (Journal of Natural Language Processing)
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
General Paper (Peer-Reviewed)
Generic Mechanism for Reducing Repetitions in Encoder-decoder Models
Ying Zhang, Hidetaka Kamigaito, Tatsuya Aoki, Hiroya Takamura, Manabu Okumura
Free access

2023, Volume 30, Issue 2, pp. 401-431

Abstract

Encoder-decoder models are widely used and have achieved state-of-the-art results on many natural language generation tasks. However, previous studies have reported that encoder-decoder models suffer from generating redundant repetitions. We therefore propose a repetition reduction module (RRM) for encoder-decoder models that estimates the semantic difference of a source sentence before and after it is fed into the model, thereby capturing the consistency between the two sides. Acting as an autoencoder, the proposed mechanism supervises the training of encoder-decoder models to reduce the number of repeatedly generated tokens. Evaluation results on publicly available machine translation and response generation datasets demonstrate the effectiveness of our proposal.
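The abstract does not give the RRM's exact formulation, but the idea of supervising training with a consistency signal between the source side and the generated side can be sketched as an auxiliary loss added to the usual cross-entropy objective. The following is a minimal, hypothetical PyTorch sketch under that assumption; the mean-pooling, the MSE distance, and names such as consistency_loss and lambda_rrm are illustrative choices, not the paper's actual method.

```python
import torch
import torch.nn.functional as F

def consistency_loss(enc_hidden, dec_hidden, src_mask, tgt_mask):
    """Penalize the semantic difference between the source-side
    representation (before decoding) and the decoder-side representation
    (after decoding), so the decoder stays consistent with the source."""
    # Mean-pool hidden states over non-padding positions (masks are 0/1 floats).
    src_vec = (enc_hidden * src_mask.unsqueeze(-1)).sum(1) / src_mask.sum(1, keepdim=True)
    tgt_vec = (dec_hidden * tgt_mask.unsqueeze(-1)).sum(1) / tgt_mask.sum(1, keepdim=True)
    # Distance between the two sentence-level vectors; driving it down during
    # training discourages degenerate, repetitive continuations that drift
    # away from the source meaning.
    return F.mse_loss(src_vec, tgt_vec)

# Assumed training objective: cross-entropy plus the weighted consistency
# term (lambda_rrm is a hypothetical hyperparameter, not from the paper).
#   loss = ce_loss + lambda_rrm * consistency_loss(enc_out, dec_out, src_mask, tgt_mask)
```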

© 2023 The Association for Natural Language Processing