Journal of Natural Language Processing
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
General Paper (Peer-Reviewed)
Generic Mechanism for Reducing Repetitions in Encoder-decoder Models
Ying Zhang, Hidetaka Kamigaito, Tatsuya Aoki, Hiroya Takamura, Manabu Okumura

2023 Volume 30 Issue 2 Pages 401-431

Abstract

Encoder-decoder models are widely used and have achieved state-of-the-art results on many natural language generation tasks. However, previous studies have reported that encoder-decoder models tend to generate redundant repetitions. We therefore propose a repetition reduction module (RRM) for encoder-decoder models that estimates the semantic difference of a source sentence before and after it is fed into the model, capturing the consistency between the two sides. Acting as an autoencoder, the proposed mechanism supervises the training of encoder-decoder models to reduce the number of repeatedly generated tokens. Evaluation results on publicly available machine translation and response generation datasets demonstrate the effectiveness of our proposal.
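To make the abstract's idea concrete, the PyTorch sketch below shows one way an autoencoder-style consistency signal could be added to encoder-decoder training: pool the encoder-side and decoder-side hidden states into sentence vectors and penalize their divergence alongside the usual cross-entropy loss. This is a minimal illustration under stated assumptions, not the paper's actual RRM; the class name, projection heads, pooling, and cosine formulation are all hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConsistencyLoss(nn.Module):
    """Hypothetical auxiliary loss in the spirit of the abstract:
    compares a pooled embedding of the source (encoder side) with a
    pooled embedding of the output (decoder side), so semantic drift,
    which often surfaces as redundant repetition, is penalized."""

    def __init__(self, d_model: int):
        super().__init__()
        # Separate projection heads for the two sides (assumed design).
        self.src_proj = nn.Linear(d_model, d_model)
        self.tgt_proj = nn.Linear(d_model, d_model)

    @staticmethod
    def _masked_mean(states: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # states: (batch, seq, d); mask: (batch, seq) with 1 for real tokens.
        mask = mask.unsqueeze(-1).float()
        return (states * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1.0)

    def forward(self, enc_states, enc_mask, dec_states, dec_mask):
        src_vec = self._masked_mean(self.src_proj(enc_states), enc_mask)
        tgt_vec = self._masked_mean(self.tgt_proj(dec_states), dec_mask)
        # 1 - cosine similarity: near zero when the two sides agree.
        return (1.0 - F.cosine_similarity(src_vec, tgt_vec, dim=-1)).mean()


# Illustrative training objective: standard cross-entropy plus the
# weighted consistency term (lambda_aux is a hypothetical hyperparameter).
# total_loss = ce_loss + lambda_aux * consistency_loss(
#     enc_states, enc_mask, dec_states, dec_mask)
```

Because the auxiliary term only adds a pooled comparison of hidden states already computed during the forward pass, a mechanism of this shape stays generic: it can supervise any encoder-decoder architecture without changing decoding.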

© 2023 The Association for Natural Language Processing