Journal of Information Processing
Online ISSN : 1882-6652
ISSN-L : 1882-6652
 
Nearest Neighbor Non-autoregressive Text Generation
Ayana Niwa, Sho Takase, Naoaki Okazaki
2023 Volume 31 Pages 344-352

Abstract

Non-autoregressive (NAR) models can generate sentences with less computation than autoregressive models but sacrifice generation quality. Previous studies addressed this issue through iterative decoding. This study proposes using nearest neighbors as the initial state of an NAR decoder and editing them iteratively. We present a novel training strategy that learns edit operations on neighbors to improve NAR text generation. Experimental results show that the proposed method (NeighborEdit) achieves higher translation quality (1.69 points higher than the vanilla Transformer) with fewer decoding iterations (one-eighteenth the number of iterations) on the JRC-Acquis En-De dataset, a common benchmark for machine translation with nearest neighbors. We also confirm the effectiveness of the proposed method on a data-to-text task (WikiBio). In addition, the proposed method outperforms an NAR baseline on the WMT'14 En-De dataset. We also report an analysis of the neighbor examples used in the proposed method.
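The two-stage idea in the abstract — retrieve a nearest-neighbor target sentence, then refine it with iterative edits — can be sketched roughly as follows. This is a minimal illustration, not the paper's method: the retrieval metric (bag-of-words cosine over the source side), the KEEP/DELETE/REPLACE edit vocabulary, and the `predict_edits` callback standing in for the trained NAR decoder are all assumptions made for the sketch.

```python
import math
from collections import Counter


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)  # Counter returns 0 for absent tokens
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def nearest_neighbor(source: str, memory: list[tuple[str, str]]) -> str:
    """Return the target side of the most similar source sentence in memory.

    `memory` is a list of (source, target) training pairs; the retrieved
    target becomes the initial state of the decoder.
    """
    q = Counter(source.split())
    best = max(memory, key=lambda pair: cosine(q, Counter(pair[0].split())))
    return best[1]


def edit_iteratively(source, init_tokens, predict_edits, max_iters=3):
    """Refine the neighbor by repeatedly applying predicted edits.

    `predict_edits(source, tokens)` is a stand-in for the trained model:
    it returns one operation per token, "KEEP", "DELETE", or "REPLACE:<tok>".
    Iteration stops when the sequence no longer changes (or after max_iters),
    which is what allows far fewer decoding steps than token-by-token
    autoregressive generation.
    """
    tokens = list(init_tokens)
    for _ in range(max_iters):
        ops = predict_edits(source, tokens)
        new = []
        for tok, op in zip(tokens, ops):
            if op == "KEEP":
                new.append(tok)
            elif op.startswith("REPLACE:"):
                new.append(op.split(":", 1)[1])
            # "DELETE" drops the token
        if new == tokens:  # converged
            break
        tokens = new
    return tokens
```

In the actual model the edit operations are predicted in parallel for all positions by the NAR decoder; the loop here only mimics the iterative refinement schedule.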

© 2023 by the Information Processing Society of Japan