Journal of Natural Language Processing
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
General Paper (Peer-Reviewed)
Plug-and-Play Attribute-Aware Text Infilling via A New Attention Mechanism and Two-Level Positional Encoding
Dongyuan Li, Kotaro Funakoshi, Manabu Okumura

2023 Volume 30 Issue 3 Pages 1011-1041

Abstract

Text infilling aims to restore incomplete texts by filling in their blanks and has recently attracted increasing attention because of its wide application in ancient text restoration, conversation generation, and text rewriting. However, attribute-aware text infilling is yet to be explored, and existing methods seldom account for the infilling length of each blank or the number and location of the blanks. In this study, we propose a plug-and-play Attribute-aware Text Infilling method using a Pre-trained language model (A-TIP), which consists of a text-infilling component and a plug-and-play discriminator. Specifically, we first design a unified text-infilling component with modified attention mechanisms and intra- and inter-blank positional encoding so that the model better perceives the number of blanks and the infilling length of each blank. We then propose a plug-and-play discriminator that guides generation to improve attribute relevance without decreasing text fluency. Finally, automatic and human evaluations on three open-source datasets show that A-TIP achieves state-of-the-art performance compared with all baselines, and an ablation study demonstrates its robustness.
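The abstract does not give implementation details, but a minimal sketch of what a two-level (intra- and inter-blank) positional encoding could look like is shown below. The module, its parameter names, and its sizes are assumptions based only on the description above, not the authors' code: one embedding records which blank a token belongs to (inter-blank), and a second records the token's offset inside that blank (intra-blank).

    import torch
    import torch.nn as nn

    class TwoLevelPositionalEncoding(nn.Module):
        # Hypothetical sketch, not the authors' implementation: combine an
        # inter-blank embedding (index of the blank a token belongs to)
        # with an intra-blank embedding (the token's offset within that
        # blank) on top of the usual token embedding.
        def __init__(self, d_model, max_blanks=32, max_blank_len=64):
            super().__init__()
            self.inter_blank = nn.Embedding(max_blanks, d_model)
            self.intra_blank = nn.Embedding(max_blank_len, d_model)

        def forward(self, token_emb, blank_ids, offsets):
            # token_emb: (batch, seq, d_model)
            # blank_ids, offsets: (batch, seq) integer tensors; tokens
            # outside any blank can share index 0.
            return (token_emb
                    + self.inter_blank(blank_ids)
                    + self.intra_blank(offsets))

Summing the two embeddings would let the decoder condition on both which blank it is currently filling and how far it has progressed within that blank, which is one plausible way to make the number of blanks and the per-blank infilling length explicit to the model.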

© 2023 The Association for Natural Language Processing