Journal of Natural Language Processing
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
Regular Paper (Peer-Reviewed)
Plug-and-Play Attribute-Aware Text Infilling via A New Attention Mechanism and Two-Level Positional Encoding
Dongyuan Li, Kotaro Funakoshi, Manabu Okumura
Journal Free Access

2023 年 30 巻 3 号 p. 1011-1041

Abstract

Text infilling aims to restore incomplete texts by filling in blanks and has attracted increasing attention recently because of its wide application in ancient text restoration, conversation generation, and text rewriting. However, attribute-aware text infilling has yet to be explored, and existing methods seldom focus on the infilling length of each blank or on the number and location of the blanks. In this study, we propose a plug-and-play Attribute-aware Text Infilling method using a Pre-trained language model (A-TIP) that contains a text-infilling component and a plug-and-play discriminator. Specifically, we first design a unified text-infilling component with modified attention mechanisms and intra- and inter-blank positional encoding to better perceive the number of blanks and the infilling length of each blank. We then propose a plug-and-play discriminator that guides generation and improves attribute relevance without decreasing text fluency. Finally, automatic and human evaluations on three open-source datasets indicate that A-TIP achieves state-of-the-art performance compared to all baselines. An additional ablation study demonstrates the robustness of A-TIP.
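The two-level positional encoding in the abstract assigns each token both an inter-blank index (which blank it fills) and an intra-blank offset (its position inside that blank), so the model can perceive how many blanks exist and how long each filled span is. A minimal sketch of that indexing idea, assuming a simple segment representation; the function name and segment format are illustrative and not A-TIP's actual implementation:

```python
def two_level_positions(segments):
    """Compute (inter, intra) position indices for each token.

    segments: list of (kind, tokens) pairs, kind in {"text", "blank"}.
    Tokens filling the i-th blank share inter-blank index i (1-based);
    intra counts the offset within that blank. Template tokens outside
    any blank get (0, 0). In a model, both indices would be embedded
    and added to the token embeddings.
    """
    positions = []
    blank_idx = 0
    for kind, tokens in segments:
        if kind == "blank":
            blank_idx += 1  # next blank: advance the inter-blank index
            for offset in range(len(tokens)):
                positions.append((blank_idx, offset))
        else:
            for _ in tokens:
                positions.append((0, 0))  # template token, outside blanks
    return positions

# Example: "The [quick brown] fox [jumps]" with two filled blanks
segs = [("text", ["The"]), ("blank", ["quick", "brown"]),
        ("text", ["fox"]), ("blank", ["jumps"])]
print(two_level_positions(segs))
# → [(0, 0), (1, 0), (1, 1), (0, 0), (2, 0)]
```

Separating the two indices lets attention distinguish "second token of blank 1" from "first token of blank 2", information a single flat position index would conflate.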

© 2023 The Association for Natural Language Processing