Journal of Natural Language Processing
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
Regular Paper (Refereed)
Focused Prefix Tuning for Controllable Text Generation
Congda Ma, Tianyu Zhao, Makoto Shing, Kei Sawada, Manabu Okumura
Journal: Free Access

2024, Volume 31, Issue 1, pp. 250-265

Abstract

In a controllable text generation dataset, unannotated attributes may provide irrelevant learning signals to models that use them for training, thereby degrading their performance. We propose focused prefix tuning (FPT) to mitigate this problem and enable control to focus on the desired attribute. Experimental results show that FPT can achieve better control accuracy and text fluency than baseline models in single-attribute control tasks. In multi-attribute control tasks, FPT achieves control accuracy comparable to that of the state-of-the-art approach while maintaining the flexibility to control new attributes without retraining existing models.
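For readers unfamiliar with the prefix-tuning family of methods the abstract builds on, the sketch below illustrates the general idea: a small set of trainable "virtual token" embeddings, one per controlled attribute, is prepended to a frozen language model and trained on attribute-labeled text. This is a minimal, simplified illustration and not the authors' FPT implementation; the model choice (GPT-2 via Hugging Face transformers), the prefix length, and the example sentence are assumptions for demonstration only.

```python
# Minimal prefix-style tuning sketch (illustrative only, not the paper's FPT):
# trainable prefix embeddings are prepended to a frozen GPT-2 and optimized
# on attribute-labeled text so that the prefix steers generation.
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2Tokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").to(device)
model.requires_grad_(False)  # the base language model stays frozen

prefix_len = 10                      # number of virtual prefix tokens (assumed)
embed_dim = model.config.n_embd
# One trainable prefix per controlled attribute (e.g. a sentiment value).
prefix = nn.Parameter(torch.randn(prefix_len, embed_dim, device=device) * 0.02)
optimizer = torch.optim.AdamW([prefix], lr=5e-4)

def training_step(text: str) -> torch.Tensor:
    """One gradient step on a single attribute-labeled sentence."""
    ids = tokenizer(text, return_tensors="pt").input_ids.to(device)
    token_embeds = model.transformer.wte(ids)                       # (1, T, d)
    inputs = torch.cat([prefix.unsqueeze(0), token_embeds], dim=1)  # prepend prefix
    # Ignore the loss on the prefix positions (-100 is the ignore index).
    labels = torch.cat(
        [torch.full((1, prefix_len), -100, device=device), ids], dim=1
    )
    out = model(inputs_embeds=inputs, labels=labels)
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return out.loss.detach()

loss = training_step("What a wonderful, uplifting movie!")
print(f"loss: {loss.item():.3f}")
```

Only the prefix parameters receive gradients, which is what makes the approach lightweight enough to add a new attribute without retraining the base model or the prefixes already learned for other attributes.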

© 2024 The Association for Natural Language Processing