Journal of Natural Language Processing
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
General Paper (Peer-Reviewed)
Focused Prefix Tuning for Controllable Text Generation
Congda Ma, Tianyu Zhao, Makoto Shing, Kei Sawada, Manabu Okumura

2024 Volume 31 Issue 1 Pages 250-265

Abstract

In a controllable text generation dataset, unannotated attributes may provide irrelevant learning signals to models that use them for training, thereby degrading their performance. We propose focused prefix tuning (FPT) to mitigate this problem and to enable the control to focus on the desired attribute. Experimental results show that FPT achieves better control accuracy and text fluency than baseline models in single-attribute control tasks. In multi-attribute control tasks, FPT achieves control accuracy comparable to that of the state-of-the-art approach while maintaining the flexibility to control new attributes without retraining existing models.
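The abstract does not spell out FPT's mechanism, but the method builds on prefix tuning, in which a small set of trainable prefix vectors steers a frozen language model toward a target attribute. The sketch below shows only that generic prefix-tuning setup, using the Hugging Face peft library; it is not the paper's FPT, and the backbone model, prefix length, and training sentence are assumptions made for illustration.

# Illustrative sketch only: vanilla prefix tuning with the Hugging Face "peft"
# library, not the paper's FPT method. The backbone ("gpt2"), the prefix length,
# and the training sentence are assumptions chosen for the example.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PrefixTuningConfig, TaskType, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
base = AutoModelForCausalLM.from_pretrained("gpt2")

# One trainable prefix per controlled attribute (e.g. positive sentiment);
# the backbone parameters stay frozen and are shared across attributes.
config = PrefixTuningConfig(task_type=TaskType.CAUSAL_LM, num_virtual_tokens=20)
model = get_peft_model(base, config)

# Standard language-modeling loss on text annotated with the desired attribute;
# gradients update only the prefix parameters.
batch = tokenizer(["the movie was wonderful and"], return_tensors="pt")
loss = model(input_ids=batch["input_ids"],
             attention_mask=batch["attention_mask"],
             labels=batch["input_ids"]).loss
loss.backward()

Attribute control at generation time then amounts to activating the prefix trained for the desired attribute before decoding with the otherwise unchanged backbone.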

© 2024 The Association for Natural Language Processing