Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
34th (2020)
Session ID : 3Rin4-83

A Conditional Language Model for Controlling Sentence Length
*Koichi NAGATSUKA, Masayasu ATSUMI
Abstract

The performance of sentence generation tasks has improved dramatically in recent years due to the development of pre-trained language models. In particular, a pre-trained language model such as GPT-2, trained by self-supervised learning, generates sentences with nearly human-level fluency. Furthermore, a conditional language model such as CTRL can control the topic and style of the generated text with control codes. However, for sentence generation it is more effective to provide a control code as a continuous representation rather than a discrete one. In this study, we propose an approach for controllable sentence generation that produces sentences of a desired length by explicitly adding a distributed representation of the target length. We use positional encoding to obtain a continuous representation of the target length, and fine-tune a pre-trained GPT-2 on wikitext-103. The results show that our approach is effective at controlling sentence length while generating natural sentences.
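The abstract states that positional encoding is used to obtain a continuous representation of the target length. The following is a minimal sketch of one plausible reading, assuming Transformer-style sinusoidal positional encoding; the function name and dimensions are illustrative, not taken from the paper:

```python
import math

def length_encoding(target_len, d_model):
    """Encode a target sentence length as a continuous vector using
    Transformer-style sinusoidal positional encoding. Even indices
    hold sin components, odd indices hold cos components."""
    vec = [0.0] * d_model
    for i in range(0, d_model, 2):
        angle = target_len / (10000 ** (i / d_model))
        vec[i] = math.sin(angle)
        if i + 1 < d_model:
            vec[i + 1] = math.cos(angle)
    return vec

# Example: a 768-dimensional encoding of a target length of 20 tokens,
# which could be added to the model's input embeddings before fine-tuning.
vec = length_encoding(20, 768)
```

Because the encoding is continuous, nearby lengths map to nearby vectors, which may make it easier for the model to generalize across lengths than a discrete control code would.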

© 2020 The Japanese Society for Artificial Intelligence