Proceedings of the Annual Conference of JSAI
Online ISSN: 2758-7347
32nd Annual Conference (2018)
Session ID: 3Pin1-36

Neural Headline Generation with Self-Training
Shintaro TAKEMAE, Kazuma MURAO*, Taichi YATSUKA, Hayato KOBAYASHI, Masaki NOGUCHI, Hitoshi NISHIKAWA, Takenobu TOKUNAGA
Abstract

In this paper, we propose a novel method that incorporates self-training into a sequence-to-sequence model to improve accuracy on the headline generation task. Our model is based on neural sequence-to-sequence learning with an attention mechanism and is trained on approximately 100,000 labeled examples and 2,000,000 unlabeled examples. Through experiments, we show that the proposed method significantly improves accuracy and works effectively.
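The abstract does not spell out the self-training procedure, so the following is only a minimal sketch of a generic self-training loop for headline generation, not the authors' implementation. The function names (self_train, train_fn), the confidence filter, and the number of rounds are illustrative assumptions; the seq2seq model with attention is stubbed out with a toy trainer.

"""Minimal sketch of a generic self-training loop (illustrative assumptions only)."""
from typing import Callable, List, Tuple

Example = Tuple[str, str]  # (article, headline) pair


def self_train(
    labeled: List[Example],
    unlabeled: List[str],
    train_fn: Callable[[List[Example]], Callable[[str], Tuple[str, float]]],
    rounds: int = 2,        # assumption: number of self-training rounds
    min_score: float = 0.5, # assumption: confidence threshold for pseudo-labels
) -> Callable[[str], Tuple[str, float]]:
    """Train on labeled pairs, pseudo-label unlabeled articles, and retrain."""
    model = train_fn(labeled)                 # initial supervised seq2seq model
    for _ in range(rounds):
        pseudo: List[Example] = []
        for article in unlabeled:
            headline, score = model(article)  # generate a candidate headline
            if score >= min_score:            # keep only confident outputs
                pseudo.append((article, headline))
        model = train_fn(labeled + pseudo)    # retrain on gold + pseudo-labeled pairs
    return model


if __name__ == "__main__":
    # Toy stand-in for a seq2seq trainer: the "headline" is the first three words.
    def toy_train(data: List[Example]) -> Callable[[str], Tuple[str, float]]:
        def predict(article: str) -> Tuple[str, float]:
            return " ".join(article.split()[:3]), 1.0
        return predict

    model = self_train(
        labeled=[("prime minister visits flooded region today", "pm visits flooded region")],
        unlabeled=["stock market rallies after rate decision"],
        train_fn=toy_train,
    )
    print(model("new telescope captures distant galaxy image"))

In this sketch the unlabeled articles play the role of the roughly 2,000,000 unlabeled examples: the current model labels them, the confident pairs are added to the roughly 100,000 gold pairs, and the model is retrained on the combined set.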

© 2018 The Japanese Society for Artificial Intelligence