Journal of Natural Language Processing
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
General Paper (Peer-Reviewed)
ExtraPhrase: Efficient Data Augmentation for Abstractive Summarization
Mengsay Loem, Sho Takase, Masahiro Kaneko, Naoaki Okazaki

2023 Volume 30 Issue 2 Pages 489-506

Abstract

Neural models trained on large amounts of parallel data have achieved impressive performance in abstractive summarization tasks. However, constructing large-scale parallel corpora is expensive and challenging. In this work, we introduce ExtraPhrase, a low-cost and effective strategy for augmenting training data for abstractive summarization. ExtraPhrase constructs pseudo training data with two modules: sentence compression and paraphrasing. We extract the major parts of an input text with sentence compression and obtain diverse expressions of the result with paraphrasing. Through experiments, we show that ExtraPhrase improves the performance of abstractive summarization by more than 0.50 ROUGE points over the setting without data augmentation, and that it outperforms existing methods such as back-translation and self-training. We also show that ExtraPhrase is particularly effective when the amount of genuine training data is very small, i.e., in low-resource settings. Moreover, ExtraPhrase is more cost-efficient than existing approaches.
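
The abstract outlines a two-module pipeline: compress a source sentence to its major parts, then paraphrase the compressed sentence to diversify its surface form. The following is a minimal Python sketch of that idea only; the dependency-based compression rule, the spaCy model, and the identity paraphrase stub are illustrative assumptions, not the authors' implementation, whose exact compression rules and paraphrasing model may differ.

```python
# Minimal sketch of a compress-then-paraphrase pseudo-summary pipeline.
# NOT the ExtraPhrase implementation: the compression rule (keep the root
# verb and its core dependents) and the paraphrase stub are assumptions.

import spacy

# Assumes the small English model is installed:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

# Toy choice of "core" dependency relations to retain around the root.
CORE_DEPS = {"nsubj", "nsubjpass", "dobj", "iobj", "aux", "auxpass", "neg"}

def compress(sentence: str) -> str:
    """Keep the root token plus the subtrees of its core dependents."""
    doc = nlp(sentence)
    root = next(tok for tok in doc if tok.dep_ == "ROOT")
    kept = {root}
    for child in root.children:
        if child.dep_ in CORE_DEPS:
            # Keep the whole subtree so phrases stay contiguous.
            kept.update(child.subtree)
    return " ".join(tok.text for tok in sorted(kept, key=lambda t: t.i))

def paraphrase(sentence: str) -> str:
    """Placeholder for the paraphrasing module (identity stub here)."""
    return sentence

def pseudo_summary(source_text: str) -> str:
    """Compress, then paraphrase, yielding one pseudo training target."""
    return paraphrase(compress(source_text))

if __name__ == "__main__":
    src = ("The committee, after months of heated debate, "
           "finally approved the new budget.")
    print(pseudo_summary(src))  # e.g. "The committee approved the new budget ."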

© 2023 The Association for Natural Language Processing