Journal of Natural Language Processing (自然言語処理)
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
General Paper
Effectiveness of Syntactic Dependency Information for Higher-Order Syntactic Attention Network
Hidetaka Kamigaito, Katsuhiko Hayashi, Tsutomu Hirao, Masaaki Nagata, Manabu Okumura
Free access

2021, Volume 28, Issue 2, pp. 321-349

Abstract

Recently, as a replacement for syntactic tree-based approaches such as tree trimming, Long Short-Term Memory (LSTM)-based methods have commonly been used for sentence compression because LSTMs can generate fluent compressed sentences. However, the performance of these methods degrades significantly when compressing long sentences because they do not explicitly handle long-distance dependencies between words. To solve this problem, we proposed a higher-order syntactic attention network (HiSAN) that can handle higher-order dependency features as an attention distribution over LSTM hidden states. Furthermore, to mitigate the influence of incorrect parse results, we trained HiSAN by jointly maximizing the probability of a correct output and the attention distribution. Experiments on the Google sentence compression dataset show that our method improved on the baselines in terms of F1 as well as ROUGE-1, -2, and -L scores. In subjective evaluations, HiSAN outperformed the baseline methods in both readability and informativeness. In addition, in this study we investigated the performance of HiSAN after training it without any syntactic dependency tree information. The results show that HiSAN can compress sentences without relying on syntactic dependency information while maintaining accurate compression rates, and they also demonstrate the effectiveness of syntactic dependency information for compressing long sentences, where it yields higher F1 scores.
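To make the described mechanism concrete, the sketch below is a minimal, first-order simplification of the idea (not the authors' released code; HiSAN itself uses higher-order dependency features, and all module and variable names here are illustrative assumptions). It shows attention over BiLSTM hidden states supervised with parser-given head positions, trained jointly with a per-token keep/delete compression loss, mirroring the "correct output together with the attention distribution" objective.

```python
# Hypothetical sketch of jointly supervised syntactic attention for
# deletion-based sentence compression (PyTorch). First-order only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntacticAttentionCompressor(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # BiLSTM encoder; concatenated directions give hid_dim states.
        self.lstm = nn.LSTM(emb_dim, hid_dim // 2,
                            bidirectional=True, batch_first=True)
        self.attn_q = nn.Linear(hid_dim, hid_dim)    # query projection
        self.attn_k = nn.Linear(hid_dim, hid_dim)    # key projection
        self.classifier = nn.Linear(hid_dim * 2, 2)  # keep/delete per token

    def forward(self, tokens):
        h, _ = self.lstm(self.embed(tokens))                       # (B, T, H)
        scores = self.attn_q(h) @ self.attn_k(h).transpose(1, 2)   # (B, T, T)
        attn = F.softmax(scores, dim=-1)   # each row: a head distribution
        context = attn @ h                 # attended (head-like) states
        logits = self.classifier(torch.cat([h, context], dim=-1))
        return logits, attn

def joint_loss(logits, attn, keep_labels, head_index):
    # Compression loss: keep/delete prediction for every token.
    comp = F.cross_entropy(logits.reshape(-1, 2), keep_labels.reshape(-1))
    # Attention supervision: push each token's attention toward the
    # head position given by a dependency parser.
    attn_nll = F.nll_loss(
        torch.log(attn + 1e-9).reshape(-1, attn.size(-1)),
        head_index.reshape(-1))
    return comp + attn_nll

# Toy usage with random data (shapes only; no real corpus).
model = SyntacticAttentionCompressor(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 7))
keep = torch.randint(0, 2, (2, 7))
heads = torch.randint(0, 7, (2, 7))
logits, attn = model(tokens)
loss = joint_loss(logits, attn, keep, heads)
loss.backward()
```

Dropping the attention-supervision term corresponds to the paper's ablation setting of training without syntactic dependency tree information.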

© 2021 The Association for Natural Language Processing