Journal of Natural Language Processing
Online ISSN : 2185-8314
Print ISSN : 1340-7619
ISSN-L : 1340-7619
Paper
Neural Japanese Zero Anaphora Resolution with Candidate Reduction Using Large-scale Case Frames
Souta Yamashiro, Hitoshi Nishikawa, Takenobu Tokunaga

2019 Volume 26 Issue 2 Pages 509-536

Abstract

This paper presents a model for Japanese zero anaphora resolution that handles both intra- and inter-sentential zero anaphora. Our model resolves anaphora for multiple cases simultaneously, utilising and comparing information across cases. This simultaneous resolution requires considering many combinations of antecedent candidates, which can be a crucial obstacle in both the training and resolution phases. To cope with this problem, we propose an effective candidate pruning method that uses case frame information. On a Japanese balanced corpus, we compared our model, which estimates multiple cases simultaneously with the proposed candidate pruning method, against a model that estimates each case independently without candidate reduction. The results confirmed a 0.056-point increase in accuracy. Furthermore, we confirmed that introducing a local-attention recurrent neural network improves the accuracy of inter-sentential anaphora resolution.
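To make the candidate-pruning idea concrete, the following is a minimal illustrative sketch, not the authors' implementation: a case frame records the typical semantic categories of fillers for each case of a predicate, and antecedent candidates whose category does not fit the case frame slot are pruned before resolution. The toy case frame, the category lexicon, and all names here are assumptions for illustration only.

```python
# Illustrative sketch (assumed data, not the paper's actual case frames):
# pruning zero-anaphora antecedent candidates with case-frame information.

# A toy case frame for the predicate "taberu" (to eat): typical semantic
# categories of fillers for the nominative (ga) and accusative (wo) cases.
CASE_FRAME = {
    "ga": {"person", "animal"},  # typical agents
    "wo": {"food"},              # typical patients
}

# Assumed semantic category for each candidate noun.
CATEGORY = {"Taro": "person", "apple": "food", "desk": "furniture"}

def prune_candidates(candidates, case):
    """Keep only candidates whose category is licensed by the case frame."""
    allowed = CASE_FRAME.get(case, set())
    return [c for c in candidates if CATEGORY.get(c) in allowed]

candidates = ["Taro", "apple", "desk"]
print(prune_candidates(candidates, "ga"))  # ['Taro']
print(prune_candidates(candidates, "wo"))  # ['apple']
```

In the paper's setting, pruning of this kind shrinks the space of antecedent-candidate combinations that the simultaneous multi-case model must score, which is what makes joint training and resolution tractable.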

© 2019 The Association for Natural Language Processing