Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
37th (2023)
Session ID : 2E4-GS-6-02
Evaluation of Recurrent Neural Network CCG Parser
*Sora TAGAMI, Daisuke BEKKI
Keywords: RNNG, CCG, Syntactic parser

Abstract

Deep learning models have achieved high accuracy on various natural language processing tasks, but whether these models encode the structural information of sentences remains controversial. Against this background, Recurrent Neural Network Grammars (RNNGs) were proposed as models that take syntactic structure into account. In this study, we implemented RNN-CCGs, language models that replace CFG, the grammar underlying RNNGs, with Combinatory Categorial Grammar (CCG). Compared with CFG, CCG assigns syntactic structures more appropriate for natural language and provides paths for semantic composition. Since RNNGs do not take part-of-speech tags into account, we also implemented a model that predicts the POS tags necessary for semantic composition. We compared RNN-CCGs with RNNGs with/without POS tags and evaluated their behaviours.
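To illustrate why CCG "provides paths for semantic composition", the following is a minimal sketch (not the authors' implementation) of CCG function application, the basic combinatory rules by which adjacent categories combine. Categories are represented as plain strings, and the category matching is a naive string split that assumes the outermost slash appears first; a real parser would parse categories into trees.

```python
# Hypothetical sketch of CCG function application (illustration only,
# not the RNN-CCG code described in the abstract).
# "X/Y" seeks a Y to its right; "X\\Y" seeks a Y to its left.

def forward_apply(left: str, right: str):
    """Forward application (>):  X/Y  Y  =>  X."""
    if "/" in left:
        x, y = left.split("/", 1)  # naive split at the first slash
        if y == right:
            return x
    return None

def backward_apply(left: str, right: str):
    """Backward application (<):  Y  X\\Y  =>  X."""
    if "\\" in right:
        x, y = right.split("\\", 1)
        if y == left:
            return x
    return None

# "John sleeps":  NP  S\NP  =>  S
print(backward_apply("NP", "S\\NP"))          # S
# "sees Mary":  (S\NP)/NP  NP  =>  (S\NP)
print(forward_apply("(S\\NP)/NP", "NP"))      # (S\NP)
```

Each successful application step corresponds to a semantic composition step (functor applied to argument), which is the path to meaning that CFG derivations do not directly supply.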

© 2023 The Japanese Society for Artificial Intelligence