Proceedings of the Annual Conference of JSAI
Online ISSN : 2758-7347
36th (2022)
Session ID : 3Yin2-51

A Supervised Syntactic Structure Analysis using Tree Structure Transformer
*Momoka NARITA, Tomoe TANIGUCHI, Daichi MOCHIHASHI, Ichiro KOBAYASHI
Abstract

Tree-Transformer is an unsupervised learning method that discovers the syntactic structure of input sentences using the attention mechanism of the Transformer. For syntactic structure, however, high-quality training data such as the Penn Treebank are available. Using such data, we propose a method that applies supervised learning to Tree-Transformer to parse the syntactic structure of a sentence. In particular, we propose a new hierarchical error backpropagation that is applied directly to the intermediate layers of the Transformer encoder, achieving syntactic structure parsing within the neural network framework. Through experiments, we confirmed that the proposed method is useful for syntactic structure analysis.
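The general idea of supervising intermediate encoder layers can be illustrated with a minimal NumPy sketch: each layer predicts constituent-boundary probabilities between adjacent words, and a cross-entropy loss against gold Treebank boundaries is summed over the layers. The function name, the use of binary cross-entropy, and the per-layer averaging are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def hierarchical_bce_loss(layer_break_probs, gold_breaks):
    """Hypothetical sketch of a hierarchical supervised loss.

    layer_break_probs: per-layer lists of predicted probabilities that a
        constituent boundary falls between each pair of adjacent words.
    gold_breaks: per-layer gold boundary indicators (1 = boundary) derived
        from treebank brackets (assumed preprocessing, not shown here).
    Returns the binary cross-entropy averaged over layers, so the error
    signal reaches every intermediate layer directly.
    """
    eps = 1e-9  # guard against log(0)
    total = 0.0
    for probs, gold in zip(layer_break_probs, gold_breaks):
        p = np.clip(np.asarray(probs, dtype=float), eps, 1.0 - eps)
        g = np.asarray(gold, dtype=float)
        total += -np.mean(g * np.log(p) + (1.0 - g) * np.log(1.0 - p))
    return total / len(layer_break_probs)
```

Under this sketch, confident predictions that match the gold boundaries yield a lower loss than uninformative ones, which is the property the supervised objective relies on.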

© 2022 The Japanese Society for Artificial Intelligence