Host: The Japanese Society for Artificial Intelligence
Name: The 36th Annual Conference of the Japanese Society for Artificial Intelligence
Number: 36
Location: [in Japanese]
Date: June 14, 2022 - June 17, 2022
Tree-Transformer is an unsupervised learning method that induces the syntactic structure of an input sentence using the attention mechanism of the Transformer. Meanwhile, high-quality annotated training data for syntactic structure, such as the Penn Treebank, are available. Using such data, we propose a method that trains Tree-Transformer in a supervised fashion to parse the syntactic structure of a sentence. In particular, we propose a new hierarchical error backpropagation in which the supervision signal is applied directly to the intermediate layers of the Transformer encoder, realizing syntactic structure parsing within the neural network framework. Through experiments, we confirm that the proposed method is effective for syntactic structure analysis.
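The following is a minimal PyTorch sketch of the idea the abstract describes, not the authors' actual implementation: each encoder layer emits a "link" probability for every adjacent token pair (whether the two tokens share a constituent at that level of the tree), and a per-layer loss against treebank-derived labels backpropagates errors directly into every intermediate layer. All names here (ConstituentLayer, hierarchical_loss, the label format) are hypothetical illustrations.

# Hedged sketch: supervised, per-layer training of Tree-Transformer-style
# constituent scores. Module/variable names are assumptions for illustration.
import torch
import torch.nn as nn

class ConstituentLayer(nn.Module):
    """One encoder layer that also scores each adjacent token pair (i, i+1):
    the probability that both tokens lie in the same constituent at this
    level of the parse tree."""
    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.ReLU(),
                                nn.Linear(4 * d_model, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.link_scorer = nn.Linear(2 * d_model, 1)  # scores adjacent pairs

    def forward(self, x):
        h, _ = self.attn(x, x, x)
        x = self.norm1(x + h)
        x = self.norm2(x + self.ff(x))
        # Concatenate each adjacent pair and map to a link probability.
        pairs = torch.cat([x[:, :-1, :], x[:, 1:, :]], dim=-1)
        link_prob = torch.sigmoid(self.link_scorer(pairs)).squeeze(-1)
        return x, link_prob

class SupervisedTreeEncoder(nn.Module):
    def __init__(self, vocab_size: int, d_model: int = 128, n_layers: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.layers = nn.ModuleList(
            [ConstituentLayer(d_model) for _ in range(n_layers)])

    def forward(self, tokens):
        x = self.embed(tokens)
        link_probs = []
        for layer in self.layers:
            x, p = layer(x)
            link_probs.append(p)  # one set of link probabilities per layer
        return x, link_probs

def hierarchical_loss(link_probs, gold_links):
    """Hierarchical supervision: one binary cross-entropy term per encoder
    layer, so gradients flow directly into each intermediate layer.
    gold_links[l][b, i] = 1 if tokens i and i+1 share a constituent at
    tree level l (labels assumed to be derived from the treebank parse)."""
    bce = nn.BCELoss()
    return sum(bce(p, g) for p, g in zip(link_probs, gold_links))

if __name__ == "__main__":
    # Toy usage with random data standing in for treebank-derived labels.
    model = SupervisedTreeEncoder(vocab_size=1000)
    tokens = torch.randint(0, 1000, (2, 10))            # batch of 2 sentences
    gold = [torch.randint(0, 2, (2, 9)).float() for _ in range(4)]
    _, probs = model(tokens)
    loss = hierarchical_loss(probs, gold)
    loss.backward()                                     # errors reach every layer
    print(f"total hierarchical loss: {loss.item():.4f}")

Summing one loss term per layer is what makes the supervision "hierarchical" in this sketch: shallow layers can be trained toward small constituents and deeper layers toward larger ones, rather than relying on a single loss at the final output.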