JSAI2022

Presentation information


[3Yin2] Interactive session 1

Thu. Jun 16, 2022 11:30 AM - 1:10 PM Room Y (Event Hall)

[3Yin2-51] A Supervised Syntactic Structure Analysis using Tree Structure Transformer

〇Momoka Narita1, Tomoe Taniguchi1, Daichi Mochihashi2, Ichiro Kobayashi1 (1.Ochanomizu University, 2.The Institute of Statistical Mathematics)

Keywords: Transformer, Syntax structure, Tree-Transformer

Tree-Transformer is an unsupervised learning method that finds the syntactic structure of an input sentence using the attention mechanism of the Transformer. For syntactic structures, however, good training data such as the Penn Treebank are available. Using such data, we propose a method that applies supervised learning to Tree-Transformer for parsing the syntactic structure of a sentence.
In particular, we propose a new hierarchical error backpropagation that is applied directly to the intermediate layers of the Transformer encoder, realizing syntactic structure parsing within the neural network framework.
Through experiments, we have confirmed that the proposed method is useful for syntactic structure analysis.
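The abstract does not give the concrete formulation, but the core idea of attaching a supervised loss to every intermediate layer of a Transformer encoder can be illustrated with a minimal PyTorch sketch. The names below (SupervisedTreeEncoder, hierarchical_loss, the per-layer "merge" targets) are hypothetical placeholders, not the authors' Tree-Transformer implementation; each encoder layer scores adjacent token pairs and is supervised against toy gold labels, and the per-layer losses are summed so that gradients flow directly into the intermediate layers.

```python
import torch
import torch.nn as nn

class SupervisedTreeEncoder(nn.Module):
    """Toy encoder: every layer scores whether adjacent tokens merge into a
    constituent, so supervision can be attached at each depth, not only the top."""
    def __init__(self, d_model=64, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            for _ in range(n_layers)
        )
        # one scorer per layer: logit that adjacent tokens belong to the same span
        self.scorers = nn.ModuleList(nn.Linear(2 * d_model, 1) for _ in range(n_layers))

    def forward(self, x):
        per_layer_scores = []
        h = x
        for layer, scorer in zip(self.layers, self.scorers):
            h = layer(h)
            pairs = torch.cat([h[:, :-1], h[:, 1:]], dim=-1)    # (B, T-1, 2d)
            per_layer_scores.append(scorer(pairs).squeeze(-1))  # (B, T-1)
        return per_layer_scores

def hierarchical_loss(per_layer_scores, gold_merges):
    """Sum a binary supervision loss over all intermediate layers; gold_merges[l]
    marks which adjacent pairs merge at level l of the gold tree (toy targets here)."""
    bce = nn.BCEWithLogitsLoss()
    return sum(bce(s, g) for s, g in zip(per_layer_scores, gold_merges))

# Toy usage with random inputs and random "gold tree" targets.
B, T, D, L = 2, 8, 64, 4
model = SupervisedTreeEncoder(d_model=D, n_layers=L)
x = torch.randn(B, T, D)
gold = [torch.randint(0, 2, (B, T - 1)).float() for _ in range(L)]
loss = hierarchical_loss(model(x), gold)
loss.backward()  # errors propagate directly into every encoder layer
```

In this sketch the supervision signal reaches each layer through its own scorer, which is one simple way to realize "hierarchical" error backpropagation; the paper's actual method operates on Tree-Transformer's constituent attention rather than on a separate pairwise scorer.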
