[3Yin2-51] A Supervised Syntactic Structure Analysis using Tree Structure Transformer
Keywords: Transformer, Syntactic structure, Tree-Transformer
Tree-Transformer is an unsupervised learning method that discovers the syntactic structure of an input sentence by exploiting the attention mechanism of the Transformer. For syntactic structure, however, high-quality annotated training data such as the Penn Treebank are available. Using such data, we propose a method that applies supervised learning to Tree-Transformer for parsing the syntactic structure of a sentence.
In particular, we propose a new hierarchical error backpropagation that is applied directly to the intermediate layers of the Transformer encoder, realizing syntactic structure parsing within the neural network framework.
Through experiments, we confirm that the proposed method is effective for syntactic structure analysis.
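As a rough illustration of the idea, the sketch below applies an auxiliary supervised loss to constituent-boundary scores read out of each intermediate layer of a Transformer encoder. This is a minimal sketch under assumptions: the module names (BreakScorer, TreeSupervisedEncoder, hierarchical_loss), the use of standard PyTorch encoder layers, the per-layer boundary-score readout, and the mapping of layers to tree depths are all illustrative choices, not the implementation described in the paper.

```python
# Hypothetical sketch: supervising intermediate encoder layers with
# constituent-boundary labels (e.g. derived from treebank bracketings).
import torch
import torch.nn as nn

class BreakScorer(nn.Module):
    """For each adjacent token pair in a layer's output, predicts the
    probability that a constituent boundary lies between them."""
    def __init__(self, d_model):
        super().__init__()
        self.score = nn.Linear(2 * d_model, 1)

    def forward(self, h):                      # h: (batch, seq, d_model)
        pairs = torch.cat([h[:, :-1], h[:, 1:]], dim=-1)
        return torch.sigmoid(self.score(pairs)).squeeze(-1)   # (batch, seq-1)

class TreeSupervisedEncoder(nn.Module):
    """Transformer encoder that exposes a boundary-score vector per layer,
    so a supervised loss can be attached to every intermediate layer."""
    def __init__(self, vocab_size, d_model=128, n_layers=4, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
            for _ in range(n_layers))
        self.scorers = nn.ModuleList(BreakScorer(d_model) for _ in range(n_layers))

    def forward(self, tokens):
        h = self.embed(tokens)
        breaks = []
        for layer, scorer in zip(self.layers, self.scorers):
            h = layer(h)
            breaks.append(scorer(h))           # intermediate boundary scores
        return h, breaks

def hierarchical_loss(breaks, gold_breaks):
    """Binary cross-entropy between each layer's boundary scores and the
    gold boundaries assigned to that layer (here: one label vector per
    layer, e.g. finer-grained constituents for lower layers)."""
    bce = nn.BCELoss()
    return sum(bce(b, g) for b, g in zip(breaks, gold_breaks)) / len(breaks)

if __name__ == "__main__":
    torch.manual_seed(0)
    model = TreeSupervisedEncoder(vocab_size=1000)
    tokens = torch.randint(0, 1000, (2, 8))            # batch of 2 sentences
    _, breaks = model(tokens)
    # One gold boundary vector per layer (dummy values here).
    gold = [torch.randint(0, 2, (2, 7)).float() for _ in breaks]
    loss = hierarchical_loss(breaks, gold)
    loss.backward()          # gradients reach every intermediate layer directly
    print(float(loss))
```

The point of the sketch is only the training signal path: because each layer contributes its own supervised term, errors are backpropagated into the intermediate layers directly rather than only through the final output.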