9:20 AM - 9:40 AM
[3A1-GS-6-02] Supervised Syntactic Structure Analysis using a Tree Structure Self-attention Mechanism to represent Latent Syntactic Trees
Keywords: syntactic analysis, self-attention
In this paper, we evaluate our previously proposed model, which constrains the self-attention mechanism in the Transformer encoder with a tree-structured self-attention mechanism so that it reflects the syntactic structure of the input sentence. Instead of the conventional F1-score-based evaluation, we analyze the syntactic errors in the parsing results. The error analysis showed that improving accuracy requires capturing dependencies between clauses by attending to production rules. In addition, since the encoder parameters were initialized randomly in our previous model, we conducted an additional experiment in which the encoder was pre-trained with BERT's Masked Language Modeling objective, but this did not yield effective results.
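To make the constraint concrete, the following is a minimal sketch, assuming the tree structure is imposed as an additive attention mask: each token is assigned a syntactic span and may only attend to positions inside that span. The span-based mask, the function names, and the example spans are illustrative assumptions, not the authors' exact formulation; the paper's model represents the trees latently rather than taking spans as input.

```python
# Minimal sketch (not the authors' exact formulation): constraining
# Transformer self-attention with a tree-derived mask. We assume each
# token i is assigned a syntactic span (l_i, r_i) and may only attend
# to tokens inside that span; a latent-tree model would induce the spans.
import math
import torch

def tree_attention_mask(spans: list[tuple[int, int]]) -> torch.Tensor:
    """Build an additive mask: position i may attend to j only if j
    lies inside i's syntactic span [l_i, r_i] (inclusive)."""
    n = len(spans)
    mask = torch.full((n, n), float("-inf"))
    for i, (l, r) in enumerate(spans):
        mask[i, l : r + 1] = 0.0
    return mask

def constrained_self_attention(q, k, v, mask):
    """Scaled dot-product attention with the tree mask added to the
    logits before softmax, so out-of-subtree positions get zero weight."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d) + mask
    return torch.softmax(scores, dim=-1) @ v

# Usage: 4 tokens; tokens 0-1 form one constituent, tokens 2-3 another,
# and token 3 attends over the whole sentence (spans are placeholders).
spans = [(0, 1), (0, 1), (2, 3), (0, 3)]
q = k = v = torch.randn(4, 8)
out = constrained_self_attention(q, k, v, tree_attention_mask(spans))
print(out.shape)  # torch.Size([4, 8])
```

The additive -inf mask drives out-of-subtree attention weights to zero after the softmax; in the actual model the mask would be derived from the latent tree rather than from fixed input spans.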