JSAI2023

Presentation information

General Session

[3A1-GS-6] Language media processing

Thu. Jun 8, 2023 9:00 AM - 10:40 AM Room A (Main hall)

Chair: Yuta Koreeda (Hitachi, Ltd.) [On-site]

9:20 AM - 9:40 AM

[3A1-GS-6-02] Supervised Syntactic Structure Analysis using a Tree Structure Self-attention Mechanism to represent Latent Syntactic Trees

〇Momoka Narita1, Daichi Mochihashi2, Ichiro Kobayashi1 (1. Ochanomizu University, 2. The Institute of Statistical Mathematics)

Keywords:Syntactic analysis, self-attention

In this paper, we evaluate our previously proposed model, which constrains the self-attention mechanism in the Transformer encoder with a tree-structure self-attention mechanism so that it reflects the syntactic structure of the input sentence. Instead of the conventional F1-score-based evaluation, we analyze the syntactic errors in its parsing results. The error analysis revealed that, to improve accuracy, the model must capture dependencies among clauses by attending to production rules. In addition, because the encoder parameters were initialized randomly in the previously proposed model, we conducted a further experiment that pre-trains the encoder with BERT-style Masked Language Modeling, but this did not yield an effective improvement.
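The core idea of constraining self-attention so that it reflects a syntactic tree can be sketched as an attention mask. The following is a minimal illustration, not the authors' actual model: each token is only allowed to attend to tokens inside the same (hypothetical) constituent span, which is one simple way to impose a tree-structure constraint on a Transformer-style attention layer.

```python
# Hedged sketch: span-constrained self-attention (single head, NumPy only).
# `spans` is a hypothetical constituent segmentation; the real model's
# constraints may differ.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def constrained_self_attention(X, spans):
    """X: (n, d) token embeddings; spans: list of (start, end) half-open
    constituent spans covering all n tokens. Tokens attend only within
    their own span."""
    n, d = X.shape
    mask = np.full((n, n), -np.inf)  # -inf score => zero attention weight
    for s, e in spans:
        mask[s:e, s:e] = 0.0         # allow attention inside each constituent
    scores = X @ X.T / np.sqrt(d) + mask
    A = softmax(scores, axis=-1)     # rows sum to 1 over the allowed span
    return A @ X, A

# Toy example: 4 tokens split into two constituents, [0,2) and [2,4).
X = np.eye(4)
out, A = constrained_self_attention(X, [(0, 2), (2, 4)])
# Attention weights across constituent boundaries are exactly zero.
assert A[0, 2] == 0.0 and A[3, 1] == 0.0
```

In a full model this mask would be applied per head inside each encoder layer, and the spans would come from the latent syntactic tree rather than being given by hand.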
