1:50 PM - 2:10 PM
[2E4-GS-6-02] Evaluation of Recurrent Neural Network CCG Parser
Keywords: RNNG, CCG, syntactic parser
Deep learning models have achieved high accuracy on various natural language processing tasks, but it remains controversial whether these models encode the structural information of sentences. In this context, Recurrent Neural Network Grammars (RNNGs) were proposed as a model that explicitly considers syntactic structure. In this study, we implemented RNN-CCGs, language models that replace CFG, the grammar underlying RNNGs, with Combinatory Categorial Grammar (CCG). Compared to CFG, CCG assigns syntactic structures better suited to natural language and provides a path for semantic composition. Since RNNGs do not consider part-of-speech tags, we also implemented a model that predicts the POS tags necessary for semantic composition. We compared RNN-CCGs against RNNGs with and without POS tags and evaluated their behaviors.
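As background for how CCG supports semantic composition, the following is a minimal sketch (not the paper's implementation) of the two basic CCG combination rules, forward and backward application, using standard CCG category notation; the parenthesis handling is deliberately simplistic and only covers the examples shown.

```python
# Minimal sketch of CCG category combination (illustrative only; not the
# RNN-CCG implementation described in the paper).

def _strip_outer_parens(cat):
    """Drop one pair of wrapping parentheses, e.g. '(S\\NP)' -> 'S\\NP'.
    Naive: assumes the whole category is wrapped, which holds here."""
    if cat.startswith("(") and cat.endswith(")"):
        return cat[1:-1]
    return cat

def forward_apply(left, right):
    """Forward application (>): X/Y combined with Y yields X."""
    suffix = "/" + right
    if left.endswith(suffix):
        return _strip_outer_parens(left[: -len(suffix)])
    return None  # rule does not apply

def backward_apply(left, right):
    """Backward application (<): Y combined with X\\Y yields X."""
    suffix = "\\" + left
    if right.endswith(suffix):
        return _strip_outer_parens(right[: -len(suffix)])
    return None  # rule does not apply

# "John sleeps": NP combines with S\NP by backward application, yielding S.
print(backward_apply("NP", "S\\NP"))          # S
# "sees Mary": (S\NP)/NP combines with NP, yielding the verb phrase S\NP.
print(forward_apply("(S\\NP)/NP", "NP"))      # S\NP
```

Each successful application also corresponds to a function application step in the semantics, which is the sense in which a CCG derivation provides a path of semantic composition.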