JSAI2020

Presentation information

Interactive Session

[3Rin4] Interactive 1

Thu. Jun 11, 2020 1:40 PM - 3:20 PM Room R01 (jsai2020online-2-33)

[3Rin4-39] Further Pretraining BERT for Causality Existence Classification in Financial Domain

〇Yuta Niki1, Hiroki Sakaji1, Kiyoshi Izumi1, Hiroyasu Matsushima1 (1.School of Engineering, The University of Tokyo)

Keywords: Causal Extraction, Natural Language Processing, Language Resources

In this research, we propose J-FinBERT, a model obtained by further pretraining BERT on a Japanese financial corpus, to classify whether causality is present in financial texts. We verify the effect of further pretraining on prediction accuracy and on robustness to small datasets and noisy labels in our task. Our experiments show that further pretraining on a domain-specific corpus improves prediction accuracy and robustness to small datasets in our task, but does not improve robustness to noisy labels.
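The two-stage recipe the abstract describes (domain-adaptive masked-language-model pretraining on a financial corpus, then fine-tuning for binary causality-existence classification) can be sketched with the Hugging Face `transformers` library. This is a minimal illustration under stated assumptions: the corpus path, the `cl-tohoku/bert-base-japanese` base model, and all hyperparameters are illustrative choices, not the authors' actual settings.

```python
# Sketch of "further pretraining" BERT on a domain corpus, then
# fine-tuning for causality-existence classification.
# Assumptions: file paths, model ID, and hyperparameters are
# illustrative; the paper's actual configuration may differ.

def chunk_sentences(sentences, max_chars=200):
    """Pack consecutive sentences into chunks of at most max_chars
    characters, a simple way to build MLM training examples from a
    raw domain corpus."""
    chunks, buf = [], ""
    for s in sentences:
        if buf and len(buf) + len(s) > max_chars:
            chunks.append(buf)
            buf = s
        else:
            buf += s
    if buf:
        chunks.append(buf)
    return chunks


def main():
    # Heavy dependencies are imported here so the helper above
    # stays testable without downloading any model weights.
    from transformers import (
        AutoTokenizer,
        BertForMaskedLM,
        BertForSequenceClassification,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )

    base_model = "cl-tohoku/bert-base-japanese"  # assumed base checkpoint
    tokenizer = AutoTokenizer.from_pretrained(base_model)

    # --- Stage 1: further pretraining with masked language modeling ---
    with open("financial_corpus.txt", encoding="utf-8") as f:  # assumed path
        chunks = chunk_sentences(f.readlines())
    mlm_dataset = [tokenizer(c, truncation=True, max_length=512) for c in chunks]

    mlm_model = BertForMaskedLM.from_pretrained(base_model)
    collator = DataCollatorForLanguageModeling(tokenizer, mlm=True, mlm_probability=0.15)
    Trainer(
        model=mlm_model,
        args=TrainingArguments(output_dir="j-finbert-mlm", num_train_epochs=1),
        train_dataset=mlm_dataset,
        data_collator=collator,
    ).train()
    mlm_model.save_pretrained("j-finbert-mlm")

    # --- Stage 2: fine-tune the adapted encoder for binary classification ---
    clf = BertForSequenceClassification.from_pretrained("j-finbert-mlm", num_labels=2)
    # ... fine-tune `clf` on (sentence, causality-exists) pairs as usual.


if __name__ == "__main__":
    main()
```

The point of the split is that stage 2 starts from the domain-adapted weights rather than the generic checkpoint, which is what the abstract credits for the accuracy and small-data robustness gains.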
