JSAI2024

Presentation information

Poster Session


[4Xin2] Poster session 2

Fri. May 31, 2024 12:00 PM - 1:40 PM Room X (Event hall 1)

[4Xin2-45] Conceptual Knowledge in the Pretrained Japanese BERT Model

〇Ryuichi Watanabe1 (1.Kyoto University)

Keywords: AI, NLP, transformer

In this paper, we conduct experiments to identify neurons that encode conceptual knowledge in the feed-forward network (FFN) layers of a Japanese BERT model. Specifically, we locate weight parameters (neurons) in the pre-trained Japanese BERT model that are important for masked language modeling (MLM) tasks, and confirm their locations and significance through comparative experiments. We also compare our results with prior studies on a pre-trained English BERT model, showing that the experimental outcomes differ under certain conditions.
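The abstract does not specify the attribution method, so as an illustration only, here is a minimal NumPy sketch of the kind of neuron-importance scoring commonly used in knowledge-neuron studies: integrated gradients over the FFN intermediate activations with respect to the target token's MLM probability. The toy two-layer FFN, the weights, and all function names below are hypothetical assumptions, not the author's actual setup or a real BERT model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for one transformer FFN block feeding an MLM head:
# x -> h = relu(W1 x) -> probs = softmax(W2 h).
# Dimensions are illustrative, far smaller than a real BERT.
d_model, d_ff, vocab = 8, 16, 10
W1 = rng.normal(size=(d_ff, d_model))
W2 = rng.normal(size=(vocab, d_ff))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def neuron_attribution(x, target, steps=200):
    """Integrated-gradients score of each FFN hidden unit for the
    target token's probability, scaling h linearly from 0 to its value."""
    h = np.maximum(W1 @ x, 0.0)
    grad_sum = np.zeros(d_ff)
    for k in range(1, steps + 1):
        alpha = (k - 0.5) / steps          # midpoint rule on the path
        p = softmax(W2 @ (alpha * h))
        # d p[target] / d h_i at the scaled activation alpha * h
        grad_sum += p[target] * (W2[target] - p @ W2)
    return h * grad_sum / steps

x = rng.normal(size=d_model)
attr = neuron_attribution(x, target=3)
# Neurons with the largest attribution are the candidate "knowledge neurons".
top_neurons = np.argsort(-np.abs(attr))[:5]
```

A useful sanity check on such scores is the completeness property of integrated gradients: the attributions sum (approximately) to the change in the target probability between the zero baseline and the actual activation.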
