JSAI2022

Presentation information

Organized Session » OS-10

[1N4-OS-10a] Prospects for Integrated System 1 + System 2 AI (1/2)

Tue. Jun 14, 2022 2:20 PM - 4:00 PM Room N (Room 501)

Organizers: Satoshi Kurihara (Keio University) [on-site], Hiroshi Yamakawa (The Whole Brain Architecture Initiative), Youichiro Miyake (Square Enix)

3:20 PM - 3:40 PM

[1N4-OS-10a-04] Construction of System 2 by human-in-the-loop based on question generation driven by System 1

〇Ayame Shimizu1, Kei Wakabayashi1, Masaki Matsubara1, Hiroyoshi Ito1, Atsuyuki Morishima1 (1. University of Tsukuba)

Keywords: Knowledge Distillation, Explainable AI, Crowdsourcing, Human-in-the-loop

Many processes within machine learning models are a black box, and in most cases their inferences cannot be explained in a way humans can understand. This becomes a serious problem when deploying machine learning in domains that require accountability. Wan et al. proposed NBDT, a method that derives from a deep multi-class classification model a tree-structured classifier whose nodes are binary classifiers; however, it remains unclear what kind of decision each node represents. In this work, we propose a method to reconstruct a machine learning model whose decision features can be explained in natural language, by incorporating the tree-structured model constructed by NBDT into a human-in-the-loop process. The proposed method extracts only the nodes whose judgments are clear to humans and reconstructs a transparent machine learning model based on human annotations. Through crowdsourcing experiments, we show that it is possible to build machine learning models based on human-interpretable judgments expressed in natural language.
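
The following is a minimal sketch of the node-extraction step described in the abstract, not the authors' implementation. All names here (TreeNode, collect_annotations, extract_interpretable_subtree) are hypothetical stand-ins, and both the NBDT-induced hierarchy and the crowdsourced annotation step are mocked; the sketch only illustrates how opaque nodes could be pruned while human-describable ones are kept.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TreeNode:
    # One binary decision node in an NBDT-style induced hierarchy.
    name: str
    children: List["TreeNode"] = field(default_factory=list)
    description: Optional[str] = None  # natural-language decision, from workers

def collect_annotations(node: TreeNode) -> Optional[str]:
    # Stand-in for the crowdsourcing step: workers try to phrase the decision
    # this node makes; None means no interpretable description was obtained.
    mock_answers = {
        "root": "Is the object a living creature?",
        "animals": "Does it have wings?",
    }
    return mock_answers.get(node.name)

def extract_interpretable_subtree(node: TreeNode) -> Optional[TreeNode]:
    # Keep only nodes whose decision humans could describe; opaque internal
    # nodes are pruned, leaving a transparent (possibly shallower) tree model.
    description = collect_annotations(node)
    if node.children and description is None:
        return None
    kept = [c for c in map(extract_interpretable_subtree, node.children) if c]
    return TreeNode(node.name, kept, description)

if __name__ == "__main__":
    # Toy hierarchy standing in for a tree induced from a multi-class model.
    tree = TreeNode("root", [
        TreeNode("animals", [TreeNode("bird"), TreeNode("dog")]),
        TreeNode("latent_17", [TreeNode("car"), TreeNode("boat")]),  # opaque node
    ])
    print(extract_interpretable_subtree(tree))

Running the sketch keeps the "animals" branch, whose decision workers could describe, and drops the opaque "latent_17" branch, yielding a smaller tree whose every node carries a natural-language justification.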
