2:20 PM - 2:40 PM
[4T3-OS-6d-02] Developing a Method for Dynamically Deciding Emphasis Points in Human-XAI Interaction
Keywords: Explainable AI, User model, Decision support system, Human-XAI interaction
The development of explainable AI (XAI) and large language models has enabled AI-based systems to generate various kinds of explanations for AI predictions. This paper considers the problem of dynamically deciding how to present such explanations to human users for successful decision support. Specifically, it proposes a prototype method by which an AI-based system determines which explanation to emphasize. The method predicts how explanations, with or without emphasis, affect user decisions, and guides users toward AI-recommended decisions by switching emphasis. We implemented the proposed method in a stock trading simulator supported by a stock price prediction AI and discuss the method's potential.
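The core idea described above (predict the effect of each emphasis choice on the user's decision, then pick the choice most likely to lead to the AI-recommended decision) can be sketched as follows. This is a minimal illustrative sketch only: the function names, the toy additive user model, and the trust scores are assumptions for demonstration, not the authors' actual method.

```python
from typing import Optional

def predicted_follow_prob(emphasized: Optional[str],
                          user_trust: dict[str, float]) -> float:
    """Toy user model (assumed for illustration): the probability that the
    user follows the AI recommendation is a base rate plus the user's
    trust in the emphasized explanation, capped at 1.0."""
    base = 0.5
    if emphasized is None:
        return base
    return min(1.0, base + user_trust.get(emphasized, 0.0))

def choose_emphasis(explanations: list[str],
                    user_trust: dict[str, float]) -> Optional[str]:
    """Pick the emphasis option (or no emphasis, None) that maximizes the
    predicted probability of the user adopting the AI-recommended decision."""
    candidates: list[Optional[str]] = [None] + explanations
    return max(candidates,
               key=lambda e: predicted_follow_prob(e, user_trust))

# Hypothetical stock-trading example: two candidate explanations with
# assumed trust scores for a particular user.
trust = {"price_trend": 0.3, "news_sentiment": 0.1}
print(choose_emphasis(["price_trend", "news_sentiment"], trust))
# → price_trend
```

In a richer implementation the user model would be learned from interaction history rather than fixed, and emphasis could be re-decided at each interaction step.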