9:40 AM - 10:00 AM
[4H1-GS-11b-03] Toward criticizable system to support personally critical decisions
Keywords: Decision support system, AI Ethics, Personally important decisions
As machine learning is widely used to assist everyday decisions, an increasing number of people are affected by the outputs of applications such as search engines and recommender systems. At the same time, it has been pointed out that these applications bias the world as perceived by individuals, through effects known as the filter bubble or the social echo chamber. Biasing users' perception is an ethical issue for personally important decisions that shape a person's future path, such as choosing a place of employment or a course of study at university, because the technology then decides or distorts the possibilities open to that person. However, there are as yet no appropriate ways to support individuals in such personally important decisions. In this paper, we discuss what needs to be explored to develop systems that support personally important decisions, in terms of decision strategies, the mapping of alternatives, and supporting interactions. We conclude that we need systems whose results can be criticized and overturned by their users, so that users can expand the choices available to them in their lives. Finally, we suggest a list of research agendas toward developing criticizable decision support systems.