JSAI2024

Presentation information

Organized Session


[2R6-OS-13a] OS-13

Wed. May 29, 2024 5:30 PM - 7:10 PM Room R (Room 51)

Organizers: Genki Sakai (Nihon University), Shogo Okada (Japan Advanced Institute of Science and Technology), Masahide Yuasa (Shonan Institute of Technology), Kazuaki Kondo (Kyoto University), Kei Shimonishi (Kyoto University)

6:10 PM - 6:30 PM

[2R6-OS-13a-03] Recognition of Others' Intention between Remote Locations through Autonomous Eye-tracking motion

〇Hibiki Ikoma1, Yugo Takeuchi1 (1. Shizuoka University)

Keywords: Telepresence Avatar Robots, Eye Tracking, Autonomous Motion, Intention Estimation, Joint Attention

COVID-19 and technological developments have increased opportunities for remote communication, such as online meetings. This has drawn attention to "telepresence avatar robots," a technology that makes people feel as if they share the same place even when they are in remote locations. Telepresence avatar robots support communication by creating a realistic sense of conversation through technologies such as camera motion and eye contact with the speaker. However, this approach alone does not enable smooth communication: less information is transmitted from the remote location than in face-to-face communication, so the robot operator cannot recognize the intention of the remote worker. In this study, we focused on gaze-following motion, an unconscious human behavior, and implemented it in a robot to verify whether the robot operator can recognize a remote worker's intention through this autonomous motion. The experimental results indicated that the robot operator may have recognized the remote worker's intention through the autonomous motion, suggesting that this approach may be a useful method for achieving smooth communication.
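To illustrate the general idea of driving autonomous robot motion from eye tracking, the sketch below maps a remote worker's normalized gaze coordinates to pan/tilt angles for the robot's camera, with simple smoothing so the motion appears natural. This is a minimal illustration only, not the authors' implementation; the GazeSample type, the gaze_to_pan_tilt and smooth functions, and the angle ranges are all hypothetical assumptions, and a real system would replace the simulated samples and the print statement with an eye-tracker feed and a robot pan/tilt command interface.

# Minimal sketch (hypothetical, not the paper's implementation): converting
# a remote worker's eye-tracking data into autonomous pan/tilt motion of a
# telepresence robot's camera, so the operator can follow the worker's gaze.

from dataclasses import dataclass


@dataclass
class GazeSample:
    x: float  # normalized horizontal gaze position on the remote display, 0.0-1.0
    y: float  # normalized vertical gaze position on the remote display, 0.0-1.0


def gaze_to_pan_tilt(gaze: GazeSample,
                     pan_range_deg: float = 60.0,
                     tilt_range_deg: float = 40.0) -> tuple[float, float]:
    """Convert a normalized gaze point into pan/tilt angles in degrees.

    The center of the display (0.5, 0.5) maps to (0, 0); the edges map to
    half of the configured range in each direction.
    """
    pan = (gaze.x - 0.5) * pan_range_deg
    tilt = (0.5 - gaze.y) * tilt_range_deg  # screen y grows downward
    return pan, tilt


def smooth(prev: float, target: float, alpha: float = 0.2) -> float:
    """Exponential smoothing so the robot's motion is gradual, not jittery."""
    return prev + alpha * (target - prev)


if __name__ == "__main__":
    # Simulated gaze samples: the worker's gaze drifts toward the right edge
    # of the screen, e.g. toward an object they want the operator to notice.
    samples = [GazeSample(0.5, 0.5), GazeSample(0.7, 0.45), GazeSample(0.9, 0.4)]
    pan, tilt = 0.0, 0.0
    for s in samples:
        target_pan, target_tilt = gaze_to_pan_tilt(s)
        pan, tilt = smooth(pan, target_pan), smooth(tilt, target_tilt)
        # A real system would send these angles to the robot's pan/tilt unit here.
        print(f"pan={pan:+.1f} deg, tilt={tilt:+.1f} deg")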
