[3Win5-55] Classification of Emotional Valence Based on Physiological Signals in Conversations Using a Time-Series Model
Keywords: Affective Computing, Dialogue System, Physiological Signals
Accurately and continuously recognizing users' emotional states in real time is essential for enabling a dialogue system to adapt to users flexibly. This becomes particularly challenging when speech and linguistic cues are unavailable, such as when the user is not taking a turn; in such cases, non-verbal information becomes crucial for understanding user emotions. In this study, we aimed to develop a model that classifies users' emotional valence during conversations in real time using physiological signals. Specifically, we utilized multimodal dialogue data, including physiological signals, collected in our previous research. We attempted to build a model that estimates the user's emotional valence, categorized as positive or negative, by applying a time-series model to arbitrary segments of physiological signals (EDA, BVP, and PPG) recorded during the dialogue. Experimental results demonstrated that integrating multiple physiological signals enhances emotion estimation performance, highlighting the potential of physiological data to improve real-time emotion recognition in dialogue systems.
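The abstract does not specify how segments are formed or how the modalities are combined. As a minimal sketch of the general approach, the snippet below segments each signal stream into fixed-length overlapping windows and concatenates simple per-window statistics across modalities; the window length, hop, sampling rates, and statistics are all hypothetical stand-ins, and in the actual work the fused segments would feed a learned time-series classifier rather than hand-crafted features.

```python
import numpy as np

def segment_signal(x, fs, win_s, hop_s):
    """Slice a 1-D signal into overlapping fixed-length windows."""
    win, hop = int(win_s * fs), int(hop_s * fs)
    n = 1 + max(0, (len(x) - win) // hop)
    return np.stack([x[i * hop : i * hop + win] for i in range(n)])

def window_features(segments):
    """Per-window mean and std as placeholder features (not the paper's)."""
    return np.stack([segments.mean(axis=1), segments.std(axis=1)], axis=1)

# Hypothetical sampling rates (Hz) for each modality.
fs = {"EDA": 4, "BVP": 64, "PPG": 64}
rng = np.random.default_rng(0)
signals = {name: rng.standard_normal(30 * f) for name, f in fs.items()}  # 30 s of data

# Segment every modality with a 5 s window and 1 s hop, then fuse per-window features.
feats = [window_features(segment_signal(signals[m], fs[m], 5.0, 1.0)) for m in fs]
fused = np.concatenate(feats, axis=1)  # shape: (num_windows, num_modalities * 2)
print(fused.shape)  # → (26, 6)
```

Because all streams cover the same 30 s and are windowed by duration rather than sample count, each modality yields the same number of windows, so fusion is a simple feature concatenation per window.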