2:10 PM - 2:30 PM
[2U4-IS-2c-03] Emotion-guided Multi-modal Fusion for Personality Traits Recognition
[[Online, Regular]]
Keywords: Deep Learning, Multi-Modal
Multi-modal personality traits recognition identifies human personality traits in order to improve the quality of human-computer interaction, and it has attracted increasing attention in recent years. However, current methods fail to remove noise and cannot align features from different modalities during fusion. To address these problems, we propose an emotion-guided multi-modal fusion framework for personality traits recognition. Inspired by the close relationship between emotion and personality, we design a novel emotion-guided multi-modal fusion mechanism, which enhances emotion-related features through emotion-level alignment and pays less attention to irrelevant features in order to remove noise. Extensive experiments demonstrate the effectiveness and robustness of our model.
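To make the idea of emotion-guided fusion concrete, the sketch below shows one plausible way such a mechanism could be implemented: a pooled emotion embedding acts as the attention query over stacked modality tokens, so emotion-related features are up-weighted while emotion-irrelevant (noisy) features receive less attention. This is a minimal illustration under our own assumptions, not the authors' implementation; the module, dimensions, and the `EmotionGuidedFusion` name are hypothetical.

```python
# Hypothetical sketch of emotion-guided multi-modal fusion (not the paper's code).
import torch
import torch.nn as nn

class EmotionGuidedFusion(nn.Module):
    def __init__(self, dim=256, num_heads=4):
        super().__init__()
        # Cross-modal attention: the emotion feature queries the modality tokens,
        # so attention weights emphasize emotion-related features.
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)
        self.head = nn.Linear(dim, 5)  # e.g. Big-Five trait scores (assumed output)

    def forward(self, emotion_feat, modality_feats):
        # emotion_feat:   (B, 1, D) pooled emotion representation
        # modality_feats: (B, T, D) stacked audio/visual/text features
        fused, attn_weights = self.attn(query=emotion_feat,
                                        key=modality_feats,
                                        value=modality_feats)
        fused = self.norm(fused + emotion_feat)  # residual connection
        return self.head(fused.squeeze(1)), attn_weights

# Toy usage with random tensors
model = EmotionGuidedFusion()
emo = torch.randn(2, 1, 256)   # one emotion embedding per sample
mods = torch.randn(2, 3, 256)  # e.g. one token each for audio, visual, text
traits, weights = model(emo, mods)
print(traits.shape)            # torch.Size([2, 5])
```

The attention weights returned here would indicate, per sample, how strongly each modality token contributes to the fused representation, which is one way the "remove noise" behavior described in the abstract could be realized.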