
RESEARCH PRODUCT

Exploiting Correlation between Body Gestures and Spoken Sentences for Real-time Emotion Recognition

Giovanni Pilato, Fabrizio Milazzo, Agnese Augello, Antonio Gentile, Salvatore Sorce, Vito Gentile

Subject

Settore ING-INF/05 - Sistemi di Elaborazione delle Informazioni; Settore INF/01 - Informatica; Emotion Recognition; Gesture; Dynamic Time Warping; K-nearest neighbor; Speech recognition; Natural language processing; Artificial intelligence; Ground truth; Correlation; Motion (physics); Psychology

Description

Humans communicate their affective states through different media, both verbal and non-verbal, often used at the same time. Knowledge of the emotional state plays a key role in providing personalized and context-related information and services. This is the main reason why several algorithms for automatic emotion recognition have been proposed in recent years. In this work, we exploit the correlation between one's affective state and the simultaneous body expressions in terms of speech and gestures, and we propose a system for real-time emotion recognition from gestures. In a first step, the system builds a trusted dataset of association pairs (motion data -> emotion pattern), also based on textual information. This dataset serves as the ground truth for a further step, in which emotion patterns are extracted from new, unclassified gestures. Experimental results demonstrate good recognition accuracy and the real-time capabilities of the proposed system.
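The abstract does not detail the classification step; the subject keywords, however, list Dynamic Time Warping and K-nearest neighbor. The sketch below is only an illustrative assumption of how such a combination could label an unclassified gesture against a trusted dataset of (motion data -> emotion pattern) pairs; all names (dtw_distance, classify_gesture, the toy dataset) are hypothetical and not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): label an unclassified gesture
# by comparing its motion sequence against a labeled dataset using
# Dynamic Time Warping (DTW) distance and a k-nearest-neighbor vote.
import numpy as np
from collections import Counter


def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """DTW distance between two motion sequences of shape (T, D)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])


def classify_gesture(query, labeled_gestures, k=3):
    """Return the majority emotion label among the k DTW-nearest gestures."""
    distances = sorted(
        (dtw_distance(query, seq), label) for seq, label in labeled_gestures
    )
    votes = Counter(label for _, label in distances[:k])
    return votes.most_common(1)[0][0]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-in for the "trusted dataset" of (motion data, emotion) pairs.
    dataset = [
        (rng.normal(0.0, 0.1, size=(40, 3)), "neutral"),
        (rng.normal(1.0, 0.1, size=(55, 3)), "joy"),
        (rng.normal(-1.0, 0.1, size=(35, 3)), "sadness"),
    ]
    query = rng.normal(1.0, 0.1, size=(50, 3))  # new, unclassified gesture
    print(classify_gesture(query, dataset, k=1))  # expected: "joy"
```

DTW is used here because gesture recordings differ in length and speed; it aligns sequences before measuring distance, which a plain Euclidean comparison of fixed-length vectors cannot do.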

DOI: 10.1145/3125571.3125590
https://dx.doi.org/10.1145/3125571.3125590