Search results for "Facial Expression"
Showing 10 of 132 documents
UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress
2021
As humans, we experience social stress in countless everyday-life situations. Giving a speech in front of an audience, passing a job interview, and similar experiences all lead us through stress states that affect us both psychologically and physiologically. Studying the link between stress and physiological responses has therefore become an important societal issue, and research in this field has recently grown in popularity. However, publicly available datasets have limitations. In this article, we propose a new dataset, UBFC-Phys, collected with and without contact from participants experiencing social stress situations. A wristband was used to measure contact blood volume pulse (BVP…
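The wristband BVP signal mentioned above is typically reduced to heart-rate estimates before any stress analysis; a minimal Python sketch of that step is shown below, assuming a 64 Hz single-channel trace, simple band-pass filtering, and peak detection. The sampling rate and signal layout are illustrative assumptions, not the documented UBFC-Phys format.

```python
# Hedged sketch: estimate mean heart rate from a wristband BVP trace.
# The 64 Hz sampling rate and single-channel layout are assumptions for
# illustration, not the documented UBFC-Phys format.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 64.0  # assumed BVP sampling rate in Hz

def estimate_heart_rate(bvp: np.ndarray, fs: float = FS) -> float:
    """Return mean heart rate (bpm) from a raw BVP signal."""
    # Band-pass around plausible cardiac frequencies (0.7-3.5 Hz, ~42-210 bpm).
    b, a = butter(3, [0.7 / (fs / 2), 3.5 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, bvp)
    # Systolic peaks assumed at least ~0.3 s apart (caps HR at ~200 bpm).
    peaks, _ = find_peaks(filtered, distance=int(0.3 * fs))
    ibi = np.diff(peaks) / fs          # inter-beat intervals in seconds
    return 60.0 / ibi.mean()           # beats per minute

if __name__ == "__main__":
    # Synthetic 60 s pulse-like signal at ~75 bpm for a quick self-check.
    t = np.arange(0, 60, 1 / FS)
    synthetic = np.sin(2 * np.pi * 1.25 * t) + 0.05 * np.random.randn(t.size)
    print(f"estimated HR: {estimate_heart_rate(synthetic):.1f} bpm")
```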
When the smile is a cue to familiarity
2000
The question addressed in the two following experiments concerns the effect of facial expressions on face recognition. Famous and unknown faces with a neutral or smiling expression were presented for different inspection durations (15 ms vs 1000 ms). Subjects had to categorize these faces as famous or unknown (Experiment 1), or estimate their degree of familiarity on a rating scale (Experiment 2). Results showed that the smile increased familiarity ratings for unfamiliar faces (Experiments 1 and 2) and for famous faces (Experiment 2). These data are discussed in the framework of current face-recognition models and are interpreted in terms of the social value of the smil…
Automated Characterization of Mouth Activity for Stress and Anxiety Assessment
2016
Non-verbal information portrayed by human facial expressions encompasses, apart from emotional cues, information relevant to psychophysical status. Mouth activity in particular has been found to correlate with signs of several conditions: depressed people smile less, while fatigued people yawn more. In this paper, we present a semi-automated, robust and efficient algorithm for extracting mouth activity from video recordings based on Eigen-features and template matching. The algorithm was evaluated for mouth openings and mouth deformations on a minimum-specification dataset of 640×480 resolution and 15 fps. The extracted features were the signals of mouth expa…
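Since the snippet only names the building blocks (Eigen-features and template matching), here is a hedged sketch of the template-matching half with OpenCV: correlating each frame's mouth region against a reference patch to produce a per-frame activity signal. The ROI coordinates and the choice of the first frame as the neutral template are placeholders for illustration, not the authors' pipeline.

```python
# Hedged sketch of template matching for a per-frame mouth-activity signal.
# ROI coordinates and the use of the first frame as the neutral template are
# illustrative placeholders, not the paper's Eigen-feature pipeline.
import cv2
import numpy as np

def mouth_activity(video_path: str, roi=(200, 300, 240, 160)) -> np.ndarray:
    """Return one score per frame: 1 - max normalized correlation with the
    reference mouth patch (higher = more mouth opening/deformation)."""
    x, y, w, h = roi                       # placeholder mouth ROI on a 640x480 frame
    cap = cv2.VideoCapture(video_path)
    scores, template = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        patch = gray[y:y + h, x:x + w]
        if template is None:
            template = patch               # neutral mouth taken from the first frame
            scores.append(0.0)
            continue
        corr = cv2.matchTemplate(patch, template, cv2.TM_CCOEFF_NORMED).max()
        scores.append(1.0 - float(corr))   # low correlation -> mouth has changed
    cap.release()
    return np.asarray(scores)
```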
Discrimination of facial expressions and the olfactory environment - Brain correlates in electroencephalography (EEG) in the adult and the very young…
2019
This thesis examines the mechanisms underlying the perception of emotional facial expressions and their early development using a Fast Periodic Visual Presentation (FPVS) approach coupled with electroencephalography (EEG). More specifically, we sought to characterize brain responses reflecting facial expression discrimination and to determine whether hedonic odor contexts influence these responses in adults (studies 1 and 2) and in infants at different developmental stages (studies 3 and 4). We showed specific responses to the discrimination of every facial expression in the adult brain, indicating rapid and automatic categorization of basic facial expressions (study 1). In addition, we rev…
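In FPVS designs, expression discrimination is typically read out as an EEG response at the oddball (expression-change) frequency and its harmonics in the amplitude spectrum. The numpy sketch below computes such a response as a signal-to-noise ratio against neighbouring frequency bins; the sampling rate, the 1.2 Hz oddball frequency, and the size of the noise window are generic FPVS conventions assumed for illustration, not the thesis's exact parameters.

```python
# Hedged sketch: read out an FPVS oddball response from one EEG channel.
# The sampling rate, 1.2 Hz oddball frequency, and neighbouring-bin noise
# estimate are generic FPVS conventions, not the thesis's exact values.
import numpy as np

def oddball_snr(eeg: np.ndarray, fs: float = 512.0, f_odd: float = 1.2,
                n_neighbours: int = 10) -> float:
    """Amplitude at the oddball frequency divided by the mean amplitude of
    surrounding frequency bins (signal-to-noise ratio)."""
    amp = np.abs(np.fft.rfft(eeg)) / eeg.size
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / fs)
    k = int(np.argmin(np.abs(freqs - f_odd)))   # bin closest to the oddball frequency
    # Noise estimate from surrounding bins, skipping the two bins adjacent to k.
    neigh = np.r_[amp[k - n_neighbours - 1:k - 1],
                  amp[k + 2:k + n_neighbours + 2]]
    return amp[k] / neigh.mean()
```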
Visual exploration of face and facial expression in infancy: A qualitative approach of cognitive and social development
2016
This article offers a methodological consideration of the use of "head-free" eye-tracking systems, which have made it possible to extend this technique to the study of infant skills. It explores how technological developments enable a more qualitative approach, offering the possibility of considering "how", in addition to "how long", the infant looks at a visual scene, especially when that scene is a face.
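To make the distinction between "how long" and "how" concrete, the hedged sketch below derives per-area dwell times and the sequence of areas visited from raw gaze samples. The AOI rectangles and the (timestamp, x, y) sample format are invented for illustration and are not tied to any particular eye-tracker export.

```python
# Hedged sketch: dwell time ("how long") and scan sequence ("how") per
# area of interest (AOI). AOI boxes and the (t, x, y) sample format are
# illustrative assumptions, not a specific eye-tracker's export format.
from typing import Dict, List, Tuple

# AOI name -> (x_min, y_min, x_max, y_max) in screen pixels (invented values)
AOIS = {"eyes": (300, 200, 700, 320), "mouth": (380, 430, 620, 520)}

def analyse_gaze(samples: List[Tuple[float, float, float]],
                 aois: Dict[str, Tuple[int, int, int, int]] = AOIS):
    """samples: list of (timestamp_s, gaze_x, gaze_y) tuples, time-ordered."""
    dwell = {name: 0.0 for name in aois}
    sequence: List[str] = []
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                dwell[name] += t1 - t0               # "how long"
                if not sequence or sequence[-1] != name:
                    sequence.append(name)            # transitions describe "how"
                break
    return dwell, sequence
```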
Age, gender, and puberty influence the development of facial emotion recognition
2015
Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children’s ability to recognise simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6-16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modelled these cross-sectional data in terms of competence in accurate recogn…
Source unreliability decreases but does not cancel the impact of social information on metacognitive evaluations
2015
Through metacognitive evaluations, individuals assess their own cognitive operations with respect to their current goals. We have previously shown that non-verbal social cues spontaneously influence these evaluations, even when the cues are unreliable. Here, we explore whether a belief about the reliability of the source can modulate this form of social impact. Participants performed a two-alternative forced-choice task that varied in difficulty. The task was followed by a video of a person who was presented as being either competent or incompetent at performing the task. That person provided random feedback to the participant through facial expressions indicating ag…
Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa
2018
To assess how the perception of audible speech and facial expressions influence one another in emotion perception, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-year-olds), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar cate…
The processing of facial expressions during the first year: development and role of olfaction
2015
The first year of life is critical for the development of the ability to process facial expressions. Olfaction and facial expressions are strongly linked, and it is well known that infants are able to integrate multisensory information from their environment as early as birth. However, most studies of the multisensory processing of facial expressions are restricted to the investigation of audio-visual interactions. In this thesis, we first aimed to resolve several issues concerning the ontogenesis of infants’ ability to process facial expressions. Our results allowed us to specify the development of visual exploratory strategies of facial emotions over the first year of life…
Time Unification on Local Binary Patterns Three Orthogonal Planes for Facial Expression Recognition
2019
Machine learning has grown tremendously in recent years and, thanks to that growth, some computer vision algorithms have begun to capture what is difficult or even impossible for the human eye to perceive. While deep-learning-based computer vision algorithms have become increasingly prevalent in recent years, more classical feature-extraction methods, such as those based on Local Binary Patterns (LBP), still hold non-negligible interest, especially when dealing with small datasets. Furthermore, this operator has proven quite useful for recognizing facial emotions and human gestures in general. Micro-Expression (ME) classification is…
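LBP-TOP ("Local Binary Patterns on Three Orthogonal Planes") extends the LBP operator to a video cube by pooling histograms of LBP codes computed on the XY, XT, and YT planes. The sketch below builds such a descriptor with scikit-image's local_binary_pattern; for brevity it encodes only the central slice of each plane rather than accumulating histograms over every slice, so it is an illustrative simplification, not the paper's time-unification method.

```python
# Hedged sketch of an LBP-TOP-style descriptor for a grayscale video cube
# of shape (T, H, W). For brevity, only the central XY, XT and YT slices
# are encoded, instead of accumulating histograms over every slice.
import numpy as np
from skimage.feature import local_binary_pattern

P, R = 8, 1  # 8 neighbours at radius 1; "uniform" codes take values 0..P+1

def lbp_top_descriptor(cube: np.ndarray) -> np.ndarray:
    t, h, w = cube.shape
    planes = [cube[t // 2, :, :],   # XY: appearance
              cube[:, h // 2, :],   # XT: horizontal motion over time
              cube[:, :, w // 2]]   # YT: vertical motion over time
    hists = []
    for plane in planes:
        codes = local_binary_pattern(plane, P, R, method="uniform")
        hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
        hists.append(hist)
    return np.concatenate(hists)    # 3 * (P + 2) = 30-dimensional descriptor
```

The concatenated per-plane histograms would then feed a conventional classifier (e.g. an SVM) for expression or micro-expression recognition; that downstream step is not shown here.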