WorldWideScience

Sample records for facial emotion processing

  1. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    Science.gov (United States)

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously, and models of face processing rest on how these properties are handled. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White, and Black faces, each posing a neutral, happy, or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provide evidence for a symmetric interaction between a variant facial property (emotion) and invariant facial properties (race and gender). PMID:27840621
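
    An interaction like the emotion-by-race effect reported above is typically tested with a factorial ANOVA on categorization performance. The sketch below shows the shape of such an analysis on synthetic response times; the cell sizes, RT values, and the use of statsmodels are illustrative assumptions, not details taken from the study.

    ```python
    # Sketch: testing an emotion x race interaction on categorization RTs.
    # All data are synthetic; the factor levels follow the study design.
    import numpy as np
    import pandas as pd
    from statsmodels.formula.api import ols
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(0)
    rows = []
    for emotion in ["neutral", "happy", "angry"]:
        for race in ["Chinese", "White", "Black"]:
            # 40 simulated trials per cell: ~600 ms baseline RT plus noise
            for rt in rng.normal(600, 50, size=40):
                rows.append({"emotion": emotion, "race": race, "rt": rt})
    df = pd.DataFrame(rows)

    model = ols("rt ~ C(emotion) * C(race)", data=df).fit()
    # The C(emotion):C(race) row of the table tests the interaction.
    print(anova_lm(model, typ=2))
    ```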

  2. Processing of unattended facial emotions: a visual mismatch negativity study.

    Science.gov (United States)

    Stefanics, Gábor; Csukly, Gábor; Komlósi, Sarolta; Czobor, Pál; Czigler, István

    2012-02-01

    Facial emotions express our internal states and are fundamental in social interactions. Here we explore whether the repetition of unattended facial emotions builds up a predictive representation of frequently encountered emotions in the visual system. Participants (n=24) were presented peripherally with facial stimuli expressing emotions while they performed a visual detection task presented in the center of the visual field. Facial stimuli consisted of four faces of different identity, all expressing the same emotion (happy or fearful). Facial stimuli were presented in blocks of an oddball sequence (standard emotion: p=0.9, deviant emotion: p=0.1). Event-related potentials (ERPs) to the same emotions were compared when the emotions were deviant and standard, respectively. We found visual mismatch negativity (vMMN) responses to unattended deviant emotions in the 170-360 ms post-stimulus range over bilateral occipito-temporal sites. Our results demonstrate that information about the emotional content of unattended faces presented at the periphery of the visual field is rapidly processed and stored in a predictive memory representation by the visual system. We also found evidence that differential processing of deviant fearful faces begins as early as 70-120 ms after stimulus onset, indicating a 'negativity bias' under unattended conditions. Differential processing of fearful deviants was more pronounced in the right hemisphere in the 195-275 ms and 360-390 ms intervals, whereas processing of happy deviants evoked a larger differential response in the left hemisphere in the 360-390 ms range, indicating differential hemispheric specialization for automatic processing of positive and negative affect.
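
    Computationally, a vMMN of the kind described here is a difference wave: the average ERP to deviants minus the average ERP to standards, summarized over a time window. A minimal sketch on synthetic single-channel data follows; the sampling rate, trial counts, and amplitudes are assumptions for illustration only.

    ```python
    # Sketch: deviant-minus-standard difference wave (vMMN) on synthetic data.
    import numpy as np

    fs = 500                          # sampling rate in Hz (assumption)
    t = np.arange(-0.1, 0.6, 1 / fs)  # epoch from -100 to 600 ms
    rng = np.random.default_rng(1)

    # trials x samples arrays for one occipito-temporal channel
    standard_trials = rng.normal(0, 1, (900, t.size))  # standard emotion, p = 0.9
    deviant_trials = rng.normal(0, 1, (100, t.size))   # deviant emotion, p = 0.1

    erp_standard = standard_trials.mean(axis=0)
    erp_deviant = deviant_trials.mean(axis=0)
    diff_wave = erp_deviant - erp_standard  # vMMN candidate

    window = (t >= 0.170) & (t <= 0.360)    # the 170-360 ms window reported above
    print("mean amplitude in window (a.u.):", diff_wave[window].mean())
    ```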

  3. Processing of Facial Emotion in Bipolar Depression and Euthymia.

    Science.gov (United States)

    Robinson, Lucy J; Gray, John M; Burt, Mike; Ferrier, I Nicol; Gallagher, Peter

    2015-10-01

    Previous studies of facial emotion processing in bipolar disorder (BD) have reported conflicting findings. In independently conducted studies, we investigated facial emotion labeling in euthymic and depressed BD patients using tasks with static and dynamically morphed images of different emotions displayed at different intensities. Study 1 included 38 euthymic BD patients and 28 controls. Participants completed two tasks: labeling of static images of basic facial emotions (anger, disgust, fear, happy, sad) shown at different expression intensities, and the Eyes Test (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), which involves recognition of complex emotions using only the eye region of the face. Study 2 included 53 depressed BD patients and 47 controls. Participants completed two tasks: labeling of "dynamic" facial expressions of the same five basic emotions, and the Emotional Hexagon test (Young, Perret, Calder, Sprengelmeyer, & Ekman, 2002). Neither patient group differed significantly from controls on any measure of emotion perception/labeling. A significant group by intensity interaction was observed in both emotion labeling tasks (in the euthymic and depressed samples), although this effect did not survive the addition of measures of executive function/psychomotor speed as covariates. Only 2.6-15.8% of euthymic patients and 7.8-13.7% of depressed patients scored below the 10th percentile of the controls for total emotion recognition accuracy. There was no evidence of specific deficits in facial emotion labeling in euthymic or depressed BD patients. Methodological variations, including mood state, sample size, and the cognitive demands of the tasks, may contribute significantly to the variability in findings between studies.

  4. Facial Emotion and Identity Processing Development in 5- to 15-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Patrick Johnston

    2011-02-01

    Most developmental studies of emotional face processing to date have focussed on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. Three matching tasks (facial emotion matching, facial identity matching, and butterfly wing matching) were developed to include stimuli of a similar level of discriminability and were equated for task difficulty in earlier samples of young adults. Ninety-two children aged 5 to 15 years and a new group of 24 young adults completed these three matching tasks. Young children were highly adept at the butterfly wing task relative to their performance on both face-related tasks. More importantly, in older children, development of facial emotion discrimination ability lagged behind that of facial identity discrimination.

  5. Putting the face in context: Body expressions impact facial emotion processing in human infants.

    Science.gov (United States)

    Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias

    2016-06-01

    Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.

  6. Neurocognition and symptoms identify links between facial recognition and emotion processing in schizophrenia: meta-analytic findings.

    Science.gov (United States)

    Ventura, Joseph; Wood, Rachel C; Jimenez, Amy M; Hellemann, Gerhard S

    2013-12-01

    In schizophrenia patients, one of the most commonly studied deficits of social cognition is emotion processing (EP), which has documented links to facial recognition (FR). But how are deficits in facial recognition linked to emotion processing deficits? Can neurocognitive and symptom correlates of FR and EP help differentiate the unique contribution of FR to the domain of social cognition? A meta-analysis of 102 studies (combined n=4826) in schizophrenia patients was conducted to determine the magnitude and pattern of relationships between facial recognition, emotion processing, neurocognition, and type of symptom. Meta-analytic results indicated that facial recognition and emotion processing are strongly interrelated (r=.51). In addition, the relationship between FR and EP through voice prosody (r=.58) is as strong as the relationship between FR and EP based on facial stimuli (r=.53). Further, the relationship between emotion recognition, neurocognition, and symptoms is independent of the emotion processing modality (facial stimuli vs. voice prosody). The association between FR and EP that occurs through voice prosody suggests that FR is a fundamental cognitive process. The observed links between FR and EP might be due to bottom-up associations between neurocognition and EP, and not simply because most emotion recognition tasks use visual facial stimuli. In addition, links with symptoms, especially negative symptoms and disorganization, suggest possible symptom mechanisms that contribute to FR and EP deficits.
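
    Pooled correlations of the kind reported in this record are commonly obtained by averaging Fisher-z-transformed coefficients weighted by n - 3 (a fixed-effect scheme; random-effects models add a between-study variance term). A minimal sketch with made-up study values:

    ```python
    # Sketch: fixed-effect pooling of correlations via Fisher's z transform.
    import numpy as np

    r = np.array([0.51, 0.58, 0.53])  # per-study correlations (illustrative)
    n = np.array([120, 85, 200])      # per-study sample sizes (illustrative)

    z = np.arctanh(r)                 # Fisher z transform of each r
    w = n - 3                         # inverse-variance weights, var(z) = 1/(n-3)
    z_pooled = (w * z).sum() / w.sum()
    print("pooled r:", np.tanh(z_pooled))  # back-transform to the r scale
    ```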

  7. Enhanced subliminal emotional responses to dynamic facial expressions

    Directory of Open Access Journals (Sweden)

    Wataru Sato

    2014-09-01

    Emotional processing without conscious awareness plays an important role in human social interaction. Several behavioral studies have reported that subliminal presentation of photographs of emotional facial expressions induces unconscious emotional processing. However, it has been difficult to elicit strong and robust effects using this method. We hypothesized that dynamic presentations of facial expressions would enhance subliminal emotional effects and tested this hypothesis in two experiments. Fearful or happy facial expressions were presented dynamically or statically in either the left or the right visual field for 20 ms (Experiment 1) or 30 ms (Experiment 2). Nonsense target ideographs were then presented, and participants reported their preference for them. The results consistently showed that dynamic presentations of emotional facial expressions induced more evident emotional biases toward subsequent targets than did static ones. These results indicate that dynamic presentations of emotional facial expressions induce more evident unconscious emotional processing.
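
    Durations such as 20 and 30 ms must be realized as whole screen refreshes, which constrains the choice of display. A small sketch of the frame arithmetic; the refresh rates shown are assumptions, not the equipment used in the study.

    ```python
    # Sketch: converting brief presentation durations into screen refreshes.
    def frames_for(duration_ms: float, refresh_hz: float) -> float:
        """Number of refresh frames needed to show a stimulus for duration_ms."""
        return duration_ms * refresh_hz / 1000.0

    for hz in (60.0, 100.0):
        for ms in (20.0, 30.0):
            f = frames_for(ms, hz)
            note = "" if f.is_integer() else "  (not a whole frame: will be rounded)"
            print(f"{ms:.0f} ms at {hz:.0f} Hz = {f:.1f} frames{note}")
    # At 100 Hz, 20 ms = 2 frames and 30 ms = 3 frames, so a display running at
    # 100 Hz (or similar) can render these durations exactly; 60 Hz cannot.
    ```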

  8. Are event-related potentials to dynamic facial expressions of emotion related to individual differences in the accuracy of processing facial expressions and identity?

    Science.gov (United States)

    Recio, Guillermo; Wilhelm, Oliver; Sommer, Werner; Hildebrandt, Andrea

    2017-04-01

    Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain-behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = -.51) and memory (r = -.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.
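
    As a toy illustration of the measurement behind such latency-behavior correlations: find each participant's most negative peak in an N170 search window, then correlate the latencies with a behavioral score. The search window, sampling rate, and all data below are assumptions.

    ```python
    # Sketch: per-participant N170 peak latency correlated with behavior.
    import numpy as np
    from scipy.stats import pearsonr

    fs = 500
    t = np.arange(0, 0.4, 1 / fs)    # 0-400 ms post-stimulus epoch
    rng = np.random.default_rng(2)
    n_subjects = 30

    erps = rng.normal(0, 1, (n_subjects, t.size))   # stand-in subject ERPs
    face_perception = rng.normal(0, 1, n_subjects)  # stand-in ability scores

    window = (t >= 0.130) & (t <= 0.200)            # N170 search window (assumed)
    idx = np.flatnonzero(window)
    # latency of the most negative sample inside the window, per subject
    latencies = t[idx[np.argmin(erps[:, window], axis=1)]]

    r, p = pearsonr(latencies, face_perception)
    print(f"r = {r:.2f}, p = {p:.3f}")
    ```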

  9. People with chronic facial pain perform worse than controls at a facial emotion recognition task, but it is not all about the emotion.

    Science.gov (United States)

    von Piekartz, H; Wallwork, S B; Mohr, G; Butler, D S; Moseley, G L

    2015-04-01

    Alexithymia, or a lack of emotional awareness, is prevalent in some chronic pain conditions and has been linked to poor recognition of others' emotions. Recognising others' emotions from their facial expression involves both emotional and motor processing, but the possible contribution of motor disruption has not been considered. It is possible that poor performance on emotional recognition tasks could reflect problems with emotional processing, motor processing, or both. We hypothesised that people with chronic facial pain would be less accurate in recognising others' emotions from facial expressions, would be less accurate in a motor imagery task involving the face, and that performance on both tasks would be positively related. A convenience sample of 19 people (15 females) with chronic facial pain and 19 gender-matched controls participated. They undertook two tasks; in the first task, they identified the facial emotion presented in a photograph. In the second, they identified whether the person in the image had a facial feature pointed towards their left or right side, a well-recognised paradigm to induce implicit motor imagery. People with chronic facial pain performed worse than controls at both the Facially Expressed Emotion Labelling (FEEL) emotion recognition task and the left/right facial expression task, and performance on the two tasks covaried within participants. We propose that disrupted motor processing may underpin or at least contribute to the difficulty that facial pain patients have in emotion recognition, and that further research to test this proposal is warranted.

  10. Schizophrenia and processing of facial emotions: Sex matters

    NARCIS (Netherlands)

    Scholten, MRM; Aleman, A; Montagne, B; Kahn, RS

    2005-01-01

    The aim of this study was to examine sex differences in emotion processing in patients with schizophrenia and control subjects. To this end, 53 patients with schizophrenia (28 men and 25 women) and 42 controls (21 men and 21 women) were assessed with the use of a facial affect recognition morphing…

  11. Neural Temporal Dynamics of Facial Emotion Processing: Age Effects and Relationship to Cognitive Function

    Directory of Open Access Journals (Sweden)

    Xiaoyan Liao

    2017-06-01

    This study used event-related potentials (ERPs) to investigate the effects of age on the neural temporal dynamics of processing task-relevant facial expressions and their relationship to cognitive functions. Negative (sad, afraid, angry, and disgusted), positive (happy), and neutral faces were presented to 30 older and 31 young participants who performed a facial emotion categorization task. Behavioral and ERP indices of facial emotion processing were analyzed. An enhanced N170 for negative faces, in addition to an intact right-hemispheric N170 for positive faces, was observed in older adults relative to their younger counterparts. Moreover, older adults demonstrated an attenuated within-group N170 laterality effect for neutral faces, while younger adults showed the opposite pattern. Furthermore, older adults exhibited a sustained temporo-occipital negativity deflection over the time range of 200-500 ms post-stimulus, while young adults showed posterior positivity and subsequent emotion-specific frontal negativity deflections. In older adults, decreased accuracy for labeling negative faces was positively correlated with Montreal Cognitive Assessment scores, and accuracy for labeling neutral faces was negatively correlated with age. These findings suggest that older people may exert more effort in structural encoding for negative faces and that there are different response patterns for the categorization of different facial emotions. Cognitive functioning may be related to the facial emotion categorization deficits observed in older adults. These may not be attributable to positivity effects: they may represent a selective deficit in the processing of negative facial expressions in older adults.

  12. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    Science.gov (United States)

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown whether a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand whether individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus process this facial dimension independently from features (which are impaired in CP) and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was required (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether the recognized face exactly matched the study face or differed from it. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed spared configural processing of non-emotional facial expressions (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  13. Facial emotion processing in pediatric social anxiety disorder: Relevance of situational context.

    Science.gov (United States)

    Schwab, Daniela; Schienle, Anne

    2017-08-01

    Social anxiety disorder (SAD) typically begins in childhood. Previous research has demonstrated that adult patients respond with elevated late positivity (LP) to negative facial expressions. In the present study on pediatric SAD, we investigated responses to negative facial expressions and the role of social context information. Fifteen children with SAD and 15 non-anxious controls were first presented with images of negative facial expressions with masked backgrounds. Following this, the complete images, which included context information, were shown. The negative expressions were either the result of an emotion-relevant (e.g., social exclusion) or an emotion-irrelevant elicitor (e.g., weight lifting). Relative to controls, the clinical group showed elevated parietal LP during face processing with and without context information. The groups also differed in frontal LP depending on the type of context: in SAD patients, frontal LP was lower in emotion-relevant than emotion-irrelevant contexts. We conclude that SAD patients direct more automatic attention towards negative facial expressions (parietal effect) and are less capable of integrating affective context information (frontal effect).

  14. An fMRI study of facial emotion processing in patients with schizophrenia.

    Science.gov (United States)

    Gur, Raquel E; McGrath, Claire; Chan, Robin M; Schroeder, Lee; Turner, Travis; Turetsky, Bruce I; Kohler, Christian; Alsop, David; Maldjian, Joseph; Ragland, J Daniel; Gur, Ruben C

    2002-12-01

    Emotion processing deficits are notable in schizophrenia. The authors evaluated cerebral blood flow response in schizophrenia patients during facial emotion processing to test the hypothesis of diminished limbic activation related to emotional relevance of facial stimuli. Fourteen patients with schizophrenia and 14 matched comparison subjects viewed facial displays of happiness, sadness, anger, fear, and disgust as well as neutral faces. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as the subjects alternated between tasks of discriminating emotional valence (positive versus negative) and age (over 30 versus under 30) of the faces with an interleaved crosshair reference condition. The groups did not differ in performance on either task. For both tasks, healthy participants showed activation in the fusiform gyrus, occipital lobe, and inferior frontal cortex relative to the resting baseline condition. The increase was greater in the amygdala and hippocampus during the emotional valence discrimination task than during the age discrimination task. In the patients with schizophrenia, minimal focal response was observed for all tasks relative to the resting baseline condition. Contrasting patients and comparison subjects on the emotional valence discrimination task revealed voxels in the left amygdala and bilateral hippocampus in which the comparison subjects had significantly greater activation. Failure to activate limbic regions during emotional valence discrimination may explain emotion processing deficits in patients with schizophrenia. While the lack of limbic recruitment did not significantly impair simple valence discrimination performance in this clinically stable group, it may impact performance of more demanding tasks.

  15. Neuropsychology of facial expressions. The role of consciousness in processing emotional faces

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2012-04-01

    Neuropsychological studies have identified distinct brain correlates dedicated to analyzing facial expressions of emotion. Some cerebral circuits appear specific to emotional face comprehension, depending on whether emotional information is processed consciously or unconsciously. Moreover, the emotional content of faces (i.e., positive vs. negative; more or less arousing) may activate specific cortical networks. Among other findings, recent studies have described the contribution of each hemisphere to face comprehension as a function of emotion type (mainly the positive vs. negative distinction) and of the specific task (comprehending vs. producing facial expressions). An overview of ERP (event-related potential) analyses is presented in order to explain how an observer processes a face and renders it a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces under conscious vs. unconscious processing.

  16. Processing of emotional facial expressions in Korsakoff's syndrome.

    NARCIS (Netherlands)

    Montagne, B.; Kessels, R.P.C.; Wester, A.J.; Haan, E.H.F. de

    2006-01-01

    Interpersonal contacts depend to a large extent on understanding the emotional facial expressions of others. Several neurological conditions may affect proficiency in emotional expression recognition. It has been shown that chronic alcoholics are impaired in labelling emotional expressions…

  17. Judging emotional congruency: Explicit attention to situational context modulates processing of facial expressions of emotion.

    Science.gov (United States)

    Diéguez-Risco, Teresa; Aguado, Luis; Albert, Jacobo; Hinojosa, José Antonio

    2015-12-01

    The influence of explicit evaluative processes on the contextual integration of facial expressions of emotion was studied in a procedure that required the participants to judge the congruency of happy and angry faces with preceding sentences describing emotion-inducing situations. Judgments were faster on congruent trials in the case of happy faces and on incongruent trials in the case of angry faces. At the electrophysiological level, a congruency effect was observed in the face-sensitive N170 component that showed larger amplitudes on incongruent trials. An interactive effect of congruency and emotion appeared on the LPP (late positive potential), with larger amplitudes in response to happy faces that followed anger-inducing situations. These results show that the deliberate intention to judge the contextual congruency of facial expressions influences not only processes involved in affective evaluation such as those indexed by the LPP but also earlier processing stages that are involved in face perception.

  1. Development of the Korean Facial Emotion Stimuli: Korea University Facial Expression Collection 2nd Edition

    Directory of Open Access Journals (Sweden)

    Sun-Min Kim

    2017-05-01

    Background: Developing valid emotional facial stimuli for specific ethnicities creates ample opportunities to investigate both the nature of emotional facial information processing in general and clinical populations as well as the underlying mechanisms of facial emotion processing within and across cultures. Given that most entries in emotional facial stimuli databases were developed with western samples, and given that very few of the eastern emotional facial stimuli sets were based strictly on Ekman's Facial Action Coding System, developing valid emotional facial stimuli from eastern samples remains a high priority. Aims: To develop and examine the psychometric properties of six basic emotional facial stimuli, recruiting professional Korean actors and actresses, based on Ekman's Facial Action Coding System for the Korea University Facial Expression Collection-Second Edition (KUFEC-II). Materials and Methods: Stimulus selection was done in two phases. First, researchers evaluated the clarity and intensity of each stimulus developed based on the Facial Action Coding System. Second, researchers selected a total of 399 stimuli from a total of 57 actors and actresses, which were then rated on accuracy, intensity, valence, and arousal by 75 independent raters. Conclusion: The hit rates between the targeted and rated expressions of the KUFEC-II were all above 80%, except for fear (50%) and disgust (63%). The KUFEC-II appears to be a valid emotional facial stimuli database, providing the largest set of emotional facial stimuli. The mean intensity score was 5.63 (out of 7), suggesting that the stimuli delivered the targeted emotions with great intensity. All positive expressions were rated as having a high positive valence, whereas all negative expressions were rated as having a high negative valence. The KUFEC-II is expected to be widely used in various psychological studies on emotional facial expression. KUFEC-II stimuli can be obtained through…
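
    Hit rates like those quoted above are the diagonal of a row-normalized confusion matrix of target versus rated emotion. A small sketch with made-up ratings (the real set had 399 stimuli rated by 75 raters):

    ```python
    # Sketch: per-emotion hit rates as the diagonal of a confusion matrix.
    import pandas as pd

    ratings = pd.DataFrame({
        "target":   ["happy", "happy", "fear", "fear", "disgust", "disgust"],
        "response": ["happy", "happy", "fear", "surprise", "disgust", "anger"],
    })  # illustrative rows only

    # rows: targeted emotion; columns: rated emotion; values: proportions
    confusion = pd.crosstab(ratings["target"], ratings["response"], normalize="index")
    hit_rates = {emo: confusion.loc[emo].get(emo, 0.0) for emo in confusion.index}
    print(hit_rates)  # fear is often confused with surprise, lowering its hit rate
    ```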

  2. Neural processing of fearful and happy facial expressions during emotion-relevant and emotion-irrelevant tasks: a fixation-to-feature approach

    Science.gov (United States)

    Neath-Tavares, Karly N.; Itier, Roxane J.

    2017-01-01

    Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp. 1) and an oddball detection (Exp. 2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100-120 ms occipitally, while responses to fearful expressions started around 150 ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350 ms. PMID:27430934

  3. Interactions between facial emotion and identity in face processing: evidence based on redundancy gains.

    Science.gov (United States)

    Yankouskaya, Alla; Booth, David A; Humphreys, Glyn

    2012-11-01

    Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.
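
    The superadditivity test referenced here (Miller, 1982) checks the race model inequality: at every time point, the cumulative RT distribution for redundant targets must not exceed the sum of the two single-target distributions. A compact sketch on synthetic response times; the distributions and time grid are illustrative assumptions.

    ```python
    # Sketch: Miller's (1982) race model inequality on synthetic response times.
    # The race model requires F_red(t) <= F_emotion(t) + F_identity(t) for all t;
    # violations indicate coactivation (superadditive integration).
    import numpy as np

    rng = np.random.default_rng(3)
    rt_emotion = rng.normal(520, 60, 200)    # single-target RTs in ms (made up)
    rt_identity = rng.normal(540, 60, 200)
    rt_redundant = rng.normal(470, 55, 200)  # redundant-target RTs (made up)

    def ecdf(sample, t):
        """Empirical cumulative distribution of `sample` evaluated at times `t`."""
        return np.searchsorted(np.sort(sample), t, side="right") / sample.size

    t = np.linspace(300, 700, 81)
    bound = np.minimum(ecdf(rt_emotion, t) + ecdf(rt_identity, t), 1.0)
    violation = ecdf(rt_redundant, t) - bound
    # A reliably positive maximum (tested across participants) rejects the race model.
    print("max violation of the race model bound:", violation.max())
    ```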

  4. Altered Kinematics of Facial Emotion Expression and Emotion Recognition Deficits Are Unrelated in Parkinson's Disease.

    Science.gov (United States)

    Bologna, Matteo; Berardelli, Isabella; Paparella, Giulia; Marsili, Luca; Ricciardi, Lucia; Fabbrini, Giovanni; Berardelli, Alfredo

    2016-01-01

    Altered emotional processing, including reduced facial emotion expression and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques, and it is not known whether altered facial expression and recognition in PD are related. We aimed to investigate possible deficits in facial emotion expression and emotion recognition, and their relationship, if any, in patients with PD. Eighteen patients with PD and 16 healthy controls were enrolled in this study. Facial expressions of emotion were recorded using a 3D optoelectronic system and analyzed using the Facial Action Coding System. Possible deficits in emotion recognition were assessed using the Ekman test. Participants were assessed in one experimental session. Possible relationships between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients were evaluated using Spearman's test and multiple regression analysis. The facial expression of all six basic emotions had slower velocity and lower amplitude in patients in comparison to healthy controls. However, facial expression kinematics and emotion recognition deficits were unrelated in patients (all Ps > 0.05), and no relationship emerged between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients (all Ps > 0.05). The results of this study provide further evidence of altered emotional processing in PD. The lack of any correlation between altered facial emotion expression kinematics and emotion recognition deficits in patients suggests that these abnormalities are mediated by separate pathophysiological mechanisms.

  5. Recognition of Face and Emotional Facial Expressions in Autism

    Directory of Open Access Journals (Sweden)

    Muhammed Tayyib Kadak

    2013-03-01

    Autism is a heritable neurodevelopmental disorder characterized by severe and permanent deficits in many areas of interpersonal relations, such as communication, social interaction, and emotional responsiveness. Patients with autism show deficits in face recognition, eye contact, and recognition of emotional expression, and both face recognition and the recognition of emotional facial expressions depend on face processing. Structural and functional impairment in the fusiform gyrus, amygdala, superior temporal sulcus, and other brain regions leads to deficits in the recognition of faces and facial emotion. Studies therefore suggest that face processing deficits underlie the problems with social interaction and emotion seen in autism. Studies have revealed that children with autism have problems recognizing facial expressions and rely on the mouth region more than the eye region. It has also been shown that autistic patients interpret ambiguous expressions as negative emotions. Deficits at various stages of face processing, such as gaze detection, facial identity processing, and recognition of emotional expression, have been documented in autism. Social interaction impairments in autistic spectrum disorders originate from face processing deficits during infancy, childhood, and adolescence. Recognition of faces and expressions of facial emotion could be affected either automatically, by orienting towards faces after birth, or by "learning" processes in developmental periods such as identity and emotion processing. This article reviews the neurobiological basis of face processing and the recognition of emotional facial expressions during normal development and in autism.

  6. Facial dynamics and emotional expressions in facial aging treatments.

    Science.gov (United States)

    Michaud, Thierry; Gassia, Véronique; Belhaouari, Lakhdar

    2015-03-01

    Facial expressions convey emotions that form the foundation of interpersonal relationships, and many of these emotions promote and regulate our social linkages. Hence, the analysis of facial aging symptoms and the treatment plan must include knowledge of facial dynamics and the emotional expressions of the face. This approach aims to more closely meet patients' expectations of natural-looking results by correcting age-related negative expressions while observing the emotional language of the face. This article successively describes patients' expectations, the role of facial expressions in relational dynamics, the relationship between facial structures and facial expressions, and the way facial aging mimics negative expressions. Finally, therapeutic implications for facial aging treatment are addressed.

  7. Impaired Emotional Mirroring in Parkinson’s Disease—A Study on Brain Activation during Processing of Facial Expressions

    Directory of Open Access Journals (Sweden)

    Anna Pohl

    2017-12-01

    Background: Affective dysfunctions are common in patients with Parkinson's disease, but the underlying neurobiological deviations have rarely been examined. Parkinson's disease is characterized by a loss of dopamine neurons in the substantia nigra, resulting in impairment of motor and non-motor basal ganglia-cortical loops. Concerning emotional deficits, some studies provide evidence for altered brain processing in limbic and lateral-orbitofrontal gating loops. In a second line of evidence, human premotor and inferior parietal homologs of mirror neuron areas are involved in the processing and understanding of emotional facial expressions. We examined deviations in brain activation during processing of facial expressions in patients and related these to emotion recognition accuracy. Methods: 13 patients and 13 healthy controls underwent an emotion recognition task and a functional magnetic resonance imaging (fMRI) measurement. In the Emotion Hexagon test, participants were presented with blends of two emotions and had to indicate which emotion best described the presented picture. Blended pictures with three levels of difficulty were included. During fMRI scanning, participants observed video clips depicting emotional, non-emotional, and neutral facial expressions or were asked to produce these facial expressions themselves. Results: Patients performed slightly worse in the emotion recognition task, but only when judging the most ambiguous facial expressions. Both groups activated inferior frontal and anterior inferior parietal homologs of mirror neuron areas during observation and execution of the emotional facial expressions. During observation, responses in the pars opercularis of the right inferior frontal gyrus, in the bilateral inferior parietal lobule, and in the bilateral supplementary motor cortex were decreased in patients. Furthermore, in patients, activation of the right anterior inferior parietal lobule was positively related to accuracy in…

  8. Do facial movements express emotions or communicate motives?

    Science.gov (United States)

    Parkinson, Brian

    2005-01-01

    This article addresses the debate between emotion-expression and motive-communication approaches to facial movements, focusing on Ekman's (1972) and Fridlund's (1994) contrasting models and their historical antecedents. Available evidence suggests that the presence of others either reduces or increases facial responses, depending on the quality and strength of the emotional manipulation and on the nature of the relationship between interactants. Although both display rules and social motives provide viable explanations of audience "inhibition" effects, some audience facilitation effects are less easily accommodated within an emotion-expression perspective. In particular, emotion is not a sufficient condition for a corresponding "expression," even discounting explicit regulation, and, apparently, "spontaneous" facial movements may be facilitated by the presence of others. Further, there is no direct evidence that any particular facial movement provides an unambiguous expression of a specific emotion. However, information communicated by facial movements is not necessarily extrinsic to emotion. Facial movements not only transmit emotion-relevant information but also contribute to ongoing processes of emotional action in accordance with pragmatic theories.

  9. Subliminal and supraliminal processing of facial expression of emotions: brain oscillation in the left/right frontal area.

    Science.gov (United States)

    Balconi, Michela; Ferrari, Chiara

    2012-03-26

    The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, although it remains unclear whether a specific function can be assigned to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored over the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at facial expressions of six emotions (anger, fear, surprise, disgust, happiness, and sadness) and neutral faces under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate the alpha power. Moreover, there was increased right frontal activity for negative emotions vs. an increased left response for positive emotions. The significance of the facial expressions was used to elucidate these differing cortical responses to emotion types.
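
    A common way to quantify such frontal alpha effects is Welch power in the 8-13 Hz band for left and right frontal channels, combined into a log-ratio lateralization index. The sketch below uses synthetic signals; the channel pairing, durations, and band limits are conventional assumptions rather than details from this study.

    ```python
    # Sketch: frontal alpha (8-13 Hz) power and a left/right lateralization index.
    import numpy as np
    from scipy.signal import welch

    fs = 250
    rng = np.random.default_rng(4)
    left_frontal = rng.normal(0, 1, 10 * fs)   # stand-in for e.g. an F3 signal
    right_frontal = rng.normal(0, 1, 10 * fs)  # stand-in for e.g. an F4 signal

    def alpha_power(signal, fs):
        freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
        band = (freqs >= 8) & (freqs <= 13)
        return psd[band].mean()

    left_a = alpha_power(left_frontal, fs)
    right_a = alpha_power(right_frontal, fs)
    # Alpha power is inversely related to cortical activity, so relatively higher
    # right alpha suggests relatively greater left-hemisphere engagement.
    print("lateralization ln(right/left):", np.log(right_a / left_a))
    ```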

  10. The implicit processing of categorical and dimensional strategies: an fMRI study of facial emotion perception

    Science.gov (United States)

    Matsuda, Yoshi-Taka; Fujimura, Tomomi; Katahira, Kentaro; Okada, Masato; Ueno, Kenichi; Cheng, Kang; Okanoya, Kazuo

    2013-01-01

    Our understanding of facial emotion perception has been dominated by two seemingly opposing theories: the categorical and dimensional theories. However, we have recently demonstrated that hybrid processing involving both categorical and dimensional perception can be induced in an implicit manner (Fujimura et al., 2012). The underlying neural mechanisms of this hybrid processing remain unknown. In this study, we tested the hypothesis that separate neural loci might intrinsically encode categorical and dimensional processing functions that serve as a basis for hybrid processing. We used functional magnetic resonance imaging to measure neural correlates while subjects passively viewed emotional faces and performed tasks that were unrelated to facial emotion processing. Activity in the right fusiform face area (FFA) increased in response to psychologically obvious emotions and decreased in response to ambiguous expressions, demonstrating the role of the FFA in categorical processing. The amygdala, insula and medial prefrontal cortex exhibited evidence of dimensional (linear) processing that correlated with physical changes in the emotional face stimuli. The occipital face area and superior temporal sulcus did not respond to these changes in the presented stimuli. Our results indicated that distinct neural loci process the physical and psychological aspects of facial emotion perception in a region-specific and implicit manner. PMID:24133426
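
    One way to picture the categorical-versus-dimensional distinction is as two competing regressors over a morph continuum: a linear predictor that tracks physical change and an "obviousness" predictor that is high for clear emotions and low for ambiguous ones. The toy sketch below fits both to a synthetic ROI response; it is a schematic of the logic, not the study's actual GLM.

    ```python
    # Sketch: dimensional (linear) vs. categorical (obviousness) regressors
    # over a neutral-to-emotional morph continuum, fit to a synthetic ROI signal.
    import numpy as np

    morph = np.linspace(0.0, 1.0, 11)    # 0 = neutral, 1 = full emotion
    dimensional = morph - morph.mean()   # tracks physical change linearly
    categorical = np.abs(morph - 0.5)    # high when obvious, low when ambiguous
    categorical -= categorical.mean()

    rng = np.random.default_rng(5)
    roi_signal = 0.8 * categorical + rng.normal(0, 0.05, morph.size)  # FFA-like toy

    X = np.column_stack([np.ones_like(morph), dimensional, categorical])
    beta, *_ = np.linalg.lstsq(X, roi_signal, rcond=None)
    print("betas (intercept, dimensional, categorical):", beta)
    # An FFA-like region would load on the categorical regressor, while regions
    # tracking physical change (amygdala, insula) would load on the dimensional one.
    ```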

  11. Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study.

    Science.gov (United States)

    Xu, Qianru; Ruohonen, Elisa M; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia

    2018-01-01

    It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been studied. We used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck Depression Inventory score ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, response amplitudes being larger for sad faces than happy faces. Group differences were found for the M300, and they were indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than happy deviant faces, reflecting negative bias in deviance detection, which was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit in dysphoria.

  12. From Facial Emotional Recognition Abilities to Emotional Attribution: A Study in Down Syndrome

    Science.gov (United States)

    Hippolyte, Loyse; Barisnikov, Koviljka; Van der Linden, Martial; Detraux, Jean-Jacques

    2009-01-01

    Facial expression processing and the attribution of facial emotions to a context were investigated in adults with Down syndrome (DS) in two experiments. Their performances were compared with those of a child control group matched for receptive vocabulary. The ability to process faces without emotional content was controlled for, and no differences…

  13. Measuring facial expression of emotion.

    Science.gov (United States)

    Wolf, Karsten

    2015-12-01

    Research into emotions has increased in recent decades, especially on the subject of the recognition of emotions. However, studies of the facial expressions of emotion were long compromised by technical problems with visible video analysis and electromyography in experimental settings, problems that have only recently been overcome. There have been new developments in the field of automated computerized facial recognition, allowing real-time identification of facial expression in social environments. This review addresses three approaches to measuring facial expression of emotion and describes their specific contributions to understanding emotion in the healthy population and in persons with mental illness. Despite recent progress, studies on human emotions have been hindered by the lack of consensus on an emotion theory suited to examining the dynamic aspects of emotion and its expression. Studying the expression of emotion in patients with mental health conditions for diagnostic and therapeutic purposes will profit from theoretical and methodological progress.

  14. Comparison of emotion recognition from facial expression and music.

    Science.gov (United States)

    Gaspar, Tina; Labor, Marina; Jurić, Iva; Dumancić, Dijana; Ilakovac, Vesna; Heffer, Marija

    2011-01-01

    The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is not clearly determined, as their presentation is usually very short (micro expressions), and the recognition itself does not have to be a conscious process. We assumed that recognition from facial expressions would be favored over recognition of emotions communicated through music. In order to compare the success rate in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey which included 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized if presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for because of the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share some general cognitive abilities such as attention, memory, and motivation. Music pieces are processed differently in the brain than facial expressions and are consequently probably evaluated differently as relevant emotional cues.

  15. Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?

    Science.gov (United States)

    Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-01-01

    The current study investigated if deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ and age matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity or by facial identity…

  16. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    Science.gov (United States)

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus, because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks, such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed under a promotion focus than under a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed, and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy of a promotion focus (i.e., striving to make hits). A prevention focus had an impact neither on perceptual processing nor on facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.
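
    The mediation claim here (promotion focus leads to shorter fixations, which in turn aid recognition) has the classic product-of-coefficients structure. A bare-bones OLS sketch on synthetic data follows; in practice the indirect effect would be tested with bootstrapped confidence intervals.

    ```python
    # Sketch: product-of-coefficients mediation, X -> M -> Y, on synthetic data.
    # X: promotion focus, M: fixation duration, Y: recognition accuracy.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    n = 200
    x = rng.normal(size=n)              # chronic promotion focus (made up)
    m = -0.5 * x + rng.normal(size=n)   # higher focus -> shorter fixations
    y = -0.4 * m + rng.normal(size=n)   # shorter fixations -> better accuracy

    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]  # X -> M path
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]  # M -> Y given X
    print("indirect effect a*b:", a * b)  # positive overall: focus aids recognition
    ```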

  20. Cognitive penetrability and emotion recognition in human facial expressions

    Directory of Open Access Journals (Sweden)

    Francesco eMarchi

    2015-06-01

    Do our background beliefs, desires, and mental images influence our perceptual experience of the emotions of others? In this paper, we will address the possibility of cognitive penetration of perceptual experience in the domain of social cognition. In particular, we focus on emotion recognition based on the visual experience of facial expressions. After introducing the current debate on cognitive penetration, we review examples of perceptual adaptation for facial expressions of emotion. This evidence supports the idea that facial expressions are perceptually processed as wholes. That is, the perceptual system integrates lower-level facial features, such as eyebrow orientation, mouth angle etc., into facial compounds. We then present additional experimental evidence showing that in some cases, emotion recognition on the basis of facial expression is sensitive to and modified by the background knowledge of the subject. We argue that such sensitivity is best explained as a difference in the visual experience of the facial expression, not just as a modification of the judgment based on this experience. The difference in experience is characterized as the result of the interference of background knowledge with the perceptual integration process for faces. Thus, according to the best explanation, we have to accept cognitive penetration in some cases of emotion recognition. Finally, we highlight a recent model of social vision in order to propose a mechanism for cognitive penetration used in the face-based recognition of emotion.

  1. Mere social categorization modulates identification of facial expressions of emotion.

    Science.gov (United States)

    Young, Steven G; Hugenberg, Kurt

    2010-12-01

    The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on this literature and extends this work. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion-identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  2. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    Science.gov (United States)

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  3. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    Science.gov (United States)

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of the transient, peak-intensity expressions of athletes at the winning or losing point of a competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds while their eye movements were recorded. The results revealed that the isolated bodies and face-body congruent images were better recognized than the isolated faces and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, the eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions can be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. The results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, which indicated the…

  4. Decoding facial blends of emotion: visual field, attentional and hemispheric biases.

    Science.gov (United States)

    Ross, Elliott D; Shayya, Luay; Champlain, Amanda; Monnot, Marilee; Prodan, Calin I

    2013-12-01

    Most clinical research assumes that modulation of facial expressions is lateralized predominantly across the right-left hemiface. However, social psychological research suggests that facial expressions are organized predominantly across the upper-lower face. Because humans learn to cognitively control facial expression for social purposes, the lower face may display a false emotion, typically a smile, to enable approach behavior. In contrast, the upper face may leak a person's true feeling state by producing a brief facial blend of emotion, i.e. a different emotion on the upper versus lower face. Previous studies from our laboratory have shown that upper facial emotions are processed preferentially by the right hemisphere under conditions of directed attention if facial blends of emotion are presented tachistoscopically to the mid left and right visual fields. This paper explores how facial blends are processed within the four visual quadrants. The results, combined with our previous research, demonstrate that lower more so than upper facial emotions are perceived best when presented to the viewer's left and right visual fields just above the horizontal axis. Upper facial emotions are perceived best when presented to the viewer's left visual field just above the horizontal axis under conditions of directed attention. Thus, by gazing at a person's left ear, which also avoids the social stigma of eye-to-eye contact, one's ability to decode facial expressions should be enhanced. Published by Elsevier Inc.

  5. Emotional facial expression detection in the peripheral visual field.

    Directory of Open Access Journals (Sweden)

    Dimitri J Bayle

    BACKGROUND: In everyday life, signals of danger, such as aversive facial expressions, usually appear in the peripheral visual field. Although facial expression processing in central vision has been extensively studied, this processing in peripheral vision has been poorly studied. METHODOLOGY/PRINCIPAL FINDINGS: Using behavioral measures, we explored the human ability to detect fear and disgust vs. neutral expressions and compared it to the ability to discriminate between genders at eccentricities up to 40°. Responses were faster for the detection of emotion compared to gender. Emotion was detected from fearful faces up to 40° of eccentricity. CONCLUSIONS: Our results demonstrate the human ability to detect facial expressions presented in the far periphery up to 40° of eccentricity. The increasing advantage of emotion compared to gender processing with increasing eccentricity might reflect a major implication of the magnocellular visual pathway in facial expression processing. This advantage may suggest that emotion detection, relative to gender identification, is less impacted by visual acuity and within-face crowding in the periphery. These results are consistent with specific and automatic processing of danger-related information, which may drive attention to those messages and allow for a fast behavioral reaction.

  6. Facial emotion recognition in paranoid schizophrenia and autism spectrum disorder.

    Science.gov (United States)

    Sachse, Michael; Schlitt, Sabine; Hainz, Daniela; Ciaramidaro, Angela; Walter, Henrik; Poustka, Fritz; Bölte, Sven; Freitag, Christine M

    2014-11-01

    Schizophrenia (SZ) and autism spectrum disorder (ASD) share deficits in emotion processing. In order to identify convergent and divergent mechanisms, we investigated facial emotion recognition in SZ, high-functioning ASD (HFASD), and typically developed controls (TD). Different degrees of task difficulty and emotion complexity (face, eyes; basic emotions, complex emotions) were used. Two Benton tests were implemented in order to assess potentially confounding visuo-perceptual functioning and facial processing. Nineteen participants with paranoid SZ, 22 with HFASD and 20 TD were included, aged between 14 and 33 years. Individuals with SZ were comparable to TD on all obtained emotion recognition measures, but showed reduced basic visuo-perceptual abilities. The HFASD group was impaired in the recognition of basic and complex emotions compared to both SZ and TD. When facial identity recognition was adjusted for, group differences remained for the recognition of complex emotions only. Our results suggest that there is an SZ subgroup with predominantly paranoid symptoms that does not show problems in face processing and emotion recognition, but does show visuo-perceptual impairments. They also confirm the notion of a general facial and emotion recognition deficit in HFASD. No shared emotion recognition deficit was found for paranoid SZ and HFASD, emphasizing the differential cognitive underpinnings of the two disorders. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Facial emotion recognition in adolescents with personality pathology

    NARCIS (Netherlands)

    Berenschot, Fleur; Van Aken, Marcel A G|info:eu-repo/dai/nl/081831218; Hessels, Christel; De Castro, Bram Orobio|info:eu-repo/dai/nl/166985422; Pijl, Ysbrand; Montagne, Barbara; Van Voorst, Guus

    2014-01-01

    It has been argued that a heightened emotional sensitivity interferes with the cognitive processing of facial emotion recognition and may explain the intensified emotional reactions to external emotional stimuli of adults with personality pathology, such as borderline personality disorder (BPD).

  8. Identity modulates short-term memory for facial emotion.

    Science.gov (United States)

    Galster, Murray; Kahana, Michael J; Wilson, Hugh R; Sekuler, Robert

    2009-12-01

    For some time, the relationship between processing of facial expression and facial identity has been in dispute. Using realistic synthetic faces, we reexamined this relationship for both perception and short-term memory. In Experiment 1, subjects tried to identify whether the emotional expression on a probe stimulus face matched the emotional expression on either of two remembered faces that they had just seen. The results showed that identity strongly influenced recognition short-term memory for emotional expression. In Experiment 2, subjects' similarity/dissimilarity judgments were transformed by multidimensional scaling (MDS) into a 2-D description of the faces' perceptual representations. Distances among stimuli in the MDS representation, which showed a strong linkage of emotional expression and facial identity, were good predictors of correct and false recognitions obtained previously in Experiment 1. The convergence of the results from Experiments 1 and 2 suggests that the overall structure and configuration of faces' perceptual representations may parallel their representation in short-term memory and that facial identity modulates the representation of facial emotion, both in perception and in memory. The stimuli from this study may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental.
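
    As an illustration of the analysis described in Experiment 2, the sketch below derives a 2-D perceptual configuration from pairwise dissimilarity judgments using multidimensional scaling. It is a minimal sketch only: the ratings matrix, stimulus count, and library choice (scikit-learn) are assumptions, not the study's actual data or software.

        import numpy as np
        from sklearn.manifold import MDS

        # Hypothetical symmetric dissimilarity matrix for 4 face stimuli
        # (0 = identical, larger = more dissimilar).
        dissim = np.array([
            [0.0, 0.3, 0.7, 0.8],
            [0.3, 0.0, 0.6, 0.7],
            [0.7, 0.6, 0.0, 0.2],
            [0.8, 0.7, 0.2, 0.0],
        ])

        mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
        coords = mds.fit_transform(dissim)  # one (x, y) point per face

        # Inter-point distances in this 2-D space would then serve as
        # predictors of correct and false recognitions, as in Experiment 1.
        print(coords)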

  9. Facial responsiveness of psychopaths to the emotional expressions of others.

    Directory of Open Access Journals (Sweden)

    Janina Künecke

    Psychopathic individuals show selfish, manipulative, and antisocial behavior in addition to emotional detachment and reduced empathy. Their empathic deficits are thought to be associated with a reduced responsiveness to emotional stimuli. Immediate facial muscle responses to the emotional expressions of others reflect the expressive part of emotional responsiveness and are positively related to trait empathy. Empirical evidence for reduced facial muscle responses in adult psychopathic individuals to the emotional expressions of others is rare. In the present study, 261 male criminal offenders and non-offenders categorized dynamically presented facial emotion expressions (angry, happy, sad, and neutral) during facial electromyographic recording of their corrugator muscle activity. We replicated a measurement model of facial muscle activity, which controls for general facial responsiveness to face stimuli, and modeled three correlated emotion-specific factors (i.e., anger, happiness, and sadness) representing emotion-specific activity. In a multi-group confirmatory factor analysis, we compared the means of the anger, happiness, and sadness latent factors between three groups: 1) non-offenders, 2) low psychopathic offenders, and 3) high psychopathic offenders. There were no significant mean differences between groups. Our results challenge current theories that focus on deficits in emotional responsiveness as leading to the development of psychopathy and encourage further theoretical development on deviant emotional processes in psychopathic individuals.

  10. A comparison of facial emotion processing in neurological and psychiatric conditions

    Directory of Open Access Journals (Sweden)

    Benoit eBediou

    2012-04-01

    Investigating the relative severity of emotion recognition deficits across different clinical and high-risk populations has potential implications not only for the prevention, diagnosis and treatment of these diseases, but also for our understanding of the neurobiological mechanisms of emotion perception itself. We reanalyzed data from 4 studies in which we examined facial expression and gender recognition using the same tasks and stimuli. We used a standardized and bias-corrected measure of effect size (Cohen's d) to assess the extent of impairments in frontotemporal dementia (FTD), Parkinson's disease treated with L-DOPA (PD-ON) or not (PD-OFF), amnestic mild cognitive impairment (aMCI), Alzheimer's disease at the mild dementia stage (AD), major depressive disorder (MDD), remitted schizophrenia (SCZ-rem), first-episode schizophrenia before (SCZ-OFF) and after (SCZ-ON) medication, as well as unaffected siblings of patients with schizophrenia (SIB). Analyses revealed a pattern of differential impairment of emotion (but not gender) recognition, consistent with the extent of impairment of the fronto-temporal neural networks involved in the processing of faces and facial expressions. Our transnosographic approach, combining clinical and high-risk populations with the impact of medication, brings new information on the trajectory of impaired emotion perception in neuropsychiatric conditions and on the neural networks and neurotransmitter systems subserving emotion perception.
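
    The "standardized and bias-corrected measure of effect size (Cohen's d)" mentioned above is commonly implemented as Hedges' g, i.e. Cohen's d with a small-sample correction. The sketch below shows that estimator; it is an assumption that this is the exact correction the authors applied, and the data are hypothetical.

        import numpy as np

        def hedges_g(x, y):
            """Cohen's d with Hedges' small-sample bias correction."""
            nx, ny = len(x), len(y)
            pooled_sd = np.sqrt(((nx - 1) * np.var(x, ddof=1) +
                                 (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2))
            d = (np.mean(x) - np.mean(y)) / pooled_sd
            return d * (1 - 3 / (4 * (nx + ny) - 9))  # correction factor J

        patients = np.array([12, 14, 11, 15, 13])   # hypothetical scores
        controls = np.array([16, 18, 17, 19, 15])
        print(hedges_g(patients, controls))         # negative: patients lower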

  11. Agency and facial emotion judgment in context.

    Science.gov (United States)

    Ito, Kenichi; Masuda, Takahiko; Li, Liman Man Wai

    2013-06-01

    Past research showed that East Asians' belief in holism was expressed in their tendency to incorporate background facial emotions into the evaluation of target faces more than North Americans did. However, this pattern can also be interpreted as North Americans' tendency to downplay background facial emotions, owing to their conceptualization of facial emotion as a volitional expression of internal states. Examining this alternative explanation, we investigated whether different types of contextual information produce varying degrees of effect on one's face evaluation across cultures. In three studies, European Canadians and East Asians rated the intensity of target facial emotions surrounded either by affectively salient landscape sceneries or by background facial emotions. The results showed that, although affectively salient landscapes influenced the judgment of both cultural groups, only European Canadians downplayed the background facial emotions. The role of agency as differently conceptualized across cultures and the multilayered systems of cultural meanings are discussed.

  12. Subliminal and Supraliminal Processing of Facial Expression of Emotions: Brain Oscillation in the Left/Right Frontal Area

    OpenAIRE

    Balconi, Michela; Ferrari, Chiara

    2012-01-01

    The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, although it remains questionable whether it is possible to assign a specific function to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored within the right- and left-cortical side when subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects...

  13. Lateralization for Processing Facial Emotions in Gay Men, Heterosexual Men, and Heterosexual Women.

    Science.gov (United States)

    Rahman, Qazi; Yusuf, Sifat

    2015-07-01

    This study tested whether male sexual orientation and gender nonconformity influenced functional cerebral lateralization for the processing of facial emotions. We also tested for the effects of sex of poser and emotion displayed on putative differences. Thirty heterosexual men, 30 heterosexual women, and 40 gay men completed measures of demographic variables, recalled childhood gender nonconformity (CGN), IQ, and the Chimeric Faces Test (CFT). The CFT depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression and performance is measured using a "laterality quotient" (LQ) score. We found that heterosexual men were significantly more right-lateralized when viewing female faces compared to heterosexual women and gay men, who did not differ significantly from each other. Heterosexual women and gay men were more left-lateralized for processing female faces. There were no significant group differences in lateralization for male faces. These results remained when controlling for age and IQ scores. There was no significant effect of CGN on LQ scores. These data suggest that gay men are feminized in some aspects of functional cerebral lateralization for facial emotion. The results were discussed in relation to the selectivity of functional lateralization and putative brain mechanisms underlying sexual attraction towards opposite-sex and same-sex targets.
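
    The CFT's laterality quotient is typically a normalized difference between choices favoring each hemifield; a generic form is sketched below. The exact scoring and sign convention of the CFT version used in the study are not given here, so treat this as an assumption.

        def laterality_quotient(left_choices: int, right_choices: int) -> float:
            """Generic laterality quotient, LQ = (R - L) / (R + L).
            Positive values indicate a rightward bias, negative a leftward
            bias; sign conventions vary across studies."""
            total = left_choices + right_choices
            return (right_choices - left_choices) / total if total else 0.0

        # e.g., 14 trials judged one way, 26 the other:
        print(laterality_quotient(14, 26))  # 0.3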

  14. Remnants and changes in facial emotion processing in women with remitted borderline personality disorder: an EEG study.

    Science.gov (United States)

    Schneider, Isabella; Bertsch, Katja; Izurieta Hidalgo, Natalie A; Müller, Laura E; Schmahl, Christian; Herpertz, Sabine C

    2017-09-27

    According to longitudinal studies, most individuals with borderline personality disorder (BPD) achieve remission. Since BPD is characterized by disturbed emotion recognition, this study investigated behavioral and electrophysiological correlates of facial emotion classification and processing in remitted BPD. 32 women with remitted BPD (rBPD), 32 women with current BPD (cBPD), and 28 healthy women (HC) participated in an emotion classification paradigm comprising blends of angry and happy faces while behavioral and electroencephalographic (event-related potentials) data were recorded. rBPD demonstrated a convergence in behavior towards HC in terms of responses and reaction times. They evaluated maximally ambiguous faces more positively and exhibited faster reaction times when classifying predominantly happy faces compared to cBPD. Group × facial emotion interaction effects were found in early electrophysiological processes with post hoc tests indicating differences between rBPD and cBPD but not between rBPD and HC. However, BPD-like impairments were still found in rBPD in later processing (P300). Our results suggest a reduction in negativity bias in rBPD on the behavioral level and a normalization of earlier stages of facial processing on the neural level, while alterations in later, more cognitive processing do not remit. Early processing may be more state-like, while later impairments may be more trait-like. Further research may need to focus on these stable components.

  15. Patterns of Emotion Experiences as Predictors of Facial Expressions of Emotion.

    Science.gov (United States)

    Blumberg, Samuel H.; Izard, Carroll E.

    1991-01-01

    Examined the relations between emotion and facial expressions of emotion in 8- to 12-year-old male psychiatric patients. Results indicated that patterns or combinations of emotion experiences had an impact on facial expressions of emotion. (Author/BB)

  16. Emotional facial expressions reduce neural adaptation to face identity.

    Science.gov (United States)

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

    In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems sustained with emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record the evoked steady-state visual potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to the same when compared with different face identities. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.
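
    With stimulation at a fixed 3 Hz rate, the ssVEP amplitude is usually read off the amplitude spectrum at exactly that frequency. The sketch below shows that step on simulated single-channel data; the sampling rate, epoch length, and noise level are assumptions.

        import numpy as np

        fs = 250                       # sampling rate in Hz (assumed)
        t = np.arange(0, 10, 1 / fs)   # 10 s of data
        rng = np.random.default_rng(0)
        eeg = np.sin(2 * np.pi * 3 * t) + 0.5 * rng.standard_normal(t.size)

        # Single-sided amplitude spectrum and the 3 Hz bin.
        amps = np.abs(np.fft.rfft(eeg)) / t.size * 2
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        amp_3hz = amps[np.argmin(np.abs(freqs - 3.0))]
        print(f"ssVEP amplitude at 3 Hz: {amp_3hz:.2f} (arbitrary units)")

        # Adaptation is then quantified by comparing this amplitude between
        # same-identity and different-identity stimulation sequences.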

  17. Association of impaired facial affect recognition with basic facial and visual processing deficits in schizophrenia.

    Science.gov (United States)

    Norton, Daniel; McBain, Ryan; Holt, Daphne J; Ongur, Dost; Chen, Yue

    2009-06-15

    Impaired emotion recognition has been reported in schizophrenia, yet the nature of this impairment is not completely understood. Recognition of facial emotion depends on processing affective and nonaffective facial signals, as well as basic visual attributes. We examined whether and how poor facial emotion recognition in schizophrenia is related to basic visual processing and nonaffective face recognition. Schizophrenia patients (n = 32) and healthy control subjects (n = 29) performed emotion discrimination, identity discrimination, and visual contrast detection tasks, where the emotionality, distinctiveness of identity, or visual contrast was systematically manipulated. Subjects determined which of two presentations in a trial contained the target: the emotional face for emotion discrimination, a specific individual for identity discrimination, and a sinusoidal grating for contrast detection. Patients had significantly higher thresholds (worse performance) than control subjects for discriminating both fearful and happy faces. Furthermore, patients' poor performance in fear discrimination was predicted by performance in visual detection and face identity discrimination. Schizophrenia patients require greater emotional signal strength to discriminate fearful or happy face images from neutral ones. Deficient emotion recognition in schizophrenia does not appear to be determined solely by affective processing but is also linked to the processing of basic visual and facial information.
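
    Discrimination thresholds in two-interval tasks like these are often estimated with an adaptive staircase. The sketch below implements a standard 2-down-1-up rule, which converges on the roughly 70.7%-correct level; the paper does not state its exact threshold procedure here, so this is illustrative only.

        import random

        def staircase_2down1up(respond, start=0.5, step=0.05, n_reversals=8):
            """Return the mean stimulus level at reversal points.
            `respond(level)` should return True on correct trials."""
            level, run, direction, reversals = start, 0, 0, []
            while len(reversals) < n_reversals:
                if respond(level):
                    run += 1
                    if run == 2:            # two correct in a row: harder
                        run = 0
                        if direction == +1:
                            reversals.append(level)
                        level, direction = max(level - step, step), -1
                else:                       # one error: easier
                    run = 0
                    if direction == -1:
                        reversals.append(level)
                    level, direction = level + step, +1
            return sum(reversals) / len(reversals)

        # Toy observer whose true threshold is 0.3 (chance = 0.5 below it):
        obs = lambda c: random.random() < (1.0 if c > 0.3 else 0.5)
        print(staircase_2down1up(obs))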

  18. A small-world network model of facial emotion recognition.

    Science.gov (United States)

    Takehara, Takuma; Ochiai, Fumio; Suzuki, Naoto

    2016-01-01

    Various models have been proposed to increase understanding of the cognitive basis of facial emotions. Despite those efforts, interactions between facial emotions have received minimal attention. If collective behaviours relating to each facial emotion in the comprehensive cognitive system could be assumed, specific facial emotion relationship patterns might emerge. In this study, we demonstrate that the frameworks of complex networks can effectively capture those patterns. We generate 81 facial emotion images (6 prototypes and 75 morphs) and then ask participants to rate degrees of similarity in 3240 facial emotion pairs in a paired comparison task. A facial emotion network constructed on the basis of similarity clearly forms a small-world network, which features an extremely short average network distance and close connectivity. Further, even if two facial emotions have opposing valences, they are connected within only two steps. In addition, we show that intermediary morphs are crucial for maintaining full network integration, whereas prototypes are not at all important. These results suggest the existence of collective behaviours in the cognitive systems of facial emotions and also describe why people can efficiently recognize facial emotions in terms of information transmission and propagation. For comparison, we construct three simulated networks--one based on the categorical model, one based on the dimensional model, and one random network. The results reveal that small-world connectivity in facial emotion networks is apparently different from those networks, suggesting that a small-world network is the most suitable model for capturing the cognitive basis of facial emotions.
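
    Small-worldness is conventionally diagnosed by comparing the network's clustering coefficient and average shortest path length against a random graph with the same numbers of nodes and edges. The sketch below computes that comparison with networkx on a stand-in graph; the study's actual facial-emotion network is not reproduced here.

        import networkx as nx

        g = nx.connected_watts_strogatz_graph(n=81, k=6, p=0.1, seed=0)  # stand-in
        rand = nx.gnm_random_graph(g.number_of_nodes(), g.number_of_edges(), seed=0)
        if not nx.is_connected(rand):  # path length requires a connected graph
            rand = rand.subgraph(max(nx.connected_components(rand), key=len)).copy()

        C, L = nx.average_clustering(g), nx.average_shortest_path_length(g)
        Cr, Lr = nx.average_clustering(rand), nx.average_shortest_path_length(rand)

        # Small-world signature: C well above Cr while L stays close to Lr,
        # i.e. sigma = (C/Cr) / (L/Lr) clearly greater than 1.
        print(f"sigma = {(C / max(Cr, 1e-9)) / (L / Lr):.2f}")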

  19. Computerised analysis of facial emotion expression in eating disorders

    Science.gov (United States)

    2017-01-01

    Background Problems with social-emotional processing are known to be an important contributor to the development and maintenance of eating disorders (EDs). Diminished facial communication of emotion has been frequently reported in individuals with anorexia nervosa (AN). Less is known about facial expressivity in bulimia nervosa (BN) and in people who have recovered from AN (RecAN). This study aimed to pilot the use of computerised facial expression analysis software to investigate emotion expression across the ED spectrum and recovery in a large sample of participants. Method 297 participants with AN, BN, RecAN, and healthy controls were recruited. Participants watched film clips designed to elicit happy or sad emotions, and facial expressions were then analysed using FaceReader. Results The findings mirrored those from previous work showing that healthy control and RecAN participants expressed significantly more positive emotions during the positive clip compared to the AN group. There were no differences in emotion expression during the sad film clip. Discussion These findings support the use of computerised methods to analyse emotion expression in EDs. The findings also demonstrate that reduced positive emotion expression is likely to be associated with the acute stage of AN illness, with individuals with BN showing an intermediate profile. PMID:28575109

  20. Computerised analysis of facial emotion expression in eating disorders.

    Science.gov (United States)

    Leppanen, Jenni; Dapelo, Marcela Marin; Davies, Helen; Lang, Katie; Treasure, Janet; Tchanturia, Kate

    2017-01-01

    Problems with social-emotional processing are known to be an important contributor to the development and maintenance of eating disorders (EDs). Diminished facial communication of emotion has been frequently reported in individuals with anorexia nervosa (AN). Less is known about facial expressivity in bulimia nervosa (BN) and in people who have recovered from AN (RecAN). This study aimed to pilot the use of computerised facial expression analysis software to investigate emotion expression across the ED spectrum and recovery in a large sample of participants. 297 participants with AN, BN, RecAN, and healthy controls were recruited. Participants watched film clips designed to elicit happy or sad emotions, and facial expressions were then analysed using FaceReader. The findings mirrored those from previous work showing that healthy control and RecAN participants expressed significantly more positive emotions during the positive clip compared to the AN group. There were no differences in emotion expression during the sad film clip. These findings support the use of computerised methods to analyse emotion expression in EDs. The findings also demonstrate that reduced positive emotion expression is likely to be associated with the acute stage of AN illness, with individuals with BN showing an intermediate profile.
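
    Frame-level output from expression-analysis software such as FaceReader is typically aggregated into one score per participant and clip before group comparison. The sketch below shows that aggregation step with pandas; the column names and values are hypothetical, not FaceReader's actual output format.

        import pandas as pd

        frames = pd.DataFrame({
            "participant": [1, 1, 1, 2, 2, 2],
            "clip": ["happy", "happy", "sad", "happy", "happy", "sad"],
            "happy_intensity": [0.8, 0.6, 0.1, 0.2, 0.3, 0.0],
        })

        # Mean expressed happiness per participant within each clip type;
        # these summary scores are then compared across AN, BN, RecAN and
        # control groups.
        scores = frames.groupby(["participant", "clip"])["happy_intensity"].mean()
        print(scores)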

  1. Facial EMG responses to emotional expressions are related to emotion perception ability.

    Science.gov (United States)

    Künecke, Janina; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Wilhelm, Oliver

    2014-01-01

    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses, recorded with electromyography (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii activity in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.
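
    The factor structure described above separates a general response to all faces from emotion-related activity. The sketch below gives a deliberately simplified numeric analogue of that decomposition (a per-person mean plus emotion-specific deviations); it is not the confirmatory factor analysis the authors actually fitted, and the data are hypothetical.

        import numpy as np

        # Hypothetical mean corrugator responses (z-scores):
        # rows = participants; columns = angry, happy, sad, neutral.
        emg = np.array([
            [0.9, -0.2, 0.5, 0.1],
            [0.6, -0.4, 0.3, 0.0],
            [1.2,  0.1, 0.8, 0.3],
        ])

        general = emg.mean(axis=1, keepdims=True)  # response common to all faces
        emotion_specific = emg - general           # emotion-related component
        print(emotion_specific.round(2))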

  2. Spontaneous Facial Mimicry Is Enhanced by the Goal of Inferring Emotional States: Evidence for Moderation of "Automatic" Mimicry by Higher Cognitive Processes.

    Science.gov (United States)

    Murata, Aiko; Saito, Hisamichi; Schug, Joanna; Ogawa, Kenji; Kameda, Tatsuya

    2016-01-01

    A number of studies have shown that individuals often spontaneously mimic the facial expressions of others, a tendency known as facial mimicry. This tendency has generally been considered a reflex-like "automatic" response, but several recent studies have shown that the degree of mimicry may be moderated by contextual information. However, the cognitive and motivational factors underlying the contextual moderation of facial mimicry require further empirical investigation. In this study, we present evidence that the degree to which participants spontaneously mimic a target's facial expressions depends on whether participants are motivated to infer the target's emotional state. In the first study we show that facial mimicry, assessed by facial electromyography, occurs more frequently when participants are specifically instructed to infer a target's emotional state than when given no instruction. In the second study, we replicate this effect using the Facial Action Coding System to show that participants are more likely to mimic facial expressions of emotion when they are asked to infer the target's emotional state, rather than make inferences about a physical trait unrelated to emotion. These results provide convergent evidence that the explicit goal of understanding a target's emotional state affects the degree of facial mimicry shown by the perceiver, suggesting moderation of reflex-like motor activities by higher cognitive processes.

  3. Facial Emotion Recognition and Expression in Parkinson's Disease: An Emotional Mirror Mechanism?

    Science.gov (United States)

    Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J; Kilner, James

    2017-01-01

    Parkinson's disease (PD) patients have impairment of facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants, in order to explore the relationship between these two abilities and any differences between the two groups of participants. Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer screen in pseudo-random fashion and to identify the emotional label in a six-forced-choice response format (emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial the participant was asked to rate his/her confidence in his/her perceived accuracy of response. For emotion recognition, PD patients scored lower than HC on the Ekman total score and on the sub-scores for happiness, fear, anger and sadness. In the facial emotion expressivity task, PD and HC differed significantly in the total score (p = 0.05) and in the sub-scores for happiness, sadness and anger. There was a significant positive correlation between facial emotion recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). PD patients showed difficulties in recognizing emotional…

  4. The recognition of facial emotion expressions in Parkinson's disease.

    Science.gov (United States)

    Assogna, Francesca; Pontieri, Francesco E; Caltagirone, Carlo; Spalletta, Gianfranco

    2008-11-01

    A limited number of studies in Parkinson's Disease (PD) suggest a disturbance of recognition of facial emotion expressions. In particular, disgust recognition impairment has been reported in unmedicated and medicated PD patients. However, the results are rather inconclusive in the definition of the degree and the selectivity of emotion recognition impairment, and an associated impairment of almost all basic facial emotions in PD is also described. Few studies have investigated the relationship with neuropsychiatric and neuropsychological symptoms with mainly negative results. This inconsistency may be due to many different problems, such as emotion assessment, perception deficit, cognitive impairment, behavioral symptoms, illness severity and antiparkinsonian therapy. Here we review the clinical characteristics and neural structures involved in the recognition of specific facial emotion expressions, and the plausible role of dopamine transmission and dopamine replacement therapy in these processes. It is clear that future studies should be directed to clarify all these issues.

  5. Examining speed of processing of facial emotion recognition in individuals at ultra-high risk for psychosis

    DEFF Research Database (Denmark)

    Glenthøj, Louise Birkedal; Fagerlund, Birgitte; Bak, Nikolaj

    2018-01-01

    Emotion recognition is an aspect of social cognition that may be a key predictor of functioning and of transition to psychosis in individuals at ultra-high risk (UHR) for psychosis (Allott et al., 2014). UHR individuals exhibit deficits in accurately identifying facial emotions (van Donkersgoed et al., 2015), but other potential anomalies in facial emotion recognition are largely unexplored. This study aimed to extend current knowledge on emotion recognition deficits in UHR individuals by examining: 1) whether UHR individuals would display significantly slower facial emotion recognition than healthy controls, 2) whether an association between emotion recognition accuracy and emotion recognition latency is present in UHR, and 3) the relationships between emotion recognition accuracy, neurocognition and psychopathology in UHR.

  6. Facial EMG responses to emotional expressions are related to emotion perception ability.

    Directory of Open Access Journals (Sweden)

    Janina Künecke

    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses, recorded with electromyography (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii activity in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

  7. [Impact of facial emotional recognition alterations in Dementia of the Alzheimer type].

    Science.gov (United States)

    Rubinstein, Wanda; Cossini, Florencia; Politis, Daniel

    2016-07-01

    The recognition of basic emotions from faces is independent of other deficits in dementia of the Alzheimer type. Among these deficits, there is disagreement about which emotions are more difficult to recognize. Our aim was to study the presence of alterations in the process of facial recognition of basic emotions, and to investigate whether there were differences in the recognition of each type of emotion in Alzheimer's disease. Using three tests of recognition of basic facial emotions, we evaluated 29 patients who had been diagnosed with dementia of the Alzheimer type and 18 control subjects. Significant differences were obtained on the tests of recognition of basic facial emotions, and between the individual emotions. Since the amygdala, one of the brain structures responsible for emotional reactions, is affected in the early stages of this disease, our findings are relevant to understanding how this alteration of the process of emotional recognition contributes to the difficulties these patients have in interpersonal relations and to their behavioral disorders.

  8. Effects of Repeated Concussions and Sex on Early Processing of Emotional Facial Expressions as Revealed by Electrophysiology.

    Science.gov (United States)

    Carrier-Toutant, Frédérike; Guay, Samuel; Beaulieu, Christelle; Léveillé, Édith; Turcotte-Giroux, Alexandre; Papineau, Samaël D; Brisson, Benoit; D'Hondt, Fabien; De Beaumont, Louis

    2018-05-06

    Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussion athletes during an EFE identification task. Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented, and the emotion depicted had to be identified as fast as possible during EEG acquisition. Relative to controls, concussed athletes of both sexes exhibited a significant suppression of the P1 amplitude recorded over the dominant right hemisphere while performing the emotional facial expression identification task. The present study also highlighted a sex-specific suppression of the N1 component amplitude after concussion, which affected male athletes. These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFEs in early stages of EFE processing, and that the neurophysiological mechanisms underlying the processing of emotional stimuli are distinctively affected across the sexes. (JINS, 2018, 24, 1-11).
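
    P1 and N1 amplitudes of this kind are usually quantified as the extreme voltage of the trial-averaged waveform within an a-priori latency window. A minimal sketch follows; the sampling rate, window boundaries, and the simulated epochs are assumptions, not the study's parameters.

        import numpy as np

        fs = 500                                   # Hz (assumed)
        rng = np.random.default_rng(0)
        epochs = rng.standard_normal((40, fs))     # 40 trials x 1 s post-stimulus
        erp = epochs.mean(axis=0)                  # average across trials
        times = np.arange(erp.size) / fs * 1000    # ms

        def window_peak(lo_ms, hi_ms, polarity):
            seg = erp[(times >= lo_ms) & (times <= hi_ms)]
            return seg.max() if polarity == "pos" else seg.min()

        p1 = window_peak(80, 130, "pos")    # positive deflection near 100 ms
        n1 = window_peak(130, 200, "neg")   # negative deflection near 170 ms
        print(f"P1 = {p1:.2f} uV, N1 = {n1:.2f} uV")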

  9. Does Facial Amimia Impact the Recognition of Facial Emotions? An EMG Study in Parkinson’s Disease

    Science.gov (United States)

    Argaud, Soizic; Delplanque, Sylvain; Houvenaghel, Jean-François; Auffret, Manon; Duprez, Joan; Vérin, Marc; Grandjean, Didier; Sauleau, Paul

    2016-01-01

    According to embodied simulation theory, understanding other people’s emotions is fostered by facial mimicry. However, studies assessing the effect of facial mimicry on the recognition of emotion are still controversial. In Parkinson’s disease (PD), one of the most distinctive clinical features is facial amimia, a reduction in facial expressiveness, but patients also show emotional disturbances. The present study used the pathological model of PD to examine the role of facial mimicry on emotion recognition by investigating EMG responses in PD patients during a facial emotion recognition task (anger, joy, neutral). Our results evidenced a significant decrease in facial mimicry for joy in PD, essentially linked to the absence of reaction of the zygomaticus major and the orbicularis oculi muscles in response to happy avatars, whereas facial mimicry for expressions of anger was relatively preserved. We also confirmed that PD patients were less accurate in recognizing positive and neutral facial expressions and highlighted a beneficial effect of facial mimicry on the recognition of emotion. We thus provide additional arguments for embodied simulation theory suggesting that facial mimicry is a potential lever for therapeutic actions in PD even if it seems not to be necessarily required in recognizing emotion as such. PMID:27467393

  11. Incongruence Between Observers’ and Observed Facial Muscle Activation Reduces Recognition of Emotional Facial Expressions From Video Stimuli

    Directory of Open Access Journals (Sweden)

    Tanja S. H. Wingenbach

    2018-06-01

    According to embodied cognition accounts, viewing others' facial emotion can elicit the respective emotion representation in observers, which entails simulations of sensory, motor, and contextual experiences. In line with that, published research found viewing others' facial emotion to elicit automatic matched facial muscle activation, which was further found to facilitate emotion recognition. Perhaps making congruent facial muscle activity explicit produces an even greater recognition advantage. If there is conflicting sensory information, i.e., incongruent facial muscle activity, this might impede recognition. The effects of actively manipulating facial muscle activity on facial emotion recognition from videos were investigated across three experimental conditions: (a) explicit imitation of viewed facial emotional expressions (stimulus-congruent condition), (b) pen-holding with the lips (stimulus-incongruent condition), and (c) passive viewing (control condition). It was hypothesised that (1) experimental conditions (a) and (b) result in greater facial muscle activity than (c), (2) experimental condition (a) increases emotion recognition accuracy from others' faces compared to (c), and (3) experimental condition (b) lowers recognition accuracy for expressions with a salient facial feature in the lower, but not the upper, face area compared to (c). Participants (42 males, 42 females) underwent a facial emotion recognition experiment (ADFES-BIV) while electromyography (EMG) was recorded from five facial muscle sites. The order of the experimental conditions was counter-balanced. Pen-holding caused stimulus-incongruent facial muscle activity for expressions with facial feature saliency in the lower face region, which reduced recognition of lower face region emotions. Explicit imitation caused stimulus-congruent facial muscle activity without modulating recognition. Methodological implications are discussed.

  12. Incongruence Between Observers' and Observed Facial Muscle Activation Reduces Recognition of Emotional Facial Expressions From Video Stimuli.

    Science.gov (United States)

    Wingenbach, Tanja S H; Brosnan, Mark; Pfaltz, Monique C; Plichta, Michael M; Ashwin, Chris

    2018-01-01

    According to embodied cognition accounts, viewing others' facial emotion can elicit the respective emotion representation in observers which entails simulations of sensory, motor, and contextual experiences. In line with that, published research found viewing others' facial emotion to elicit automatic matched facial muscle activation, which was further found to facilitate emotion recognition. Perhaps making congruent facial muscle activity explicit produces an even greater recognition advantage. If there is conflicting sensory information, i.e., incongruent facial muscle activity, this might impede recognition. The effects of actively manipulating facial muscle activity on facial emotion recognition from videos were investigated across three experimental conditions: (a) explicit imitation of viewed facial emotional expressions (stimulus-congruent condition), (b) pen-holding with the lips (stimulus-incongruent condition), and (c) passive viewing (control condition). It was hypothesised that (1) experimental condition (a) and (b) result in greater facial muscle activity than (c), (2) experimental condition (a) increases emotion recognition accuracy from others' faces compared to (c), (3) experimental condition (b) lowers recognition accuracy for expressions with a salient facial feature in the lower, but not the upper face area, compared to (c). Participants (42 males, 42 females) underwent a facial emotion recognition experiment (ADFES-BIV) while electromyography (EMG) was recorded from five facial muscle sites. The experimental conditions' order was counter-balanced. Pen-holding caused stimulus-incongruent facial muscle activity for expressions with facial feature saliency in the lower face region, which reduced recognition of lower face region emotions. Explicit imitation caused stimulus-congruent facial muscle activity without modulating recognition. Methodological implications are discussed.

  14. Emotion elicitor or emotion messenger? Subliminal priming reveals two faces of facial expressions.

    Science.gov (United States)

    Ruys, Kirsten I; Stapel, Diederik A

    2008-06-01

    Facial emotional expressions can serve both as emotional stimuli and as communicative signals. The research reported here was conducted to illustrate how responses to both roles of facial emotional expressions unfold over time. As an emotion elicitor, a facial emotional expression (e.g., a disgusted face) activates a response that is similar to responses to other emotional stimuli of the same valence (e.g., a dirty, nonflushed toilet). As an emotion messenger, the same facial expression (e.g., a disgusted face) serves as a communicative signal by also activating the knowledge that the sender is experiencing a specific emotion (e.g., the sender feels disgusted). By varying the duration of exposure to disgusted, fearful, angry, and neutral faces in two subliminal-priming studies, we demonstrated that responses to faces as emotion elicitors occur prior to responses to faces as emotion messengers, and that both types of responses may unfold unconsciously.

  15. Brain correlates of musical and facial emotion recognition: evidence from the dementias.

    Science.gov (United States)

    Hsieh, S; Hornberger, M; Piguet, O; Hodges, J R

    2012-07-01

    The recognition of facial expressions of emotion is impaired in semantic dementia (SD) and is associated with right-sided brain atrophy in areas known to be involved in emotion processing, notably the amygdala. Whether patients with SD also experience difficulty recognizing emotions conveyed by other media, such as music, is unclear. Prior studies have used excerpts of known music from classical or film repertoire but not unfamiliar melodies designed to convey distinct emotions. Patients with SD (n = 11), Alzheimer's disease (n = 12) and healthy control participants (n = 20) underwent tests of emotion recognition in two modalities: unfamiliar musical tunes and unknown faces as well as volumetric MRI. Patients with SD were most impaired with the recognition of facial and musical emotions, particularly for negative emotions. Voxel-based morphometry showed that the labelling of emotions, regardless of modality, correlated with the degree of atrophy in the right temporal pole, amygdala and insula. The recognition of musical (but not facial) emotions was also associated with atrophy of the left anterior and inferior temporal lobe, which overlapped with regions correlating with standardized measures of verbal semantic memory. These findings highlight the common neural substrates supporting the processing of emotions by facial and musical stimuli but also indicate that the recognition of emotions from music draws upon brain regions that are associated with semantics in language. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Time perception and dynamics of facial expressions of emotions.

    Directory of Open Access Journals (Sweden)

    Sophie L Fayolle

    Two experiments were run to examine the effects of dynamic displays of facial expressions of emotions on time judgments. The participants were given a temporal bisection task with emotional facial expressions presented in a dynamic or a static display. Two emotional facial expressions and a neutral expression were tested and compared. Each of the emotional expressions had the same affective valence (unpleasant), but one was high-arousing (expressing anger) and the other low-arousing (expressing sadness). Our results showed that time judgments are highly sensitive to movement in facial expressions and to the emotions expressed. Indeed, longer perceived durations were found in response to the dynamic faces and the high-arousing emotional expressions compared to the static faces and low-arousing expressions. In addition, the facial movements amplified the effect of emotions on time perception. Dynamic facial expressions are thus interesting tools for examining variations in temporal judgments in different social contexts.
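
    In a temporal bisection task, the key summary statistic is the bisection point: the duration judged "long" on half the trials. The sketch below estimates it by fitting a logistic psychometric function with SciPy; the durations and response proportions are hypothetical.

        import numpy as np
        from scipy.optimize import curve_fit

        durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])  # ms
        p_long = np.array([0.05, 0.15, 0.35, 0.55, 0.75, 0.90, 0.97])

        def logistic(x, bp, slope):
            return 1.0 / (1.0 + np.exp(-(x - bp) / slope))

        (bp, slope), _ = curve_fit(logistic, durations, p_long, p0=[1000, 100])
        print(f"bisection point ~ {bp:.0f} ms")

        # A leftward shift of the bisection point for dynamic or high-arousing
        # faces indicates that those stimuli are perceived as lasting longer.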

  17. Unsupervised learning of facial emotion decoding skills.

    Science.gov (United States)

    Huelle, Jan O; Sack, Benjamin; Broer, Katja; Komlewa, Irina; Anders, Silke

    2014-01-01

    Research on the mechanisms underlying human facial emotion recognition has long focussed on genetically determined neural algorithms and often neglected the question of how these algorithms might be tuned by social learning. Here we show that facial emotion decoding skills can be significantly and sustainably improved by practice without an external teaching signal. Participants saw video clips of dynamic facial expressions of five different women and were asked to decide which of four possible emotions (anger, disgust, fear, and sadness) was shown in each clip. Although no external information about the correctness of the participant's response or the sender's true affective state was provided, participants showed a significant increase of facial emotion recognition accuracy both within and across two training sessions two days to several weeks apart. We discuss several similarities and differences between the unsupervised improvement of facial decoding skills observed in the current study, unsupervised perceptual learning of simple stimuli described in previous studies and practice effects often observed in cognitive tasks.

  18. Impaired recognition of facial emotions from low-spatial frequencies in Asperger syndrome.

    Science.gov (United States)

    Kätsyri, Jari; Saalasti, Satu; Tiippana, Kaisa; von Wendt, Lennart; Sams, Mikko

    2008-01-01

    The theory of 'weak central coherence' [Happe, F., & Frith, U. (2006). The weak coherence account: Detail-focused cognitive style in autism spectrum disorders. Journal of Autism and Developmental Disorders, 36(1), 5-25] implies that persons with autism spectrum disorders (ASDs) have a perceptual bias for local but not for global stimulus features. The recognition of emotional facial expressions representing varying levels of detail has not previously been studied in ASDs. We analyzed the recognition of four basic emotional facial expressions (anger, disgust, fear and happiness) from low-spatial frequencies (overall global shapes without local features) in adults with an ASD. A group of 20 participants with Asperger syndrome (AS) was compared to a group of non-autistic age- and sex-matched controls. Emotion recognition was tested from static and dynamic facial expressions whose spatial frequency contents had been manipulated by low-pass filtering at two levels. The two groups recognized emotions similarly from non-filtered faces and from dynamic vs. static facial expressions. In contrast, the participants with AS were less accurate than controls in recognizing facial emotions from very low-spatial frequencies. The results suggest intact recognition of basic facial emotions and dynamic facial information, but impaired visual processing of global features in ASDs.
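    The low-pass manipulation described above can be approximated with a frequency-domain filter. The following sketch keeps only spatial frequencies below a chosen cutoff in cycles per degree; the cutoff, the assumed viewing size, and the NumPy implementation are illustrative, not the study's settings.

    ```python
    # Hypothetical low-pass filter; cutoff and viewing size are invented.
    import numpy as np

    def lowpass(image, cutoff_cpd, image_span_deg):
        """Zero out spatial frequencies above `cutoff_cpd` cycles per degree,
        assuming the image spans `image_span_deg` degrees of visual angle."""
        h, w = image.shape
        fy = np.fft.fftfreq(h)[:, None] * h / image_span_deg  # cycles/degree
        fx = np.fft.fftfreq(w)[None, :] * w / image_span_deg
        keep = np.sqrt(fx ** 2 + fy ** 2) <= cutoff_cpd
        return np.real(np.fft.ifft2(np.fft.fft2(image) * keep))

    face = np.random.rand(256, 256)                 # stand-in grayscale face
    very_low_sf = lowpass(face, cutoff_cpd=2.0, image_span_deg=10.0)
    ```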

  19. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing.

    Science.gov (United States)

    Balconi, Michela; Canavesio, Ylenia

    2016-01-01

    The present research explored the effect of social empathy on the processing of emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (zygomatic and corrugator muscle activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assume that empathy influences both cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.

  20. Developmental changes in the primacy of facial cues for emotion recognition.

    Science.gov (United States)

    Leitzke, Brian T; Pollak, Seth D

    2016-04-01

    There have been long-standing differences of opinion regarding the influence of the face relative to that of contextual information on how individuals process and judge facial expressions of emotion. However, developmental changes in how individuals use such information have remained largely unexplored and could be informative in attempting to reconcile these opposing views. The current study tested for age-related differences in how individuals prioritize viewing emotional faces versus contexts when making emotion judgments. To do so, we asked 4-, 8-, and 12-year-old children as well as college students to categorize facial expressions of emotion that were presented with scenes that were either congruent or incongruent with the facial displays. During this time, we recorded participants' gaze patterns via eye tracking. College students directed their visual attention primarily to the face, regardless of contextual information. Children, however, divided their attention between both the face and the context as sources of emotional information depending on the valence of the context. These findings reveal a developmental shift in how individuals process and integrate emotional cues. (c) 2016 APA, all rights reserved.

  1. P2-28: An Amplification of Feedback from Facial Muscles Strengthened Sympathetic Activations to Emotional Facial Cues

    Directory of Open Access Journals (Sweden)

    Younbyoung Chae

    2012-10-01

    The facial feedback hypothesis suggests that feedback from cutaneous and muscular afferents influences our emotions during the control of facial expressions. Enhanced facial expressiveness is correlated with increased autonomic arousal and self-reported emotional experience, while limited facial expression attenuates these responses. The present study was aimed at investigating the difference in emotional response between imitated and merely observed facial expressions. For this, we measured the facial electromyogram of the corrugator muscle as well as the skin conductance response (SCR) while participants were either imitating or simply observing emotional facial expressions. We found that participants produced significantly greater facial electromyogram activation during imitation than during observation of angry faces. Similarly, they exhibited significantly greater SCR during imitation of angry faces than during observation. An amplification of feedback from facial muscles during imitation strengthened sympathetic activation to negative emotional cues. These findings suggest that manipulations of muscular feedback could modulate the bodily expression of emotion and perhaps also the emotional response itself.

  2. The MPI facial expression database--a validated database of emotional and conversational facial expressions.

    Directory of Open Access Journals (Sweden)

    Kathrin Kaulard

    The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on every-day scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions.

  3. Recognition of facial expressions and prosodic cues with graded emotional intensities in adults with Asperger syndrome.

    Science.gov (United States)

    Doi, Hirokazu; Fujisawa, Takashi X; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki

    2013-09-01

    This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group difference in facial expression recognition was prominent for stimuli with low or intermediate emotional intensities. In contrast to this, the individuals with Asperger syndrome exhibited lower recognition accuracy than typically developed controls mainly for emotional prosody with high emotional intensity. In facial expression recognition, Asperger and control groups showed an inversion effect for all categories. The magnitude of this effect was less in the Asperger group for angry and sad expressions, presumably attributable to reduced recruitment of the configural mode of face processing. The individuals with Asperger syndrome outperformed the control participants in recognizing inverted sad expressions, indicating enhanced processing of local facial information representing sad emotion. These results suggest that the adults with Asperger syndrome rely on modality-specific strategies in emotion recognition from facial expression and prosodic information.

  4. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    Science.gov (United States)

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

    The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter has been assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Facial EMG responses to dynamic emotional facial expressions in boys with disruptive behavior disorders

    NARCIS (Netherlands)

    Wied, de M.; Boxtel, van Anton; Zaalberg, R.; Goudena, P.P.; Matthys, W.

    2006-01-01

    Based on the assumption that facial mimicry is a key factor in emotional empathy, and clinical observations that children with disruptive behavior disorders (DBD) are weak empathizers, the present study explored whether DBD boys are less facially responsive to facial expressions of emotions than

  6. Unsupervised learning of facial emotion decoding skills

    Directory of Open Access Journals (Sweden)

    Jan Oliver Huelle

    2014-02-01

    Research on the mechanisms underlying human facial emotion recognition has long focussed on genetically determined neural algorithms and often neglected the question of how these algorithms might be tuned by social learning. Here we show that facial emotion decoding skills can be significantly and sustainably improved by practise without an external teaching signal. Participants saw video clips of dynamic facial expressions of five different women and were asked to decide which of four possible emotions (anger, disgust, fear and sadness) was shown in each clip. Although no external information about the correctness of the participant’s response or the sender’s true affective state was provided, participants showed a significant increase of facial emotion recognition accuracy both within and across two training sessions two days to several weeks apart. We discuss several similarities and differences between the unsupervised improvement of facial decoding skills observed in the current study, unsupervised perceptual learning of simple stimuli described in previous studies and practise effects often observed in cognitive tasks.

  7. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    Science.gov (United States)

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability of identifying, processing and regulating emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and the SMC activity during social behavior is region- and emotion-specific.

  8. A facial expression of pax: Assessing children's "recognition" of emotion from faces.

    Science.gov (United States)

    Nelson, Nicole L; Russell, James A

    2016-01-01

    In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks (choice from array and free labeling) to support the conclusion that children recognize basic emotions from facial expressions. Here, five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition when a novel nonsense facial expression is included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination; they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Perception of emotional facial expressions in individuals with high Autism-spectrum Quotient (AQ)

    Directory of Open Access Journals (Sweden)

    Ervin Poljac

    2012-10-01

    Autism is characterized by difficulties in social interaction, communication, restrictive and repetitive behaviours and specific impairments in emotional processing. The present study employed the Autism Spectrum Quotient (Baron-Cohen et al. 2006) to quantify autistic traits in a group of 260 healthy individuals and to investigate whether this measure is related to the perception of facial emotional expressions. The emotional processing of twelve participants who scored significantly higher than average on the AQ was compared to that of twelve participants with significantly lower AQ scores. Perception of emotional expressions was estimated with the Facial Recognition Task (Montagne et al. 2007). There were significant differences between the two groups with regard to accuracy and sensitivity of the perception of emotional facial expressions. Specifically, the group with high AQ scores was less accurate and needed higher emotional content to recognize emotions of anger, disgust, happiness and sadness. This result implies a selective impairment that might be helpful in understanding the psychopathology of autism spectrum disorders.

  10. When the mask falls: the role of facial motor resonance in memory for emotional language.

    Science.gov (United States)

    Baumeister, Jenny-Charlotte; Rumiati, Raffaella Ida; Foroni, Francesco

    2015-02-01

    The recognition and interpretation of emotional information (e.g., about happiness) has been shown to elicit, amongst other bodily reactions, spontaneous facial expressions occurring in accordance with the relevant emotion (e.g. a smile). Theories of embodied cognition act on the assumption that such embodied simulations are not only an accessorial, but a crucial factor in the processing of emotional information. While several studies have confirmed the importance of facial motor resonance during the initial recognition of emotional information, its role at later stages of processing, such as during memory for emotional content, remains unexplored. The present study bridges this gap by exploring the impact of facial motor resonance on the retrieval of emotional stimuli. In a novel approach, the specific effects of embodied simulations were investigated at different stages of emotional memory processing (during encoding and/or retrieval). Eighty participants underwent a memory task involving emotional and neutral words consisting of an encoding and retrieval phase. Depending on the experimental condition, facial muscles were blocked by a hardening facial mask either during encoding, during retrieval, during both encoding and retrieval, or were left free to resonate (control). The results demonstrate that not only initial recognition but also memory of emotional items benefits from embodied simulations occurring during their encoding and retrieval. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. An Age-Related Dissociation of Short-Term Memory for Facial Identity and Facial Emotional Expression.

    Science.gov (United States)

    Hartley, Alan A; Ravich, Zoe; Stringer, Sarah; Wiley, Katherine

    2015-09-01

    Memory for both facial emotional expression and facial identity was explored in younger and older adults in 3 experiments using a delayed match-to-sample procedure. Memory sets of 1, 2, or 3 faces were presented, which were followed by a probe after a 3-s retention interval. There was very little difference between younger and older adults in memory for emotional expressions, but memory for identity was substantially impaired in the older adults. Possible explanations for spared memory for emotional expressions include socioemotional selectivity theory as well as the existence of overlapping yet distinct brain networks for processing of different emotions. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Emotion unfolded by motion: a role for parietal lobe in decoding dynamic facial expressions.

    Science.gov (United States)

    Sarkheil, Pegah; Goebel, Rainer; Schneider, Frank; Mathiak, Klaus

    2013-12-01

    Facial expressions convey important emotional and social information and are frequently applied in investigations of human affective processing. Dynamic faces may provide higher ecological validity to examine perceptual and cognitive processing of facial expressions. Higher order processing of emotional faces was addressed by varying the task and virtual face models systematically. Blood oxygenation level-dependent activation was assessed using functional magnetic resonance imaging in 20 healthy volunteers while viewing and evaluating either emotion or gender intensity of dynamic face stimuli. A general linear model analysis revealed that high valence activated a network of motion-responsive areas, indicating that visual motion areas support perceptual coding for the motion-based intensity of facial expressions. The comparison of the emotion task with the gender discrimination task revealed increased activation of the inferior parietal lobule, which highlights the involvement of parietal areas in processing high-level features of faces. Dynamic emotional stimuli may help to emphasize functions of the hypothesized 'extended' over the 'core' system for face processing.

  13. Expression intensity, gender and facial emotion recognition: Women recognize only subtle facial emotions better than men.

    Science.gov (United States)

    Hoffmann, Holger; Kessler, Henrik; Eppel, Tobias; Rukavina, Stefanie; Traue, Harald C

    2010-11-01

    Two experiments were conducted in order to investigate the effect of expression intensity on gender differences in the recognition of facial emotions. The first experiment compared recognition accuracy between female and male participants when emotional faces were shown with full-blown (100% emotional content) or subtle expressiveness (50%). In a second experiment more finely grained analyses were applied in order to measure recognition accuracy as a function of expression intensity (40%-100%). The results show that although women were more accurate than men in recognizing subtle facial displays of emotion, there was no difference between male and female participants when recognizing highly expressive stimuli. Copyright © 2010 Elsevier B.V. All rights reserved.

  14. The MPI Facial Expression Database — A Validated Database of Emotional and Conversational Facial Expressions

    Science.gov (United States)

    Kaulard, Kathrin; Cunningham, Douglas W.; Bülthoff, Heinrich H.; Wallraven, Christian

    2012-01-01

    The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on every-day scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions.

  15. Dynamic Facial Expression of Emotion Made Easy

    OpenAIRE

    Broekens, Joost; Qu, Chao; Brinkman, Willem-Paul

    2012-01-01

    Facial emotion expression for virtual characters is used in a wide variety of areas. Often, the primary reason to use emotion expression is not to study emotion expression generation per se, but to use emotion expression in an application or research project. What is then needed is an easy to use and flexible, but also validated mechanism to do so. In this report we present such a mechanism. It enables developers to build virtual characters with dynamic affective facial expressions. The mecha...

  16. Effects of sad mood on facial emotion recognition in Chinese people.

    Science.gov (United States)

    Lee, Tatia M C; Ng, Emily H H; Tang, S W; Chan, Chetwyn C H

    2008-05-30

    This study examined the influence of sad mood on the judgment of ambiguous facial emotion expressions among 47 healthy volunteers who had been induced to feel sad (n=13), neutral (n=15), or happy (n=19) by watching video clips. The findings suggest that when the targets were ambiguous, participants in a sad mood tended to classify them into negative rather than positive emotional categories. This observation indicates that an emotion-specific negative bias in the judgment of facial expressions is associated with sad mood; the finding argues against a general impairment in decoding facial expressions. Furthermore, the observed mood-congruent negative bias was best predicted by spatial perception. The findings of this study provide insights into the cognitive processes underlying the interpersonal difficulties experienced by people in a sad mood, which may be predisposing factors in the development of clinical depression.

  17. Emotional availability, understanding emotions, and recognition of facial emotions in obese mothers with young children.

    Science.gov (United States)

    Bergmann, Sarah; von Klitzing, Kai; Keitel-Korndörfer, Anja; Wendt, Verena; Grube, Matthias; Herpertz, Sarah; Schütz, Astrid; Klein, Annette M

    2016-01-01

    Recent research has identified mother-child relationships of low quality as possible risk factors for childhood obesity. However, it remains open how mothers' own obesity influences the quality of mother-child interaction, and particularly emotional availability (EA). Also unclear is the influence of maternal emotional competencies, i.e. understanding emotions and recognizing facial emotions. This study aimed to (1) investigate differences between obese and normal-weight mothers regarding mother-child EA, maternal understanding emotions and recognition of facial emotions, and (2) explore how maternal emotional competencies and maternal weight interact with each other in predicting EA. A better understanding of these associations could inform strategies of obesity prevention especially in children at risk. We assessed EA, understanding emotions and recognition of facial emotions in 73 obese versus 73 normal-weight mothers, and their children aged 6 to 47 months (mean child age = 24.49 months; 80 females). Obese mothers showed lower EA and poorer understanding of emotions. Mothers' normal weight and their ability to understand emotions were positively associated with EA. The ability to recognize facial emotions was positively associated with EA in obese but not in normal-weight mothers. Maternal weight status indirectly influenced EA through its effect on understanding emotions. Maternal emotional competencies may play an important role for establishing high EA in interaction with the child. Children of obese mothers experience lower EA, which may contribute to overweight development. We suggest including elements that aim to improve maternal emotional competencies and mother-child EA in prevention or intervention programmes targeting childhood obesity. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Facial emotion perception impairments in schizophrenia patients with comorbid antisocial personality disorder.

    Science.gov (United States)

    Tang, Dorothy Y Y; Liu, Amy C Y; Lui, Simon S Y; Lam, Bess Y H; Siu, Bonnie W M; Lee, Tatia M C; Cheung, Eric F C

    2016-02-28

    Impairment in facial emotion perception is believed to be associated with aggression. Schizophrenia patients with antisocial features are more impaired in facial emotion perception than their counterparts without these features. However, previous studies did not define the comorbidity of antisocial personality disorder (ASPD) using stringent criteria. We recruited 30 participants with dual diagnoses of ASPD and schizophrenia, 30 participants with schizophrenia and 30 controls. We employed the Facial Emotional Recognition paradigm to measure facial emotion perception, and administered a battery of neurocognitive tests. The Life History of Aggression scale was used. ANOVAs and ANCOVAs were conducted to examine group differences in facial emotion perception, and control for the effect of other neurocognitive dysfunctions on facial emotion perception. Correlational analyses were conducted to examine the association between facial emotion perception and aggression. Patients with dual diagnoses performed worst in facial emotion perception among the three groups. The group differences in facial emotion perception remained significant, even after other neurocognitive impairments were controlled for. Severity of aggression was correlated with impairment in perceiving negative-valenced facial emotions in patients with dual diagnoses. Our findings support the presence of facial emotion perception impairment and its association with aggression in schizophrenia patients with comorbid ASPD. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. More emotional facial expressions during episodic than during semantic autobiographical retrieval.

    Science.gov (United States)

    El Haj, Mohamad; Antoine, Pascal; Nandrino, Jean Louis

    2016-04-01

    There is a substantial body of research on the relationship between emotion and autobiographical memory. Using facial analysis software, our study addressed this relationship by investigating basic emotional facial expressions that may be detected during autobiographical recall. Participants were asked to retrieve 3 autobiographical memories, each of which was triggered by one of the following cue words: happy, sad, and city. The autobiographical recall was analyzed with facial analysis software that detects and classifies basic emotional expressions. Analyses showed that emotional cues triggered the corresponding basic facial expressions (i.e., happy facial expressions for memories cued by happy). Furthermore, we dissociated episodic and semantic retrieval, observing more emotional facial expressions during episodic than during semantic retrieval, regardless of the emotional valence of the cues. Our study provides insight into the facial expressions that are associated with emotional autobiographical memory. It also highlights an ecological tool to reveal physiological changes that are associated with emotion and memory.
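    For concreteness, the sketch below shows one plausible way to summarize the per-frame output of such a classifier over a recall period; the frame labels and the helper function are invented for illustration, since the abstract does not specify the software's interface.

    ```python
    # Hypothetical per-frame summary; labels and function are invented.
    from collections import Counter

    def expression_profile(frame_labels):
        """Proportion of video frames assigned to each basic emotion."""
        counts = Counter(frame_labels)
        return {emotion: n / len(frame_labels) for emotion, n in counts.items()}

    episodic_frames = ["happy", "happy", "neutral", "happy", "sad"]  # toy data
    print(expression_profile(episodic_frames))
    # {'happy': 0.6, 'neutral': 0.2, 'sad': 0.2}
    ```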

  20. Mirroring Facial Expressions and Emotions in Dyadic Conversations

    DEFF Research Database (Denmark)

    Navarretta, Costanza

    2016-01-01

    This paper presents an investigation of mirroring facial expressions and the emotions which they convey in dyadic naturally occurring first encounters. Mirroring facial expressions are a common phenomenon in face-to-face interactions, and they are due to the mirror neuron system which has been...... and overlapping facial expressions are very frequent. In this study, we want to determine whether the overlapping facial expressions are mirrored or are otherwise correlated in the encounters, and to what extent mirroring facial expressions convey the same emotion. The results of our study show that the majority...... of smiles and laughs, and one fifth of the occurrences of raised eyebrows are mirrored in the data. Moreover some facial traits in co-occurring expressions co-occur more often than it would be expected by chance. Finally, amusement, and to a lesser extent friendliness, are often emotions shared by both...

  1. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    Directory of Open Access Journals (Sweden)

    Tiziana Quarto

    The human ability of identifying, processing and regulating emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and the SMC activity during social behavior is region- and emotion-specific.

  2. Sex Hormones and Processing of Facial Expressions of Emotion: A Systematic Literature Review

    Directory of Open Access Journals (Sweden)

    Flávia L. Osório

    2018-04-01

    Background: We systematically reviewed the literature to determine the influence of sex hormones on facial emotion processing (FEP) in healthy women at different phases of life. Methods: Searches were performed in PubMed, Web of Science, PsycINFO, LILACS, and SciELO. Twenty-seven articles were included in the review and allocated into five different categories according to their objectives and sample characteristics (menstrual cycle, oral contraceptives, pregnancy/postpartum, testosterone, and progesterone). Results: Despite the limited number of studies in some categories and the existence of inconsistencies in the results of interest, the findings of the review suggest that FEP may be enhanced during the follicular phase. Studies with women taking oral contraceptives showed reduced recognition accuracy and decreased responsiveness of different brain structures during FEP tasks. Studies with pregnant women and women in the postpartum showed that hormonal changes are associated with alterations in FEP and in brain functioning that could indicate the existence of a hypervigilant state in new and future mothers. Exogenous administration of testosterone enhanced the recognition of threatening facial expressions and the activation of brain structures involved in the processing of emotional stimuli. Conclusions: We conclude that sex hormones affect FEP in women, which may have an impact on adaptive processes of the species and on the onset of mood symptoms associated with premenstrual syndrome.

  3. Psychophysical measures of sensitivity to facial expression of emotion.

    Directory of Open Access Journals (Sweden)

    Michelle Marneweck

    2013-02-01

    We report the development of two simple, objective, psychophysical measures of the ability to discriminate facial expressions of emotion that vary in intensity from a neutral facial expression and to discriminate between varying intensities of emotional facial expression. The stimuli were created by morphing photographs of models expressing four basic emotions (anger, disgust, happiness and sadness) with neutral expressions. Psychometric functions were obtained for 15 healthy young adults using the Method of Constant Stimuli with a two-interval forced-choice procedure. Individual data points were fitted by Quick functions for each task and each emotion, allowing estimates of absolute thresholds and slopes. The tasks give objective and sensitive measures of the basic perceptual abilities required for perceiving and interpreting emotional facial expressions.
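    A minimal sketch of the kind of fit described above, assuming 50% chance performance in the two-interval forced-choice procedure; the morph intensities, accuracies, and starting values are invented for illustration.

    ```python
    # Hypothetical Quick-function fit; intensities and accuracies are invented.
    import numpy as np
    from scipy.optimize import curve_fit

    def quick_2ifc(x, alpha, beta):
        # Base-2 Quick function mapped onto 2IFC performance (chance = 0.5);
        # alpha is the intensity at 75% correct, beta the slope parameter.
        return 0.5 + 0.5 * (1.0 - 2.0 ** (-(x / alpha) ** beta))

    intensity = np.array([0.05, 0.10, 0.20, 0.40, 0.60, 0.80])  # morph level
    p_correct = np.array([0.52, 0.60, 0.74, 0.90, 0.97, 0.99])

    (alpha, beta), _ = curve_fit(quick_2ifc, intensity, p_correct, p0=[0.2, 2.0])
    print(f"threshold alpha = {alpha:.3f}, slope beta = {beta:.2f}")
    ```

    Under this parameterization, alpha plays the role of the absolute threshold and beta the slope, the two quantities the abstract says were estimated per task and per emotion.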

  4. More Pronounced Deficits in Facial Emotion Recognition for Schizophrenia than Bipolar Disorder

    Science.gov (United States)

    Goghari, Vina M; Sponheim, Scott R

    2012-01-01

    Schizophrenia and bipolar disorder are typically separated in diagnostic systems. Behavioural, cognitive, and brain abnormalities associated with each disorder nonetheless overlap. We evaluated the diagnostic specificity of facial emotion recognition deficits in schizophrenia and bipolar disorder to determine whether select aspects of emotion recognition differed for the two disorders. The investigation used an experimental task that included the same facial images in an emotion recognition condition and an age recognition condition (to control for processes associated with general face recognition) in 27 schizophrenia patients, 16 bipolar I patients, and 30 controls. Schizophrenia and bipolar patients exhibited both shared and distinct aspects of facial emotion recognition deficits. Schizophrenia patients had deficits in recognizing angry facial expressions compared to healthy controls and bipolar patients. Compared to control participants, both schizophrenia and bipolar patients were more likely to mislabel facial expressions of anger as fear. Given that schizophrenia patients exhibited a deficit in emotion recognition for angry faces, which did not appear due to generalized perceptual and cognitive dysfunction, improving recognition of threat-related expression may be an important intervention target to improve social functioning in schizophrenia. PMID:23218816

  5. Testosterone reactivity to facial display of emotions in men and women.

    Science.gov (United States)

    Zilioli, Samuele; Caldbick, Evan; Watson, Neil V

    2014-05-01

    Previous studies have examined testosterone's role in regulating the processing of facial displays of emotions (FDEs). However, the reciprocal process - the influence of FDEs, an evolutionarily ancient and potent class of social signals, on the secretion of testosterone - has not yet been studied. To address this gap, we examined the effects of emotional content and sex of facial stimuli in modulating endogenous testosterone fluctuations, as well as sex differences in the endocrine responses to faces. One hundred and sixty-four young healthy men and women were exposed, in a between-subjects design, to happy or angry same-sex or opposite-sex facial expressions. Results showed that in both men (n=85) and women (n=79), extended exposure to faces of the opposite sex, regardless of their apparent emotional content, was accompanied by an accumulation in salivary testosterone when compared to exposure to faces of the same sex. Furthermore, testosterone change in women exposed to angry expressions was greater than testosterone change in women exposed to happy expressions. These results add emotional facial stimuli to the collection of social signals that modulate endocrine status, and are discussed with regard to the evolutionary roles of testosterone. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Processing of Facial Expressions of Emotions by Adults with Down Syndrome and Moderate Intellectual Disability

    Science.gov (United States)

    Carvajal, Fernando; Fernandez-Alcaraz, Camino; Rueda, Maria; Sarrion, Louise

    2012-01-01

    The processing of facial expressions of emotions by 23 adults with Down syndrome and moderate intellectual disability was compared with that of adults with intellectual disability of other etiologies (24 matched in cognitive level and 26 with mild intellectual disability). Each participant performed 4 tasks of the Florida Affect Battery and an…

  7. The human amygdala parametrically encodes the intensity of specific facial emotions and their categorical ambiguity

    Science.gov (United States)

    Wang, Shuo; Yu, Rongjun; Tyszka, J. Michael; Zhen, Shanshan; Kovach, Christopher; Sun, Sai; Huang, Yi; Hurlemann, Rene; Ross, Ian B.; Chung, Jeffrey M.; Mamelak, Adam N.; Adolphs, Ralph; Rutishauser, Ueli

    2017-01-01

    The human amygdala is a key structure for processing emotional facial expressions, but it remains unclear what aspects of emotion are processed. We investigated this question with three different approaches: behavioural analysis of 3 amygdala lesion patients, neuroimaging of 19 healthy adults, and single-neuron recordings in 9 neurosurgical patients. The lesion patients showed a shift in behavioural sensitivity to fear, and amygdala BOLD responses were modulated by both fear and emotion ambiguity (the uncertainty that a facial expression is categorized as fearful or happy). We found two populations of neurons: one whose response correlated with the increasing degree of fear or happiness, and a second whose response primarily decreased as a linear function of emotion ambiguity. Together, our results indicate that the human amygdala processes both the degree of emotion in facial expressions and the categorical ambiguity of the emotion shown, and that these two aspects of amygdala processing can be most clearly distinguished at the level of single neurons. PMID:28429707
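    The distinction between intensity-coding and ambiguity-coding neurons can be made concrete with simulated data: regress each neuron's firing rate on both variables and compare the fits. Everything below (rates, noise level, the R-squared helper) is invented for illustration and does not reproduce the paper's analysis pipeline.

    ```python
    # Simulated single-neuron analysis; all numbers are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    fear_level = np.linspace(0.0, 1.0, 40)            # morph: happy -> fearful
    ambiguity = 1.0 - np.abs(2.0 * fear_level - 1.0)  # maximal at the 50/50 morph

    # A simulated "ambiguity neuron": rate decreases linearly with ambiguity.
    rate = 20.0 - 8.0 * ambiguity + rng.normal(0.0, 1.0, fear_level.size)

    def r_squared(x, y):
        # Variance explained by a straight-line fit of y on x.
        slope, intercept = np.polyfit(x, y, 1)
        return 1.0 - np.var(y - (slope * x + intercept)) / np.var(y)

    print("R^2 vs fear level:", round(r_squared(fear_level, rate), 2))  # near 0
    print("R^2 vs ambiguity: ", round(r_squared(ambiguity, rate), 2))   # near 1
    ```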

  8. Implicit attentional bias for facial emotion in dissociative seizures: Additional evidence.

    Science.gov (United States)

    Pick, Susannah; Mellers, John D C; Goldstein, Laura H

    2018-03-01

    This study sought to extend knowledge about the previously reported preconscious attentional bias (AB) for facial emotion in patients with dissociative seizures (DS) by exploring whether the finding could be replicated, while controlling for concurrent anxiety, depression, and potentially relevant cognitive impairments. Patients diagnosed with DS (n=38) were compared with healthy controls (n=43) on a pictorial emotional Stroop test, in which backwardly masked emotional faces (angry, happy, neutral) were processed implicitly. The group with DS displayed a significantly greater AB to facial emotion relative to controls; however, the bias was not specific to negative or positive emotions. The group effect could not be explained by performance on standardized cognitive tests or self-reported depression/anxiety. The study provides additional evidence of a disproportionate and automatic allocation of attention to facial affect in patients with DS, including both positive and negative facial expressions. Such a tendency could act as a predisposing factor for developing DS initially, or may contribute to triggering individuals' seizures on an ongoing basis. Psychological interventions such as Cognitive Behavioral Therapy (CBT) or AB modification might be suitable approaches to target this bias in clinical practice. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Facial Emotions Recognition using Gabor Transform and Facial Animation Parameters with Neural Networks

    Science.gov (United States)

    Harit, Aditya; Joshi, J. C., Col; Gupta, K. K.

    2018-03-01

    The paper proposes an automatic facial emotion recognition algorithm comprising two main components: feature extraction and expression recognition. The algorithm applies a Gabor filter bank at fiducial points to extract facial expression features. The resulting magnitudes of the Gabor transforms, along with 14 chosen FAPs (Facial Animation Parameters), compose the feature space. There are two stages: a training phase and a recognition phase. In the training phase, the system classifies all training expressions into 6 classes, one for each of the 6 emotions considered. In the recognition phase, it recognizes the emotion by applying the Gabor bank to a face image, finding the fiducial points, and feeding the resulting features to the trained neural architecture.
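    The sketch below illustrates the general technique named above, Gabor magnitudes sampled at fiducial points; the kernel parameters, image, and fiducial coordinates are placeholders rather than the authors' settings, and the FAP extraction and neural-network stages are omitted.

    ```python
    # Hypothetical Gabor-magnitude features; parameters and points are invented.
    import numpy as np

    def gabor_kernel(size, wavelength, theta, sigma=4.0):
        """Complex Gabor kernel of side `size` at orientation `theta` (radians)."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
        return envelope * np.exp(2j * np.pi * xr / wavelength)

    def gabor_features(image, points, wavelengths=(4, 8), n_orientations=4):
        """Magnitude of each Gabor response sampled at each fiducial point."""
        feats = []
        for lam in wavelengths:
            for k in range(n_orientations):
                kernel = gabor_kernel(15, lam, k * np.pi / n_orientations)
                for r, c in points:
                    patch = image[r - 7:r + 8, c - 7:c + 8]  # 15x15 window
                    feats.append(abs(np.sum(patch * kernel)))
        return np.array(feats)

    face = np.random.rand(128, 128)               # stand-in grayscale face
    fiducials = [(40, 45), (40, 83), (70, 64)]    # e.g., eye corners, nose tip
    print(gabor_features(face, fiducials).shape)  # (24,) = 2 x 4 x 3 features
    ```

    In the pipeline described above, such magnitudes would be concatenated with the 14 FAP values before being passed to the classifier.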

  10. Development of Facial Emotion Recognition in Childhood : Age-related Differences in a Shortened Version of the Facial Expressions of Emotion - Stimuli and Tests

    NARCIS (Netherlands)

    Coenen, Maraike; Aarnoudse, Ceciel; Huitema, Rients; Braams, Olga; Veenstra, Wencke S.

    2013-01-01

    Introduction: Facial emotion recognition is essential for social interaction. The development of emotion recognition abilities is not yet entirely understood (Tonks et al. 2007). Facial emotion recognition emerges gradually, with happiness recognized earliest (Herba & Phillips, 2004). The recognition

  11. A Facial Control Method Using Emotional Parameters in Sensibility Robot

    Science.gov (United States)

    Shibata, Hiroshi; Kanoh, Masayoshi; Kato, Shohei; Kunitachi, Tsutomu; Itoh, Hidenori

    The “Ifbot” robot communicates with people by considering its own “emotions”. Ifbot has many facial expressions with which to communicate enjoyment. These are used to express its internal emotions, purposes, reactions caused by external stimuli, and entertainment such as singing songs. All these facial expressions were developed manually by designers. Under this approach, every facial motion Ifbot can express must be designed by hand, which is not realistic. We have therefore developed a system which converts Ifbot's emotions to facial expressions automatically. In this paper, we propose a method for creating Ifbot's facial expressions from parameters, called emotional parameters, which handle its internal emotions computationally.
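    One way to read the proposed method is as a mapping from an internal emotion-parameter vector to facial-actuator intensities. The sketch below uses an invented emotion set, actuator names, and blend weights; Ifbot's actual parameterization is not given in the abstract.

    ```python
    # Hypothetical emotion-to-actuator blend; names and weights are invented.
    import numpy as np

    EMOTIONS = ["joy", "sadness", "anger"]
    ACTUATORS = ["brow_lower", "mouth_corner_up", "eyelid_close"]

    # Rows: emotions; columns: actuator weights in [0, 1].
    BLEND = np.array([
        [0.0, 0.9, 0.1],   # joy: strong smile, slight eye narrowing
        [0.2, 0.0, 0.6],   # sadness: drooping eyelids
        [0.9, 0.0, 0.2],   # anger: lowered brows
    ])

    def facial_command(emotion_params):
        """Blend emotion intensities (0-1) into per-actuator commands."""
        e = np.array([emotion_params.get(name, 0.0) for name in EMOTIONS])
        return dict(zip(ACTUATORS, np.clip(e @ BLEND, 0.0, 1.0)))

    print(facial_command({"joy": 0.7, "anger": 0.2}))
    ```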

  12. Facial Emotion Recognition in Schizophrenia: The Impact of Gender

    OpenAIRE

    Erol, Almıla; Putgul, Gulperi; Kosger, Ferdi; Ersoy, Bilal

    2013-01-01

    Objective: Previous studies reported gender differences for facial emotion recognition in healthy people, with women performing better than men. The few studies that examined gender differences for facial emotion recognition in schizophrenia reported inconsistent findings. The aim of this study is to investigate gender differences in facial emotion identification and discrimination abilities in patients with schizophrenia. Methods: 35 female and 35 male patients with schizophrenia, along with 3...

  13. Emotional facial expressions evoke faster orienting responses, but weaker emotional responses at neural and behavioural levels compared to scenes: A simultaneous EEG and facial EMG study.

    Science.gov (United States)

    Mavratzakis, Aimee; Herbert, Cornelia; Walla, Peter

    2016-01-01

    In the current study, electroencephalography (EEG) was recorded simultaneously with facial electromyography (fEMG) to determine whether emotional faces and emotional scenes are processed differently at the neural level. In addition, it was investigated whether these differences can be observed at the behavioural level via spontaneous facial muscle activity. Emotional content of the stimuli did not affect early P1 activity. Emotional faces elicited enhanced amplitudes of the face-sensitive N170 component, while its counterpart, the scene-related N100, was not sensitive to the emotional content of scenes. At 220-280 ms, the early posterior negativity (EPN) was enhanced only slightly for fearful as compared to neutral or happy faces. However, its amplitudes were significantly enhanced during processing of scenes with positive content, particularly over the right hemisphere. Scenes of positive content also elicited enhanced spontaneous zygomatic activity from 500-750 ms onwards, while happy faces elicited no such changes. Contrastingly, both fearful faces and negative scenes elicited enhanced spontaneous corrugator activity at 500-750 ms after stimulus onset. However, relative to baseline, EMG changes occurred earlier for faces (250 ms) than for scenes (500 ms), whereas for scenes the activity changes were more pronounced over the whole viewing period. Taking all effects into account, the data suggest that emotional facial expressions evoke faster attentional orienting, but weaker affective neural activity and emotional behavioural responses, compared to emotional scenes. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Neuroticism and facial emotion recognition in healthy adults

    NARCIS (Netherlands)

    Andric, Sanja; Maric, Nadja P.; Knezevic, Goran; Mihaljevic, Marina; Mirjanic, Tijana; Velthorst, Eva; van Os, Jim

    2016-01-01

    The aim of the present study was to examine whether healthy individuals with higher levels of neuroticism, a robust independent predictor of psychopathology, exhibit altered facial emotion recognition performance. Facial emotion recognition accuracy was investigated in 104 healthy adults using the

  15. Spontaneous facial expressions of emotion of congenitally and noncongenitally blind individuals.

    Science.gov (United States)

    Matsumoto, David; Willingham, Bob

    2009-01-01

    The study of the spontaneous expressions of blind individuals offers a unique opportunity to understand basic processes concerning the emergence and source of facial expressions of emotion. In this study, the authors compared the expressions of congenitally and noncongenitally blind athletes in the 2004 Paralympic Games with each other and with those produced by sighted athletes in the 2004 Olympic Games. The authors also examined how expressions change from 1 context to another. There were no differences between congenitally blind, noncongenitally blind, and sighted athletes, either on the level of individual facial actions or in facial emotion configurations. Blind athletes did produce more overall facial activity, but these were isolated to head and eye movements. The blind athletes' expressions differentiated whether they had won or lost a medal match at 3 different points in time, and there were no cultural differences in expression. These findings provide compelling evidence that the production of spontaneous facial expressions of emotion is not dependent on observational learning but simultaneously demonstrates a learned component to the social management of expressions, even among blind individuals.

  16. Facial emotion recognition in Parkinson's disease: A review and new hypotheses

    Science.gov (United States)

    Vérin, Marc; Sauleau, Paul; Grandjean, Didier

    2018-01-01

    Parkinson's disease is a neurodegenerative disorder classically characterized by motor symptoms. Among them, hypomimia affects facial expressiveness and social communication and has a highly negative impact on patients' and relatives' quality of life. Patients also frequently experience nonmotor symptoms, including emotional-processing impairments, leading to difficulty in recognizing emotions from faces. Aside from its theoretical importance, understanding the disruption of facial emotion recognition in PD is crucial for improving quality of life for both patients and caregivers, as this impairment is associated with heightened interpersonal difficulties. However, studies assessing abilities in recognizing facial emotions in PD still report contradictory outcomes. The origins of this inconsistency are unclear, and several questions (regarding the role of dopamine replacement therapy or the possible consequences of hypomimia) remain unanswered. We therefore undertook a fresh review of relevant articles focusing on facial emotion recognition in PD to deepen current understanding of this nonmotor feature, exploring multiple significant potential confounding factors, both clinical and methodological, and discussing probable pathophysiological mechanisms. This led us to examine recent proposals about the role of basal ganglia-based circuits in emotion and to consider the involvement of facial mimicry in this deficit from the perspective of embodied simulation theory. We believe our findings will inform clinical practice and increase fundamental knowledge, particularly in relation to potential embodied emotion impairment in PD. © 2018 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society. PMID:29473661

  17. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load.

    Science.gov (United States)

    Pecchinenda, Anna; Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information.

  18. Facial decoding in schizophrenia is underpinned by basic visual processing impairments.

    Science.gov (United States)

    Belge, Jan-Baptist; Maurage, Pierre; Mangelinckx, Camille; Leleux, Dominique; Delatte, Benoît; Constant, Eric

    2017-09-01

    Schizophrenia is associated with a strong deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific to emotions or due to a more general impairment in any type of facial processing. This study was designed to clarify this issue. Thirty patients suffering from schizophrenia and 30 matched healthy controls performed several tasks evaluating the recognition of both changeable (i.e. eye orientation and emotions) and stable (i.e. gender, age) facial characteristics. Accuracy and reaction times were recorded. Schizophrenia patients presented a performance deficit (accuracy and reaction times) in the perception of both changeable and stable aspects of faces, without any specific deficit for emotional decoding. Our results demonstrate a generalized face recognition deficit in schizophrenia patients, probably caused by a deficit in basic visual processing. The EFE decoding deficit thus appears not to be specific to emotion processing but to be at least partly related to a generalized deficit in lower-level visual processing, occurring before the stage of emotion processing and underlying more complex cognitive dysfunctions. These findings should encourage future investigations to explore the neurophysiologic background of these generalized perceptual deficits, and stimulate a clinical approach focusing on more basic visual processing. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  19. [Emotional intelligence and oscillatory responses to emotional facial expressions].

    Science.gov (United States)

    Kniazev, G G; Mitrofanova, L G; Bocharov, A V

    2013-01-01

    Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18-30 years. Participants were instructed to evaluate the emotional expression (angry, happy, or neutral) of each presented face on an analog scale ranging from -100 (very hostile) to +100 (very friendly). Participants with high emotional intelligence (EI) were found to be more sensitive to the emotional content of the stimuli. This showed up both in their subjective evaluations of the stimuli and in stronger EEG theta synchronization at an earlier processing stage (between 100 and 500 ms after face presentation). Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500-870 ms), event-related theta synchronization in high-EI subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon the presentation of angry faces. This suggests the existence of a mechanism that can selectively enhance positive emotions and reduce negative ones.
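
    The abstract reports effects as event-related theta synchronization but does not reproduce the measure. A standard definition, following Pfurtscheller's event-related (de)synchronization method (an assumption here, since the authors do not state their exact formula), expresses band power change relative to a pre-stimulus reference interval:

    $$ \mathrm{ERS}(\%) = \frac{A - R}{R} \times 100 $$

    where $A$ is theta-band power in the post-stimulus analysis window (e.g., 100-500 ms) and $R$ is theta-band power in the reference interval; positive values indicate synchronization, negative values desynchronization.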

  20. Processing environmental stimuli in paranoid schizophrenia: recognizing facial emotions and performing executive functions.

    Science.gov (United States)

    Yu, Shao Hua; Zhu, Jun Peng; Xu, You; Zheng, Lei Lei; Chai, Hao; He, Wei; Liu, Wei Bo; Li, Hui Chun; Wang, Wei

    2012-12-01

    To study the contribution of executive function to abnormal recognition of facial expressions of emotion in schizophrenia patients, 88 paranoid schizophrenia patients and 75 healthy volunteers were assessed with the Japanese and Caucasian Facial Expressions of Emotion (JACFEE), the Wisconsin Card Sorting Test (WCST), the Positive and Negative Symptom Scale, and the Hamilton Anxiety and Depression Scales. Patients scored higher on the Positive and Negative Symptom Scale and the Hamilton Anxiety and Depression Scales, and displayed lower JACFEE recognition accuracies and poorer WCST performances. In patients, JACFEE recognition accuracy for contempt and disgust was negatively correlated with the negative symptom scale score, recognition accuracy for fear was positively correlated with the positive symptom scale score, and recognition accuracy for surprise was negatively correlated with the general psychopathology score. Moreover, WCST performance predicted JACFEE recognition accuracy for contempt, disgust, and sadness in patients, and perseverative errors negatively predicted recognition accuracy for sadness in healthy volunteers. JACFEE recognition accuracy for sadness, in turn, predicted the number of WCST categories completed in paranoid schizophrenia patients. Recognition accuracy for social/moral emotions such as contempt, disgust, and sadness is thus related to executive function in paranoid schizophrenia patients, especially for sadness. Copyright © 2012 The Editorial Board of Biomedical and Environmental Sciences. Published by Elsevier B.V. All rights reserved.

  1. Facial Emotion Recognition Using Context Based Multimodal Approach

    Directory of Open Access Journals (Sweden)

    Priya Metri

    2011-12-01

    Emotions play a crucial role in person-to-person interaction. In recent years, there has been growing interest in improving all aspects of interaction between humans and computers. The ability to understand human emotions, especially by observing facial expressions, is desirable for the computer in several applications. This paper explores a way of human-computer interaction that enables the computer to be more aware of the user's emotional expressions: we present an approach to emotion recognition from facial expression, hand, and body posture. Our model uses a multimodal emotion recognition system with two separate classifiers, one for facial expression recognition and one for hand and body posture recognition, whose outputs are then combined by a third classifier that gives the resulting emotion. The multimodal system gives more accurate results than a single-channel or bimodal system.

  2. Differential judgement of static facial expressions of emotions in three cultures.

    Science.gov (United States)

    Huang, Y; Tang, S; Helmeste, D; Shioiri, T; Someya, T

    2001-10-01

    Judging facial expressions of emotions has important clinical value in the assessment of psychiatric patients. Judging facial emotional expressions in foreign patients, however, is not always easy. Controversy has existed in previous reports on cultural differences in identifying static facial expressions of emotions. While it has been argued that emotional expressions on the face are universally recognized, the experimental data obtained have not been fully supportive. Using data reported in the literature, our previous pilot study showed that Japanese viewers interpreted many emotional expressions differently from USA viewers viewing the same emotions. To explore such discrepancies further, we conducted the same experiments with Chinese subjects residing in Beijing. The data showed that, like the Japanese viewers, Chinese viewers also judged many static facial emotional expressions differently from USA viewers. The combined results of the Chinese and Japanese experiments suggest a major cross-cultural difference between American and Asian viewers in identifying some static facial emotional expressions, particularly when the posed emotion has negative connotations. The results have important implications for cross-cultural communication when facial emotional expressions are presented as static images.

  3. What do facial expressions of emotion express in young children? The relationship between facial display and EMG measures

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2014-04-01

    The present paper explored the relationship between emotional facial responses and electromyographic modulation in children observing facial expressions of emotion. Facial responsiveness (evaluated by arousal and valence ratings) and psychophysiological correlates (facial electromyography, EMG) were analyzed while children looked at six facial expressions of emotion (happiness, anger, fear, sadness, surprise, and disgust). For the EMG measure, corrugator and zygomatic muscle activity was monitored in response to the different emotion types. ANOVAs showed differences in both EMG and facial responses across subjects as a function of emotion. Specifically, some emotions (such as happiness, anger, and fear) were well expressed by all subjects in terms of high arousal, whereas others (such as sadness) elicited lower arousal. Zygomatic activity increased mainly for happiness, whereas corrugator activity increased mainly for anger, fear, and surprise. More generally, EMG and facial behavior were highly correlated with each other, showing a “mirror” effect with respect to the observed faces.

  4. Facial expressions of emotion and the course of conjugal bereavement.

    Science.gov (United States)

    Bonanno, G A; Keltner, D

    1997-02-01

    The common assumption that emotional expression mediates the course of bereavement is tested. Competing hypotheses about the direction of mediation were formulated from the grief work and social-functional accounts of emotional expression. Facial expressions of emotion in conjugally bereaved adults were coded at 6 months post-loss as they described their relationship with the deceased; grief and perceived health were measured at 6, 14, and 25 months. Facial expressions of negative emotion, in particular anger, predicted increased grief at 14 months and poorer perceived health through 25 months. Facial expressions of positive emotion predicted decreased grief through 25 months and a positive but nonsignificant relation to perceived health. Predictive relations between negative and positive emotional expression persisted when initial levels of self-reported emotion, grief, and health were statistically controlled, demonstrating the mediating role of facial expressions of emotion in adjustment to conjugal loss. Theoretical and clinical implications are discussed.

  5. Faces and bodies: perception and mimicry of emotionally congruent and incongruent facial and bodily expressions

    Directory of Open Access Journals (Sweden)

    Mariska eKret

    2013-02-01

    Traditional emotion theories stress the importance of the face in the expression of emotions, but bodily expressions are becoming increasingly important. Here we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals, and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emotionally congruent and incongruent face-body compounds. Participants' fixations and pupil size were recorded with eye-tracking equipment, and their facial reactions were measured with electromyography (EMG). The behavioral results support our prediction that recognition of a facial expression is improved in the context of a matching posture and, importantly, also vice versa. From their facial expressions, it appeared that observers reacted with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus activity) to happy facial expressions. As predicted, angry and fearful cues from the face or the body attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion to facial expressions, and that congruency between the emotional signals from the face and body improves recognition of the emotion.

  6. Psychometric challenges and proposed solutions when scoring facial emotion expression codes

    OpenAIRE

    Olderbak, Sally; Hildebrandt, Andrea; Pinkpank, Thomas; Sommer, Werner; Wilhelm, Oliver

    2013-01-01

    Coding of facial emotion expressions is increasingly performed by automated emotion expression scoring software; however, there is limited discussion on how best to score the resulting codes. We present a discussion of facial emotion expression theories and a review of contemporary emotion expression coding methodology. We highlight methodological challenges pertinent to scoring software-coded facial emotion expression codes and present important psychometric research questions centered on co...

  7. [Recognition of facial expression of emotions in Parkinson's disease: a theoretical review].

    Science.gov (United States)

    Alonso-Recio, L; Serrano-Rodriguez, J M; Carvajal-Molina, F; Loeches-Alonso, A; Martin-Plasencia, P

    2012-04-16

    Emotional facial expression is a basic guide during social interaction; therefore, alterations in its expression or recognition are important limitations for communication. Our aim was to examine facial expression recognition abilities and their possible impairment in Parkinson's disease. First, we review the studies on this topic, which have not reported entirely consistent results. Second, we analyze the factors that may explain these discrepancies; in particular, as a third objective, we consider the relationship between emotion recognition problems and the cognitive impairment associated with the disease. Finally, we propose alternative strategies for designing studies that could clarify the state of these abilities in Parkinson's disease. Most studies suggest deficits in facial expression recognition, especially for expressions with negative emotional content. However, it is possible that these alterations are related to those that also appear in the course of the disease in other perceptual and executive processes. To make progress on this issue, we consider it necessary to design emotion recognition studies that differentially implicate executive or visuospatial processes, and/or that contrast cognitive abilities with facial expressions and non-emotional stimuli. Clarifying the status of these abilities, as well as increasing our knowledge of the functional consequences of the brain damage characteristic of the disease, may indicate whether special attention should be paid to their rehabilitation within intervention programs.

  8. Estimation of human emotions using thermal facial information

    Science.gov (United States)

    Nguyen, Hung; Kotani, Kazunori; Chen, Fan; Le, Bac

    2014-01-01

    In recent years, research on human emotion estimation using thermal infrared (IR) imagery has attracted many researchers due to its invariance to visible illumination changes. Although infrared imagery is superior to visible imagery in its invariance to illumination changes and appearance differences, it has difficulty handling eyeglasses, which are opaque in the thermal infrared spectrum. As a result, when infrared imagery is used for the analysis of facial information, the eyeglass regions appear dark and thermal information about the eyes is lost. We propose a temperature space method that corrects for the effect of eyeglasses using the thermal information in neighboring facial regions, and then use Principal Component Analysis (PCA), the Eigen-space Method based on Class-features (EMC), and a combined PCA-EMC method to classify human emotions from the corrected thermal images. We collected the Kotani Thermal Facial Emotion (KTFE) database and performed experiments, which show an improved accuracy rate in estimating human emotions.
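
    To illustrate the classification stage, the following minimal sketch pairs PCA with a nearest-class-centroid rule as a simplified stand-in for the EMC step; the data, image size, and class count are placeholders, not the KTFE setup.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import NearestCentroid
    from sklearn.pipeline import make_pipeline

    # X: flattened, eyeglass-corrected thermal face images (n_samples x n_pixels);
    # y: emotion labels. Random stand-ins here, not the KTFE data.
    rng = np.random.default_rng(0)
    X = rng.random((120, 64 * 64))
    y = rng.integers(0, 5, size=120)

    # PCA reduces dimensionality; the nearest-centroid rule is a simplified
    # surrogate for the class-feature eigenspace (EMC) classifier.
    clf = make_pipeline(PCA(n_components=40), NearestCentroid())
    clf.fit(X, y)
    print(clf.predict(X[:3]))
    ```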

  9. A new look at emotion perception: Concepts speed and shape facial emotion recognition.

    Science.gov (United States)

    Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil

    2015-10-01

    Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels-which correspond to discrete emotion concepts-affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia-who have difficulty labeling their own emotions-struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology. (c) 2015 APA, all rights reserved.

  10. Theory of mind as a mediator of reasoning and facial emotion recognition: findings from 200 healthy people.

    Science.gov (United States)

    Lee, Seul Bee; Koo, Se Jun; Song, Yun Young; Lee, Mi Kyung; Jeong, Yu-Jin; Kwon, Catherine; Park, Kyoung Ri; Park, Jin Young; Kang, Jee In; Lee, Eun; An, Suk Kyoon

    2014-04-01

    It was proposed that the ability to recognize facial emotions is closely related to complex neurocognitive processes and/or skills related to theory of mind (ToM). This study examines whether ToM skills mediate the relationship between higher neurocognitive functions, such as reasoning ability, and facial emotion recognition. A total of 200 healthy subjects (101 males, 99 females) were recruited. Facial emotion recognition was measured through the use of 64 facial emotional stimuli that were selected from photographs from the Korean Facial Expressions of Emotion (KOFEE). Participants were requested to complete the Theory of Mind Picture Stories task and Standard Progressive Matrices (SPM). Multiple regression analysis showed that the SPM score (t=3.19, p=0.002, β=0.22) and the overall ToM score (t=2.56, p=0.011, β=0.18) were primarily associated with a total hit rate (%) of the emotion recognition task. Hierarchical regression analysis through a three-step mediation model showed that ToM may partially mediate the relationship between SPM and performance on facial emotion recognition. These findings imply that higher neurocognitive functioning, inclusive of reasoning, may not only directly contribute towards facial emotion recognition but also influence ToM, which in turn, influences facial emotion recognition. These findings are particularly true for healthy young people.
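
    For readers unfamiliar with the three-step mediation logic mentioned above, here is a minimal Baron-and-Kenny-style sketch with simulated stand-ins for the SPM, ToM, and emotion recognition scores (all variable names and numbers are illustrative, not the study's data):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated stand-ins: spm = reasoning, tom = theory of mind,
    # fer = facial emotion recognition hit rate (%).
    rng = np.random.default_rng(1)
    n = 200
    spm = rng.normal(50, 10, n)
    tom = 0.4 * spm + rng.normal(0, 8, n)
    fer = 0.3 * spm + 0.3 * tom + rng.normal(0, 10, n)
    df = pd.DataFrame({"spm": spm, "tom": tom, "fer": fer})

    # Step 1: X -> Y; Step 2: X -> M; Step 3: X + M -> Y.
    step1 = smf.ols("fer ~ spm", df).fit()
    step2 = smf.ols("tom ~ spm", df).fit()
    step3 = smf.ols("fer ~ spm + tom", df).fit()

    # Partial mediation: spm's coefficient shrinks, but remains significant,
    # once tom enters the model.
    print(step1.params["spm"], step3.params["spm"], step3.params["tom"])
    ```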

  11. Considering sex differences clarifies the effects of depression on facial emotion processing during fMRI.

    Science.gov (United States)

    Jenkins, L M; Kendall, A D; Kassel, M T; Patrón, V G; Gowins, J R; Dion, C; Shankman, S A; Weisenbach, S L; Maki, P; Langenecker, S A

    2018-01-01

    Sex differences in emotion processing may play a role in women's increased risk for Major Depressive Disorder (MDD). However, studies of sex differences in the brain mechanisms involved in emotion processing in MDD (or of interactions between sex and diagnosis) are sparse. We conducted an event-related fMRI study examining the interactive and distinct effects of sex and MDD on neural activity during a facial emotion perception task. To minimize effects of current affective state and cumulative disease burden, we studied participants with remitted MDD (rMDD) who were early in the course of the illness. In total, 88 individuals aged 18-23 participated, including 48 with rMDD (32 female) and 40 healthy controls (HC; 25 female). fMRI revealed an interaction between sex and diagnosis for sad and neutral facial expressions in the superior frontal gyrus and left middle temporal gyrus. Results also revealed an interaction of sex with diagnosis in the amygdala. Data were collected at two sites, which might increase variability but also increases the power to examine sex-by-diagnosis interactions. This study demonstrates the importance of taking sex differences into account when examining potential trait (or scar) mechanisms that could be useful in identifying individuals at risk for MDD, as well as for evaluating potential therapeutic innovations. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Visual Working Memory Capacity for Emotional Facial Expressions

    Directory of Open Access Journals (Sweden)

    Domagoj Švegar

    2011-12-01

    The capacity of visual working memory is limited to no more than four items. At the same time, it is limited not only by the number of objects but also by the total amount of information that must be memorized, and the relation between the information load per object and the number of objects that can be stored in visual working memory is inverse. The objective of the present experiment was to estimate visual working memory capacity for emotional facial expressions; to do so, change detection tasks were applied. Pictures of human emotional facial expressions were presented to 24 participants in 1008 experimental trials, each of which began with the presentation of a fixation mark, followed by a short simultaneous presentation of six emotional facial expressions. After that, a blank screen was presented, and after this inter-stimulus interval, one facial expression was presented at one of the previously occupied locations. Participants had to answer whether the facial expression presented at test was different from or identical to the expression presented at that same location before the retention interval. Memory capacity was estimated from response accuracy, using the formula constructed by Pashler (1988) and adapted from signal detection theory. Visual working memory capacity for emotional facial expressions was found to equal 3.07, which is high compared with the capacity for facial identities and other visual stimuli. The obtained results were explained within the framework of evolutionary psychology.
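
    The abstract cites Pashler's (1988) correction for guessing without reproducing it; for reference, the formula is commonly written as

    $$ K = S \cdot \frac{H - F}{1 - F} $$

    where $S$ is the set size (here 6), $H$ the hit rate, and $F$ the false-alarm rate. The numerator corrects hits for guessing, and the denominator rescales by the opportunity to guess correctly.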

  13. Neural circuitry of emotional and cognitive conflict revealed through facial expressions.

    Science.gov (United States)

    Chiew, Kimberly S; Braver, Todd S

    2011-03-09

    Neural systems underlying conflict processing have been well studied in the cognitive realm, but the extent to which these overlap with those underlying emotional conflict processing remains unclear. A novel adaptation of the AX Continuous Performance Task (AX-CPT), a stimulus-response incompatibility paradigm, was examined that permits close comparison of emotional and cognitive conflict conditions, through the use of affectively-valenced facial expressions as the response modality. Brain activity was monitored with functional magnetic resonance imaging (fMRI) during performance of the emotional AX-CPT. Emotional conflict was manipulated on a trial-by-trial basis, by requiring contextually pre-cued facial expressions to emotional probe stimuli (IAPS images) that were either affectively compatible (low-conflict) or incompatible (high-conflict). The emotion condition was contrasted against a matched cognitive condition that was identical in all respects, except that probe stimuli were emotionally neutral. Components of the brain cognitive control network, including dorsal anterior cingulate cortex (ACC) and lateral prefrontal cortex (PFC), showed conflict-related activation increases in both conditions, but with higher activity during emotion conditions. In contrast, emotion conflict effects were not found in regions associated with affective processing, such as rostral ACC. These activation patterns provide evidence for a domain-general neural system that is active for both emotional and cognitive conflict processing. In line with previous behavioural evidence, greatest activity in these brain regions occurred when both emotional and cognitive influences additively combined to produce increased interference.

  14. Women's greater ability to perceive happy facial emotion automatically: gender differences in affective priming.

    Directory of Open Access Journals (Sweden)

    Uta-Susan Donges

    There is evidence that women are better at recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It has been observed that masked emotional facial expressions have an affect-congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expressions. In our priming experiment, a sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by a neutral face which had to be evaluated. 81 young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of the emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expressions elicited valence-congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability than men to perceive and respond to positive facial emotion at an automatic processing level. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.

  15. Women's greater ability to perceive happy facial emotion automatically: gender differences in affective priming.

    Science.gov (United States)

    Donges, Uta-Susan; Kersting, Anette; Suslow, Thomas

    2012-01-01

    There is evidence that women are better at recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It has been observed that masked emotional facial expressions have an affect-congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expressions. In our priming experiment, a sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by a neutral face which had to be evaluated. 81 young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of the emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expressions elicited valence-congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability than men to perceive and respond to positive facial emotion at an automatic processing level. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.

  16. Influences on Facial Emotion Recognition in Deaf Children

    Science.gov (United States)

    Sidera, Francesc; Amadó, Anna; Martínez, Laura

    2017-01-01

    This exploratory research is aimed at studying facial emotion recognition abilities in deaf children and how they relate to linguistic skills and the characteristics of deafness. A total of 166 participants (75 deaf) aged 3-8 years were administered the following tasks: facial emotion recognition, naming vocabulary and cognitive ability. The…

  17. Four not six: Revealing culturally common facial expressions of emotion.

    Science.gov (United States)

    Jack, Rachael E; Sun, Wei; Delis, Ioannis; Garrod, Oliver G B; Schyns, Philippe G

    2016-06-01

    As a highly social species, humans generate complex facial expressions to communicate a diverse range of emotions. Since Darwin's work, identifying among these complex patterns which are common across cultures and which are culture-specific has remained a central question in psychology, anthropology, philosophy, and more recently machine vision and social robotics. Classic approaches to addressing this question typically tested the cross-cultural recognition of theoretically motivated facial expressions representing 6 emotions, and reported universality. Yet, variable recognition accuracy across cultures suggests a narrower cross-cultural communication supported by sets of simpler expressive patterns embedded in more complex facial expressions. We explore this hypothesis by modeling the facial expressions of over 60 emotions across 2 cultures, and segregating out the latent expressive patterns. Using a multidisciplinary approach, we first map the conceptual organization of a broad spectrum of emotion words by building semantic networks in 2 cultures. For each emotion word in each culture, we then model and validate its corresponding dynamic facial expression, producing over 60 culturally valid facial expression models. We then apply to the pooled models a multivariate data reduction technique, revealing 4 latent and culturally common facial expression patterns that each communicates specific combinations of valence, arousal, and dominance. We then reveal the face movements that accentuate each latent expressive pattern to create complex facial expressions. Our data questions the widely held view that 6 facial expression patterns are universal, instead suggesting 4 latent expressive patterns with direct implications for emotion communication, social psychology, cognitive neuroscience, and social robotics. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
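
    The abstract does not name the data reduction technique, so as one plausible instantiation the sketch below applies non-negative matrix factorization to a pooled model-by-feature matrix to recover four latent patterns; matrix sizes and data are placeholders.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    # Rows: one dynamic expression model per emotion word and culture;
    # columns: time-binned Action Unit activations (all values >= 0).
    rng = np.random.default_rng(2)
    models = rng.random((120, 42 * 10))  # e.g., 60 emotions x 2 cultures; 42 AUs x 10 bins

    nmf = NMF(n_components=4, init="nndsvda", random_state=0, max_iter=500)
    weights = nmf.fit_transform(models)  # loading of each model on the 4 latent patterns
    patterns = nmf.components_           # the 4 latent expressive patterns
    print(weights.shape, patterns.shape)
    ```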

  18. Elevated responses to constant facial emotions in different faces in the human amygdala: an fMRI study of facial identity and expression

    Directory of Open Access Journals (Sweden)

    Weiller Cornelius

    2004-11-01

    Background: Human faces provide important signals in social interactions by conveying two main types of information, individual identity and emotional expression. The ability to readily assess both the variability and the consistency of emotional expressions in different individuals is central to one's interpretation of the imminent environment. A factorial design was used to systematically test the interaction of either constant or variable emotional expressions with constant or variable facial identities in areas involved in face processing, using functional magnetic resonance imaging. Results: Previous studies suggest a predominant role of the amygdala in the assessment of emotional variability. Here we extend this view by showing that this structure activated to faces with changing identities that display constant emotional expressions. Within this condition, amygdala activation was dependent on the type and intensity of the displayed emotion, with significant responses to fearful expressions and, to a lesser extent, to neutral and happy expressions. In contrast, the lateral fusiform gyrus showed a binary pattern of increased activation to changing stimulus features, while also being differentially responsive to the intensity of the displayed emotion when processing different facial identities. Conclusions: These results suggest that the amygdala might serve to detect constant facial emotions in different individuals, complementing its established role in detecting emotional variability.

  19. Facial Expression Generation from Speaker's Emotional States in Daily Conversation

    Science.gov (United States)

    Mori, Hiroki; Ohshima, Koh

    A framework for generating facial expressions from emotional states in daily conversation is described. It provides a mapping between emotional states and facial expressions, where the former are represented by vectors with psychologically defined abstract dimensions and the latter are coded with the Facial Action Coding System. To obtain the mapping, parallel data with rated emotional states and facial expressions were collected for utterances of a female speaker, and a neural network was trained on the data. The effectiveness of the proposed method was verified by a subjective evaluation test: the Mean Opinion Score for the suitability of the generated facial expressions was 3.86 for the speaker, close to that of hand-made facial expressions.
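
    A minimal sketch of such a mapping uses a small feed-forward network to regress FACS Action Unit intensities from abstract emotion dimensions; the network shape, dimension count, and data below are assumptions for illustration, not the authors' architecture.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Inputs: emotional states on two abstract dimensions (e.g., pleasantness,
    # arousal); outputs: intensities of 17 Action Units. All sizes illustrative.
    rng = np.random.default_rng(3)
    states = rng.uniform(-1, 1, (300, 2))
    aus = np.clip(states @ rng.random((2, 17)) + rng.normal(0, 0.1, (300, 17)), 0, 1)

    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    net.fit(states, aus)
    print(net.predict([[0.8, 0.2]]).round(2))  # AU intensities for a pleasant, calm state
    ```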

  20. [Measuring impairment of facial affects recognition in schizophrenia. Preliminary study of the facial emotions recognition task (TREF)].

    Science.gov (United States)

    Gaudelus, B; Virgile, J; Peyroux, E; Leleu, A; Baudouin, J-Y; Franck, N

    2015-06-01

    The impairment of social cognition, including facial affect recognition, is a well-established trait in schizophrenia, and specific cognitive remediation programs focusing on facial affect recognition have been developed by different teams worldwide. However, even though social cognitive impairments have been confirmed, previous studies have also shown heterogeneity of results between subjects. Therefore, individual abilities should be assessed before proposing such programs. Most research teams apply tasks based on the facial affect sets of Ekman et al. or Gur et al.; however, these tasks are not easily applicable in clinical practice. Here, we present the Facial Emotions Recognition Test (TREF), which is designed to identify facial affect recognition impairments in clinical practice. The test is composed of 54 photos and evaluates recognition of six universal emotions (joy, anger, sadness, fear, disgust, and contempt). Each emotion is represented by colored photos of 4 different models (two men and two women) at nine intensity levels from 20 to 100%. Each photo is presented for 10 seconds, with no time limit for responding. The present study compared TREF scores in a sample of healthy controls (64 subjects) and people with stabilized schizophrenia according to DSM IV-TR criteria (45 subjects). We analyzed global scores for all emotions, as well as subscores for each emotion, between these two groups, taking gender differences into account. Our results were consistent with previous findings. Applying the TREF, we confirmed an impairment in facial affect recognition in schizophrenia by showing significant differences between the two groups in global results (76.45% for healthy controls versus 61.28% for people with schizophrenia), as well as in subscores for each emotion except joy. Scores for women were significantly higher than for men in the population

  1. Neurophysiological evidence (ERPs) for hemispheric processing of facial expressions of emotions: Evidence from whole face and chimeric face stimuli.

    Science.gov (United States)

    Damaskinou, Nikoleta; Watling, Dawn

    2018-05-01

    This study was designed to investigate the patterns of electrophysiological responses of early emotional processing at frontocentral sites in adults and to explore whether adults' activation patterns show hemispheric lateralization for facial emotion processing. Thirty-five adults viewed full face and chimeric face stimuli. After viewing two faces, sequentially, participants were asked to decide which of the two faces was more emotive. The findings from the standard faces and the chimeric faces suggest that emotion processing is present during the early phases of face processing in the frontocentral sites. In particular, sad emotional faces are processed differently than neutral and happy (including happy chimeras) faces in these early phases of processing. Further, there were differences in the electrode amplitudes over the left and right hemisphere, particularly in the early temporal window. This research provides supporting evidence that the chimeric face test is a test of emotion processing that elicits right hemispheric processing.

  2. Dimensional Information-Theoretic Measurement of Facial Emotion Expressions in Schizophrenia

    Directory of Open Access Journals (Sweden)

    Jihun Hamm

    2014-01-01

    Altered facial expression of emotions is a characteristic impairment in schizophrenia. Ratings of affect have traditionally been limited to clinical rating scales and facial muscle movement analysis, which require extensive training and have limitations of methodology and ecological validity. To improve reliable assessment of dynamic facial expression changes, we have developed automated measurements of facial emotion expressions based on two information-theoretic measures of expressivity: the ambiguity and the distinctiveness of facial expressions. These measures were examined in matched groups of persons with schizophrenia (n=28) and healthy controls (n=26) who underwent video acquisition to assess the expressivity of basic emotions (happiness, sadness, anger, fear, and disgust) in evoked conditions. Persons with schizophrenia scored higher on ambiguity, a measure of the conditional entropy within the expression of a single emotion, and lower on distinctiveness, a measure of the mutual information across expressions of different emotions. The automated measures compared favorably with observer-based ratings. This method can be applied for delineating dynamic emotional expressivity in healthy and clinical populations.
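
    A minimal sketch of the two measures as defined in the abstract, computed from a hypothetical joint distribution of intended emotions and automatically coded expression categories (the numbers are illustrative):

    ```python
    import numpy as np

    def entropy(p):
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    # joint[i, j]: P(intended emotion i, observed expression category j),
    # e.g., estimated from automatically coded video frames.
    joint = np.array([[0.18, 0.02],
                      [0.03, 0.27],
                      [0.10, 0.40]])
    joint = joint / joint.sum()
    p_emotion = joint.sum(axis=1)
    p_expr = joint.sum(axis=0)

    # Ambiguity: conditional entropy H(expression | emotion).
    ambiguity = sum(p_emotion[i] * entropy(joint[i] / p_emotion[i])
                    for i in range(len(p_emotion)))
    # Distinctiveness: mutual information I(emotion; expression).
    distinctiveness = entropy(p_emotion) + entropy(p_expr) - entropy(joint.ravel())
    print(round(ambiguity, 3), round(distinctiveness, 3))
    ```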

  3. The accuracy of intensity ratings of emotions from facial expressions

    Directory of Open Access Journals (Sweden)

    Kostić Aleksandra P.

    2003-01-01

    The results of a study on the accuracy of intensity ratings of emotion from facial expressions are reported. Research to date has shown that spontaneous facial expressions of basic emotions are a reliable source of information about the category of emotion. The question is raised whether this is also true for the intensity of emotion, and whether the accuracy of intensity ratings depends on the observer's sex and vocational orientation. A total of 228 observers of both sexes and of various vocational orientations rated the emotional intensity of presented facial expressions on a scale ranging from 0 to 8. The results supported the hypothesis that spontaneous facial expressions of basic emotions provide sufficient information about emotional intensity. The hypothesis of an interdependence between the accuracy of intensity ratings and the observer's sex and vocational orientation was not confirmed. However, the accuracy of intensity ratings was found to vary with the category of the emotion presented.

  4. Communal and agentic behaviour in response to facial emotion expressions

    NARCIS (Netherlands)

    aan het Rot, Marije; Hogenelst, Koen; Gesing, Christina M

    Facial emotions are important for human communication. Unfortunately, traditional facial emotion recognition tasks do not inform about how respondents might behave towards others expressing certain emotions. Approach-avoidance tasks do measure behaviour, but only on one dimension. In this study 81

  5. Capacity limitations to extract the mean emotion from multiple facial expressions depend on emotion variance.

    Science.gov (United States)

    Ji, Luyan; Pourtois, Gilles

    2018-04-20

    We examined the processing capacity and the role of emotion variance in ensemble representation for multiple facial expressions shown concurrently. A standard set size manipulation was used, whereby the sets consisted of 4, 8, or 16 morphed faces each uniquely varying along a happy-angry continuum (Experiment 1) or a neutral-happy/angry continuum (Experiments 2 & 3). Across the three experiments, we reduced the amount of emotion variance in the sets to explore the boundaries of this process. Participants judged the perceived average emotion from each set on a continuous scale. We computed and compared objective and subjective difference scores, using the morph units and post-experiment ratings, respectively. Results of the subjective scores were more consistent than the objective ones across the first two experiments where the variance was relatively large, and revealed each time that increasing set size led to a poorer averaging ability, suggesting capacity limitations in establishing ensemble representations for multiple facial expressions. However, when the emotion variance in the sets was reduced in Experiment 3, both subjective and objective scores remained unaffected by set size, suggesting that the emotion averaging process was unlimited in these conditions. Collectively, these results suggest that extracting mean emotion from a set composed of multiple faces depends on both structural (attentional) and stimulus-related effects. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Facial expression primes and implicit regulation of negative emotion.

    Science.gov (United States)

    Yoon, HeungSik; Kim, Shin Ah; Kim, Sang Hee

    2015-06-17

    An individual's responses to emotional information are influenced not only by the emotional quality of the information, but also by the context in which the information is presented. We hypothesized that facial expressions of happiness and anger would serve as primes to modulate subjective and neural responses to subsequently presented negative information. To test this hypothesis, we conducted a functional MRI study in which the brains of healthy adults were scanned while they performed an emotion-rating task. During the task, participants viewed a series of negative and neutral photos, one at a time; each photo was presented after a picture showing a face expressing a happy, angry, or neutral emotion. Brain imaging results showed that compared with neutral primes, happy facial primes increased activation during negative emotion in the dorsal anterior cingulate cortex and the right ventrolateral prefrontal cortex, which are typically implicated in conflict detection and implicit emotion control, respectively. Conversely, relative to neutral primes, angry primes activated the right middle temporal gyrus and the left supramarginal gyrus during the experience of negative emotion. Activity in the amygdala in response to negative emotion was marginally reduced after exposure to happy primes compared with angry primes. Relative to neutral primes, angry facial primes increased the subjectively experienced intensity of negative emotion. The current study results suggest that prior exposure to facial expressions of emotions modulates the subsequent experience of negative emotion by implicitly activating the emotion-regulation system.

  7. Theory of Mind as a Mediator of Reasoning and Facial Emotion Recognition: Findings from 200 Healthy People

    Science.gov (United States)

    Lee, Seul Bee; Koo, Se Jun; Song, Yun Young; Lee, Mi Kyung; Jeong, Yu-Jin; Kwon, Catherine; Park, Kyoung Ri; Kang, Jee In; Lee, Eun

    2014-01-01

    Objective It was proposed that the ability to recognize facial emotions is closely related to complex neurocognitive processes and/or skills related to theory of mind (ToM). This study examines whether ToM skills mediate the relationship between higher neurocognitive functions, such as reasoning ability, and facial emotion recognition. Methods A total of 200 healthy subjects (101 males, 99 females) were recruited. Facial emotion recognition was measured through the use of 64 facial emotional stimuli that were selected from photographs from the Korean Facial Expressions of Emotion (KOFEE). Participants were requested to complete the Theory of Mind Picture Stories task and Standard Progressive Matrices (SPM). Results Multiple regression analysis showed that the SPM score (t=3.19, p=0.002, β=0.22) and the overall ToM score (t=2.56, p=0.011, β=0.18) were primarily associated with a total hit rate (%) of the emotion recognition task. Hierarchical regression analysis through a three-step mediation model showed that ToM may partially mediate the relationship between SPM and performance on facial emotion recognition. Conclusion These findings imply that higher neurocognitive functioning, inclusive of reasoning, may not only directly contribute towards facial emotion recognition but also influence ToM, which in turn, influences facial emotion recognition. These findings are particularly true for healthy young people. PMID:24843363

  8. Recognition of Facial Expressions of Different Emotional Intensities in Patients with Frontotemporal Lobar Degeneration

    Directory of Open Access Journals (Sweden)

    Roy P. C. Kessels

    2007-01-01

    Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD). FTLD patients also show impairments in emotion processing; specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more difficult to recognize than positive ones, which may thus have been a confounding factor in previous studies. Also, ceiling effects are often present in emotion recognition tasks using full-blown emotional facial expressions. In the present study with FTLD patients, we examined the perception of sadness, anger, fear, happiness, surprise, and disgust at different emotional intensities on morphed facial expressions, to take task difficulty into account. Results showed that our FTLD patients were specifically impaired in the recognition of anger. The patients also performed worse than controls on recognition of surprise, but performed at control levels on disgust, happiness, sadness, and fear. These findings corroborate and extend previous results showing deficits in emotion perception in FTLD.
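
    Intensity-graded morphs of this kind are, at their simplest, weighted blends between a neutral face and a full-blown expression; real morphing software additionally warps facial geometry, so the sketch below shows only the pixelwise blending idea.

    ```python
    import numpy as np

    def morph(neutral, expressive, intensity):
        """Pixelwise blend: intensity 0.0 = neutral face, 1.0 = full expression."""
        return (1.0 - intensity) * neutral + intensity * expressive

    # Stand-ins for two aligned grayscale face images.
    neutral = np.zeros((64, 64))
    angry = np.ones((64, 64))
    stimuli = [morph(neutral, angry, a) for a in (0.25, 0.5, 0.75, 1.0)]
    ```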

  9. Facial emotion identification in early-onset psychosis.

    Science.gov (United States)

    Barkl, Sophie J; Lah, Suncica; Starling, Jean; Hainsworth, Cassandra; Harris, Anthony W F; Williams, Leanne M

    2014-12-01

    Facial emotion identification (FEI) deficits are common in patients with chronic schizophrenia and are strongly related to impaired functioning. The objectives of this study were to determine whether FEI deficits are present and emotion specific in people experiencing early-onset psychosis (EOP), and related to current clinical symptoms and functioning. Patients with EOP (n=34, mean age=14.11, 53% female) and healthy controls (HC, n=42, mean age 13.80, 51% female) completed a task of FEI that measured accuracy, error pattern and response time. Relative to HC, patients with EOP (i) had lower accuracy for identifying facial expressions of emotions, especially fear, anger and disgust, (ii) were more likely to misattribute other emotional expressions as fear or disgust, and (iii) were slower at accurately identifying all facial expressions. FEI accuracy was not related to clinical symptoms or current functioning. Deficits in FEI (especially for fear, anger and disgust) are evident in EOP. Our findings suggest that while emotion identification deficits may reflect a trait susceptibility marker, functional deficits may represent a sequelae of illness. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Culture shapes 7-month-olds' perceptual strategies in discriminating facial expressions of emotion.

    Science.gov (United States)

    Geangu, Elena; Ichikawa, Hiroko; Lao, Junpeng; Kanazawa, So; Yamaguchi, Masami K; Caldara, Roberto; Turati, Chiara

    2016-07-25

    Emotional facial expressions are thought to have evolved because they play a crucial role in species' survival. From infancy, humans develop dedicated neural circuits [1] to exhibit and recognize a variety of facial expressions [2]. But there is increasing evidence that culture specifies when and how certain emotions can be expressed - social norms - and that the mature perceptual mechanisms used to transmit and decode the visual information from emotional signals differ between Western and Eastern adults [3-5]. Specifically, the mouth is more informative for transmitting emotional signals in Westerners and the eye region for Easterners [4], generating culture-specific fixation biases towards these features [5]. During development, it is recognized that cultural differences can be observed at the level of emotional reactivity and regulation [6], and to the culturally dominant modes of attention [7]. Nonetheless, to our knowledge no study has explored whether culture shapes the processing of facial emotional signals early in development. The data we report here show that, by 7 months, infants from both cultures visually discriminate facial expressions of emotion by relying on culturally distinct fixation strategies, resembling those used by the adults from the environment in which they develop [5]. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Sex differences in event-related potentials and attentional biases to emotional facial stimuli.

    Science.gov (United States)

    Pfabigan, Daniela M; Lamplmayr-Kragl, Elisabeth; Pintzinger, Nina M; Sailer, Uta; Tran, Ulrich S

    2014-01-01

    Attentional processes play an important role in the processing of emotional information. Previous research reported attentional biases during stimulus processing in anxiety and depression. However, sex differences in the processing of emotional stimuli, and the higher prevalence rates of anxiety disorders among women compared to men, suggest that attentional biases may also differ between the two sexes. The present study used a modified version of the dot probe task with happy, angry, and neutral facial stimuli to investigate the time course of attentional biases in healthy volunteers. Moreover, associations of attentional biases with alexithymia were examined at the behavioral and physiological levels. Event-related potentials were measured while 21 participants (11 women) performed the task, utilizing for the first time a difference wave approach in the analysis to highlight emotion-specific aspects. Women showed overall enhanced probe P1 amplitudes compared to men, in particular after rewarding facial stimuli. Using the difference wave approach, probe P1 amplitudes appeared specifically enhanced for congruently presented happy facial stimuli among women compared to men. Both methods yielded enhanced probe P1 amplitudes after presentation of the emotional stimulus in the left compared to the right visual hemifield. Probe P1 amplitudes correlated negatively with self-reported alexithymia; most of these correlations were observable only in women. Our results suggest that women orient their attention to facial stimuli to a greater extent than men, and corroborate that alexithymia is a correlate of reduced emotional reactivity at the neuronal level. We recommend using a difference wave approach when addressing attentional processes of orientation and disengagement in future studies.
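
    A minimal sketch of the difference wave logic: average each condition's trials into an ERP, then subtract the neutral condition so that stimulus-general components cancel and emotion-specific activity remains (data, conditions, and latency window are illustrative stand-ins):

    ```python
    import numpy as np

    # epochs[condition]: trials x time points of baseline-corrected EEG at one
    # probe electrode (simulated stand-ins).
    rng = np.random.default_rng(4)
    epochs = {c: rng.normal(0, 1, (80, 300)) for c in ("happy_congruent", "neutral")}

    erp = {c: e.mean(axis=0) for c, e in epochs.items()}  # average over trials
    # Emotional minus neutral: overlapping stimulus-general activity cancels,
    # leaving the emotion-specific difference wave.
    diff_wave = erp["happy_congruent"] - erp["neutral"]
    p1_window = diff_wave[80:120]  # illustrative P1 latency range (samples)
    print(p1_window.mean())
    ```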

  12. Sex differences in event-related potentials and attentional biases to emotional facial stimuli

    Directory of Open Access Journals (Sweden)

    Daniela M. Pfabigan

    2014-12-01

    Attentional processes play an important role in the processing of emotional information. Previous research reported attentional biases during stimulus processing in anxiety and depression. However, sex differences in the processing of emotional stimuli, and the higher prevalence rates of anxiety disorders among women compared to men, suggest that attentional biases may also differ between the two sexes. The present study used a modified version of the dot probe task with happy, angry, and neutral facial stimuli to investigate the time course of attentional biases in healthy volunteers. Moreover, associations of attentional biases with alexithymia were examined at the behavioral and physiological levels. Event-related potentials were measured while 21 participants (11 women) performed the task, utilizing for the first time a difference wave approach in the analysis to highlight emotion-specific aspects. Women showed overall enhanced probe P1 amplitudes compared to men, in particular after rewarding facial stimuli. Using the difference wave approach, probe P1 amplitudes appeared specifically enhanced for congruently presented happy facial stimuli among women compared to men. Both methods yielded enhanced probe P1 amplitudes after presentation of the emotional stimulus in the left compared to the right visual hemifield. Probe P1 amplitudes correlated negatively with self-reported alexithymia; most of these correlations were observable only in women. Our results suggest that women orient their attention to facial stimuli to a greater extent than men, and corroborate that alexithymia is a correlate of reduced emotional reactivity at the neuronal level. We recommend using a difference wave approach when addressing attentional processes of orientation and disengagement in future studies.

  13. Multimedia Content Development as a Facial Expression Datasets for Recognition of Human Emotions

    Science.gov (United States)

    Mamonto, N. E.; Maulana, H.; Liliana, D. Y.; Basaruddin, T.

    2018-02-01

    Previously developed datasets contain facial expressions from foreign subjects. The development of this multimedia content aims to address that problem, experienced both by our research team and by other researchers conducting similar work. The method used to develop the multimedia content as a facial expression dataset for human emotion recognition is the Villamil-Molina version of the multimedia development method. The multimedia content was developed with 10 subjects (talents), each performing 3 takes, and in each take the talent had to demonstrate 19 facial expressions. After editing and rendering, tests were carried out, with the conclusion that the multimedia content can be used as a facial expression dataset for recognition of human emotions.

  14. Facial expressions of emotion are not culturally universal.

    Science.gov (United States)

    Jack, Rachael E; Garrod, Oliver G B; Yu, Hui; Caldara, Roberto; Schyns, Philippe G

    2012-05-08

    Since Darwin's seminal works, the universality of facial expressions of emotion has remained one of the longest standing debates in the biological and social sciences. Briefly stated, the universality hypothesis claims that all humans communicate six basic internal emotional states (happy, surprise, fear, disgust, anger, and sad) using the same facial movements by virtue of their biological and evolutionary origins [Susskind JM, et al. (2008) Nat Neurosci 11:843-850]. Here, we refute this assumed universality. Using a unique computer graphics platform that combines generative grammars [Chomsky N (1965) MIT Press, Cambridge, MA] with visual perception, we accessed the mind's eye of 30 Western and Eastern culture individuals and reconstructed their mental representations of the six basic facial expressions of emotion. Cross-cultural comparisons of the mental representations challenge universality on two separate counts. First, whereas Westerners represent each of the six basic emotions with a distinct set of facial movements common to the group, Easterners do not. Second, Easterners represent emotional intensity with distinctive dynamic eye activity. By refuting the long-standing universality hypothesis, our data highlight the powerful influence of culture on shaping basic behaviors once considered biologically hardwired. Consequently, our data open a unique nature-nurture debate across broad fields from evolutionary psychology and social neuroscience to social networking via digital avatars.

  15. Affective theory of mind inferences contextually influence the recognition of emotional facial expressions.

    Science.gov (United States)

    Stewart, Suzanne L K; Schepman, Astrid; Haigh, Matthew; McHugh, Rhian; Stewart, Andrew J

    2018-03-14

    The recognition of emotional facial expressions is often subject to contextual influence, particularly when the face and the context convey similar emotions. We investigated whether spontaneous, incidental affective theory of mind inferences made while reading vignettes describing social situations would produce context effects on the identification of same-valenced emotions (Experiment 1) as well as differently-valenced emotions (Experiment 2) conveyed by subsequently presented faces. Crucially, we found an effect of context on reaction times in both experiments while, in line with previous work, we found evidence for a context effect on accuracy only in Experiment 1. This demonstrates that affective theory of mind inferences made at the pragmatic level of a text can automatically, contextually influence the perceptual processing of emotional facial expressions in a separate task even when those emotions are of a distinctive valence. Thus, our novel findings suggest that language acts as a contextual influence to the recognition of emotional facial expressions for both same and different valences.

  16. Developmental differences in the neural mechanisms of facial emotion labeling

    Science.gov (United States)

    Adleman, Nancy E.; Kim, Pilyoung; Oakes, Allison H.; Hsu, Derek; Reynolds, Richard C.; Chen, Gang; Pine, Daniel S.; Brotman, Melissa A.; Leibenluft, Ellen

    2016-01-01

    Adolescence is a time of increased risk for the onset of psychological disorders associated with deficits in face emotion labeling. We used functional magnetic resonance imaging (fMRI) to examine age-related differences in brain activation while adolescents and adults labeled the emotion on fearful, happy and angry faces of varying intensities [0% (i.e. neutral), 50%, 75%, 100%]. Adolescents and adults did not differ on accuracy to label emotions. In the superior temporal sulcus, ventrolateral prefrontal cortex and middle temporal gyrus, adults show an inverted-U-shaped response to increasing intensities of fearful faces and a U-shaped response to increasing intensities of happy faces, whereas adolescents show the opposite patterns. In addition, adults, but not adolescents, show greater inferior occipital gyrus activation to negative (angry, fearful) vs positive (happy) emotions. In sum, when subjects classify subtly varying facial emotions, developmental differences manifest in several ‘ventral stream’ brain regions. Charting the typical developmental course of the brain mechanisms of socioemotional processes, such as facial emotion labeling, is an important focus for developmental psychopathology research. PMID:26245836

  17. Compound facial expressions of emotion: from basic research to clinical applications

    Science.gov (United States)

    Du, Shichuan; Martinez, Aleix M.

    2015-01-01

    Emotions are sometimes revealed through facial expressions. When these natural facial articulations involve the contraction of the same muscle groups in people of distinct cultural upbringings, this is taken as evidence of a biological origin of these emotions. While past research had identified facial expressions associated with a single internally felt category (eg, the facial expression of happiness when we feel joyful), we have recently studied facial expressions observed when people experience compound emotions (eg, the facial expression of happy surprise when we feel joyful in a surprised way, as, for example, at a surprise birthday party). Our research has identified 17 compound expressions consistently produced across cultures, suggesting that the number of facial expressions of emotion of biological origin is much larger than previously believed. The present paper provides an overview of these findings and shows evidence supporting the view that spontaneous expressions are produced using the same facial articulations previously identified in laboratory experiments. We also discuss the implications of our results in the study of psychopathologies, and consider several open research questions. PMID:26869845

  18. Compound facial expressions of emotion: from basic research to clinical applications.

    Science.gov (United States)

    Du, Shichuan; Martinez, Aleix M

    2015-12-01

    Emotions are sometimes revealed through facial expressions. When these natural facial articulations involve the contraction of the same muscle groups in people of distinct cultural upbringings, this is taken as evidence of a biological origin of these emotions. While past research had identified facial expressions associated with a single internally felt category (eg, the facial expression of happiness when we feel joyful), we have recently studied facial expressions observed when people experience compound emotions (eg, the facial expression of happy surprise when we feel joyful in a surprised way, as, for example, at a surprise birthday party). Our research has identified 17 compound expressions consistently produced across cultures, suggesting that the number of facial expressions of emotion of biological origin is much larger than previously believed. The present paper provides an overview of these findings and shows evidence supporting the view that spontaneous expressions are produced using the same facial articulations previously identified in laboratory experiments. We also discuss the implications of our results in the study of psychopathologies, and consider several open research questions.

  19. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion.

    Science.gov (United States)

    Guo, Kun; Soornack, Yoshi; Settle, Rebecca

    2018-03-05

    Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution up to 48 × 64 pixels or increasing image blur up to 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity rating, increased reaction time and fixation duration, and stronger central fixation bias which was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent with less deterioration impact on happy and surprise expressions, suggesting this distortion-invariant facial expression perception might be achieved through the categorical model involving a non-linear configural combination of local facial features. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Emotion processing deficits in alexithymia and response to a depth of processing intervention.

    Science.gov (United States)

    Constantinou, Elena; Panayiotou, Georgia; Theodorou, Marios

    2014-12-01

    Findings on alexithymic emotion difficulties have been inconsistent. We examined potential differences between alexithymic and control participants in general arousal, reactivity, facial and subjective expression, emotion labeling, and covariation between emotion response systems. A depth of processing intervention was introduced. Fifty-four participants (27 alexithymic), selected using the Toronto Alexithymia Scale-20, completed an imagery experiment (imagining joy, fear and neutral scripts), under instructions for shallow or deep emotion processing. Heart rate, skin conductance, facial electromyography and startle reflex were recorded along with subjective ratings. Results indicated hypo-reactivity to emotion among high alexithymic individuals, smaller and slower startle responses, and low covariation between physiology and self-report. No deficits in facial expression, labeling and emotion ratings were identified. Deep processing was associated with increased physiological reactivity and lower perceived dominance and arousal in high alexithymia. Findings suggest a tendency for avoidance of intense, unpleasant emotions and less defensive action preparation in alexithymia. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Processing emotions in children with Moebius syndrome. A behavioral and thermal imaging study

    OpenAIRE

    Nicolini, Ylenia

    2018-01-01

    According to the “facial feedback hypothesis”, the proprioceptive feedback from facial muscles while mimicking face movements is crucial in regulating emotional experience. Facial mimicry, intended as a spontaneous reaction of individuals when observing emotional faces, plays a key role in understanding others’ facial expression of emotions. Studies on facial expression processing and emotion understanding have revealed that the neuronal bases of facial mimicry are underpinned by a mirror mec...

  2. Appraisals Generate Specific Configurations of Facial Muscle Movements in a Gambling Task: Evidence for the Component Process Model of Emotion.

    Science.gov (United States)

    Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R

    2015-01-01

    Scherer's Component Process Model provides a theoretical framework for research on the production mechanism of emotion and facial emotional expression. The model predicts that appraisal results drive facial expressions, which unfold sequentially and cumulatively over time. In two experiments, we examined facial muscle activity changes (via facial electromyography recordings over the corrugator, cheek, and frontalis regions) in response to events in a gambling task. These events were experimentally manipulated feedback stimuli that presented simultaneous information directly affecting goal conduciveness (gambling outcome: win, loss, or break-even) and power appraisals (Experiments 1 and 2), as well as control appraisal (Experiment 2). We repeatedly found main effects of goal conduciveness (starting ~600 ms) and power appraisals (starting ~800 ms after feedback onset). Control appraisal main effects were inconclusive. Interaction effects of goal conduciveness and power appraisals were obtained in both experiments (Experiment 1: over the corrugator and cheek regions; Experiment 2: over the frontalis region), suggesting amplified goal conduciveness effects when power was high in contrast to invariant goal conduciveness effects when power was low. An interaction of goal conduciveness and control appraisals was also found over the cheek region, showing differential goal conduciveness effects when control was high and invariant effects when control was low. These interaction effects suggest that the appraisal of having sufficient control or power affects facial responses towards gambling outcomes. The result pattern suggests that the corrugator and frontalis regions are primarily related to cognitive operations that process motivational pertinence, whereas the cheek region is more influenced by coping implications. Our results provide first evidence demonstrating that cognitive-evaluative mechanisms related to goal conduciveness, control, and power appraisals affect

  3. Common and distinct neural correlates of facial emotion processing in social anxiety disorder and Williams syndrome: A systematic review and voxel-based meta-analysis of functional magnetic resonance imaging studies.

    Science.gov (United States)

    Binelli, C; Subirà, S; Batalla, A; Muñiz, A; Sugranyés, G; Crippa, J A; Farré, M; Pérez-Jurado, L; Martín-Santos, R

    2014-11-01

    Social Anxiety Disorder (SAD) and Williams-Beuren Syndrome (WS) are two conditions which seem to be at opposite ends of the continuum of social fear but show compromised abilities in some overlapping areas, including some social interactions, gaze contact and processing of facial emotional cues. The increase in the number of neuroimaging studies has greatly expanded our knowledge of the neural bases of facial emotion processing in both conditions. However, to date, SAD and WS have not been compared. We conducted a systematic review of functional magnetic resonance imaging (fMRI) studies comparing SAD and WS cases to healthy control participants (HC) using facial emotion processing paradigms. Two researchers conducted comprehensive PubMed/Medline searches to identify all fMRI studies of facial emotion processing in SAD and WS. The following search keywords were used: "emotion processing"; "facial emotion"; "social anxiety"; "social phobia"; "Williams syndrome"; "neuroimaging"; "functional magnetic resonance"; "fMRI" and their combinations, as well as terms specifying individual facial emotions. We extracted spatial coordinates from each study and conducted two separate voxel-wise activation likelihood estimation meta-analyses, one for SAD and one for WS. Twenty-two studies met the inclusion criteria: 17 studies of SAD and five of WS. We found evidence for both common and distinct patterns of neural activation. Limbic engagement was common to SAD and WS during facial emotion processing, although we observed opposite patterns of activation for each disorder. Compared to HC, SAD cases showed hyperactivation of the amygdala, the parahippocampal gyrus and the globus pallidus. Compared to controls, participants with WS showed hypoactivation of these regions. Differential activation in a number of regions specific to either condition was also identified: SAD cases exhibited greater activation of the insula, putamen, the superior temporal gyrus, medial frontal regions and
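    The activation likelihood estimation (ALE) step described above can be summarized as follows: each reported activation focus is modeled as a three-dimensional Gaussian probability distribution, the foci of a given study are merged into one modeled activation map, and the voxel-wise union of those maps across studies yields the ALE score. The sketch below illustrates that logic only; the grid size, smoothing width and foci are toy values, not data from the review.

```python
import numpy as np

def gaussian_map(shape, focus, sigma):
    """Modeled activation map: a 3D Gaussian centred on one focus (voxel coords)."""
    zz, yy, xx = np.indices(shape)
    d2 = (zz - focus[0])**2 + (yy - focus[1])**2 + (xx - focus[2])**2
    g = np.exp(-d2 / (2 * sigma**2))
    return g / g.max()          # scale to a per-voxel activation probability

def ale_scores(studies, shape=(20, 20, 20), sigma=2.0):
    """Voxel-wise union of per-study modeled activation (MA) maps:
    ALE = 1 - prod_i(1 - MA_i)."""
    ale = np.zeros(shape)
    for foci in studies:
        # One MA map per study: the voxel-wise max across that study's foci.
        ma = np.max([gaussian_map(shape, f, sigma) for f in foci], axis=0)
        ale = 1.0 - (1.0 - ale) * (1.0 - ma)
    return ale

# Toy foci (voxel coordinates) from three hypothetical studies.
studies = [[(10, 10, 10)], [(10, 11, 9), (5, 5, 5)], [(11, 10, 10)]]
print(ale_scores(studies).max())   # highest convergence across studies
```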

  4. Emotional and behavioral reactions to facially deformed patients before and after craniofacial surgery.

    Science.gov (United States)

    Barden, R C; Ford, M E; Wilhelm, W M; Rogers-Salyer, M; Salyer, K E

    1988-09-01

    The present experiment investigated whether observers' emotional and behavioral reactions to facially deformed patients could be substantially improved by surgical procedures conducted by well-trained specialists in an experienced multidisciplinary team. Also investigated was the hypothesis that emotional states mediate the effects of physical attractiveness and facial deformity on social interaction. Twenty patients between the ages of 3 months and 17 years were randomly selected from over 2000 patients' files of Kenneth E. Salyer of Dallas, Texas. Patient diagnoses included facial clefts, hypertelorism, Treacher Collins syndrome, and craniofacial dysostoses (Crouzon's and Apert's syndromes). Rigorously standardized photographs of patients taken before and after surgery were shown to 22 "naive" raters ranging in age from 18 to 54 years. Raters were asked to predict their emotional and behavioral responses to the patients. These ratings indicated that observers' behavioral reactions to facially deformed children and adolescents would be more positive following craniofacial surgery. Similarly, the ratings indicated that observers' emotional reactions to these patients would be more positive following surgery. The results are discussed in terms of current sociopsychologic theoretical models for the effects of attractiveness on social interaction. A new model is presented that implicates induced emotional states as a mediating process in explaining the effects of attractiveness and facial deformity on the quality of social interactions. Limitations of the current investigation and directions for future research are also discussed.

  5. Endogenous testosterone levels are associated with neural activity in men with schizophrenia during facial emotion processing.

    Science.gov (United States)

    Ji, Ellen; Weickert, Cynthia Shannon; Lenroot, Rhoshel; Catts, Stanley V; Vercammen, Ans; White, Christopher; Gur, Raquel E; Weickert, Thomas W

    2015-06-01

    Growing evidence suggests that testosterone may play a role in the pathophysiology of schizophrenia, given that testosterone has been linked to cognition and negative symptoms in schizophrenia. Here, we determine the extent to which serum testosterone levels are related to neural activity in affective processing circuitry in men with schizophrenia. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as 32 healthy controls and 26 people with schizophrenia performed a facial emotion identification task. Whole brain analyses were performed to determine regions of differential activity between groups during processing of angry versus non-threatening faces. A follow-up ROI analysis using a regression model in a subset of 16 healthy men and 16 men with schizophrenia was used to determine the extent to which serum testosterone levels were related to neural activity. Healthy controls displayed significantly greater activation than people with schizophrenia in the left inferior frontal gyrus (IFG). There was no significant difference in circulating testosterone levels between healthy men and men with schizophrenia. Regression analyses between activation in the IFG and circulating testosterone levels revealed a significant positive correlation in men with schizophrenia (r=.63, p=.01) and no significant relationship in healthy men. This study provides the first evidence that circulating serum testosterone levels are related to IFG activation during emotion face processing in men with schizophrenia but not in healthy men. This suggests that testosterone levels modulate neural processes relevant to facial emotion processing that may interfere with social functioning in men with schizophrenia. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.

  6. Deeper than skin deep - The effect of botulinum toxin-A on emotion processing.

    Science.gov (United States)

    Baumeister, J-C; Papa, G; Foroni, F

    2016-08-01

    The effect of facial botulinum Toxin-A (BTX) injections on the processing of emotional stimuli was investigated. The hypothesis, that BTX would interfere with processing of slightly emotional stimuli and less with very emotional or neutral stimuli, was largely confirmed. BTX-users rated slightly emotional sentences and facial expressions, but not very emotional or neutral ones, as less emotional after the treatment. Furthermore, they became slower at categorizing slightly emotional facial expressions under time pressure. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. [Recognition of facial emotions and theory of mind in schizophrenia: could the theory of mind deficit be due to the non-recognition of facial emotions?].

    Science.gov (United States)

    Besche-Richard, C; Bourrin-Tisseron, A; Olivier, M; Cuervo-Lombard, C-V; Limosin, F

    2012-06-01

    Deficits in the recognition of facial emotions and in the attribution of mental states are now well documented in schizophrenic patients. However, the link between these two complex cognitive functions remains unclear, especially in schizophrenia. In this study, we tested the link between the recognition of facial emotions and mentalizing capacities, notably the attribution of beliefs, in healthy and schizophrenic participants. We hypothesized that performance in the recognition of facial emotions, rather than working memory or executive functioning, would be the best predictor of the capacity to attribute beliefs. Twenty clinically stabilized schizophrenic participants diagnosed according to DSM-IV-TR (mean age: 35.9 years, S.D. 9.07; mean education level: 11.15 years, S.D. 2.58), receiving neuroleptic or antipsychotic medication, participated in the study. They were matched on age (mean age: 36.3 years, S.D. 10.9) and educational level (mean educational level: 12.10, S.D. 2.25) with 30 healthy participants. All participants were assessed with a battery of tasks testing the recognition of facial emotions (the Baron-Cohen faces), the attribution of beliefs (two first-order and two second-order stories), working memory (the WAIS-III digit span and the Corsi test) and executive functioning (Trail Making Test A and B, Wisconsin Card Sorting Test, brief version). Comparing schizophrenic and healthy participants, our results confirmed a difference between performance in the recognition of facial emotions and performance in the attribution of beliefs. A simple linear regression showed that recognition of facial emotions, rather than working memory or executive functioning, was the best predictor of performance on the theory of mind stories. Our results confirmed, in a sample of schizophrenic patients, the deficits in the recognition of facial emotions and in the

  8. Seeing Mixed Emotions: The Specificity of Emotion Perception From Static and Dynamic Facial Expressions Across Cultures.

    Science.gov (United States)

    Fang, Xia; Sauter, Disa A; Van Kleef, Gerben A

    2018-01-01

    Although perceivers often agree about the primary emotion that is conveyed by a particular expression, observers may concurrently perceive several additional emotions from a given facial expression. In the present research, we compared the perception of two types of nonintended emotions in Chinese and Dutch observers viewing facial expressions: emotions which were morphologically similar to the intended emotion and emotions which were morphologically dissimilar to the intended emotion. Findings were consistent across two studies and showed that (a) morphologically similar emotions were endorsed to a greater extent than dissimilar emotions and (b) Chinese observers endorsed nonintended emotions more than did Dutch observers. Furthermore, the difference between Chinese and Dutch observers was more pronounced for the endorsement of morphologically similar emotions than of dissimilar emotions. We also obtained consistent evidence that Dutch observers endorsed nonintended emotions that were congruent with the preceding expressions to a greater degree. These findings suggest that culture and morphological similarity both influence the extent to which perceivers see several emotions in a facial expression.

  9. Seeing Mixed Emotions: The Specificity of Emotion Perception From Static and Dynamic Facial Expressions Across Cultures

    Science.gov (United States)

    Fang, Xia; Sauter, Disa A.; Van Kleef, Gerben A.

    2017-01-01

    Although perceivers often agree about the primary emotion that is conveyed by a particular expression, observers may concurrently perceive several additional emotions from a given facial expression. In the present research, we compared the perception of two types of nonintended emotions in Chinese and Dutch observers viewing facial expressions: emotions which were morphologically similar to the intended emotion and emotions which were morphologically dissimilar to the intended emotion. Findings were consistent across two studies and showed that (a) morphologically similar emotions were endorsed to a greater extent than dissimilar emotions and (b) Chinese observers endorsed nonintended emotions more than did Dutch observers. Furthermore, the difference between Chinese and Dutch observers was more pronounced for the endorsement of morphologically similar emotions than of dissimilar emotions. We also obtained consistent evidence that Dutch observers endorsed nonintended emotions that were congruent with the preceding expressions to a greater degree. These findings suggest that culture and morphological similarity both influence the extent to which perceivers see several emotions in a facial expression. PMID:29386689

  10. Laterality of Facial Expressions of Emotion: Universal and Culture-Specific Influences

    Directory of Open Access Journals (Sweden)

    Manas K. Mandal

    2004-01-01

    Full Text Available Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere, and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality, and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals.

  11. Laterality of facial expressions of emotion: Universal and culture-specific influences.

    Science.gov (United States)

    Mandal, Manas K; Ambady, Nalini

    2004-01-01

    Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere, and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality, and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals. Copyright 2004 IOS Press

  12. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery.

    Science.gov (United States)

    Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T

    2012-12-01

    Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize

  13. The Mask of Sanity: Facial Expressive, Self-Reported, and Physiological Consequences of Emotion Regulation in Psychopathic Offenders.

    Science.gov (United States)

    Nentjes, Lieke; Bernstein, David P; Meijer, Ewout; Arntz, Arnoud; Wiers, Reinout W

    2016-12-01

    This study investigated the physiological, self-reported, and facial correlates of emotion regulation in psychopathy. Specifically, we compared psychopathic offenders (n = 42), nonpsychopathic offenders (n = 42), and nonoffender controls (n = 26) in their ability to inhibit and express emotion while watching affective films (fear, happy, and sad). Results showed that all participants were capable of drastically diminishing facial emotions under inhibition instructions. Contrary to expectation, psychopaths were not superior in adopting such a "poker face." Further, the inhibition of emotion was associated with cardiovascular changes, an effect that was also not dependent on psychopathy (or its factors), suggesting emotion inhibition to be an effortful process in psychopaths as well. Interestingly, psychopathic offenders did not differ from nonpsychopaths in the capacity to show content-appropriate facial emotions during the expression condition. Taken together, these data challenge the view that psychopathy is associated with either superior emotional inhibitory capacities or a generalized impairment in showing facial affect.

  14. Sex differences in facial emotion recognition across varying expression intensity levels from videos.

    Science.gov (United States)

    Wingenbach, Tanja S H; Ashwin, Chris; Brosnan, Mark

    2018-01-01

    There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or 'extreme' examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1sec) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations.
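    The unbiased hit rate (Hu) used as the accuracy measure here is, in Wagner's (1993) formulation, the squared number of correct responses for a category divided by the product of the number of stimuli presented in that category and the number of times that category was given as a response. A minimal sketch from a toy confusion matrix (the values are invented):

```python
import numpy as np

def unbiased_hit_rates(confusion):
    """Wagner's (1993) Hu per category from a stimulus-by-response
    confusion matrix: Hu = hits**2 / (row_total * column_total)."""
    confusion = np.asarray(confusion, dtype=float)
    hits = np.diag(confusion)
    stimulus_totals = confusion.sum(axis=1)   # times each emotion was shown
    response_totals = confusion.sum(axis=0)   # times each label was chosen
    return hits**2 / (stimulus_totals * response_totals)

# Toy 3-emotion confusion matrix (rows: emotion shown; columns: label given).
conf = [[8, 1, 1],
        [2, 7, 1],
        [0, 3, 7]]
print(unbiased_hit_rates(conf))   # one Hu value per emotion, in [0, 1]
```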

  15. Sex differences in facial emotion recognition across varying expression intensity levels from videos

    Science.gov (United States)

    2018-01-01

    There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or ‘extreme’ examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1sec) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations. PMID:29293674

  16. Sex differences in facial emotion recognition across varying expression intensity levels from videos.

    Directory of Open Access Journals (Sweden)

    Tanja S H Wingenbach

    Full Text Available There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or 'extreme' examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1 sec) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations.

  17. Attention and memory bias to facial emotions underlying negative symptoms of schizophrenia.

    Science.gov (United States)

    Jang, Seon-Kyeong; Park, Seon-Cheol; Lee, Seung-Hwan; Cho, Yang Seok; Choi, Kee-Hong

    2016-01-01

    This study assessed bias in selective attention to facial emotions in negative symptoms of schizophrenia and its influence on subsequent memory for facial emotions. Thirty people with schizophrenia who had high or low levels of negative symptoms (n = 15 each) and 21 healthy controls completed a visual probe detection task investigating selective attention bias (happy, sad, and angry faces randomly presented for 50, 500, or 1000 ms). A yes/no incidental facial memory task was then completed. Attention bias scores and recognition errors were calculated. Those with high negative symptoms exhibited reduced attention to emotional faces relative to neutral faces; those with low negative symptoms showed the opposite pattern when faces were presented for 500 ms, regardless of valence. Compared to healthy controls, those with high negative symptoms made more errors for happy faces in the memory task. Reduced attention to emotional faces in the probe detection task was significantly associated with less pleasure and motivation and more recognition errors for happy faces in the schizophrenia group only. Attention bias away from emotional information relatively early in the attentional process, and the associated diminished positive memory, may relate to pathological mechanisms of negative symptoms.
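    In probe detection paradigms of this kind, an attention bias score is conventionally computed as the mean reaction time when the probe replaces the neutral face minus the mean reaction time when it replaces the emotional face, so that positive values indicate vigilance toward emotion. The sketch below assumes that convention and uses hypothetical trial data; the study's exact scoring may differ.

```python
import numpy as np

def attention_bias(rt_ms, probe_at_emotional):
    """Bias = mean RT(probe at neutral location) - mean RT(probe at emotional
    location). Positive -> attention drawn to the emotional face;
    negative -> attention directed away from it."""
    rt = np.asarray(rt_ms, dtype=float)
    at_emotional = np.asarray(probe_at_emotional, dtype=bool)
    return rt[~at_emotional].mean() - rt[at_emotional].mean()

# Hypothetical RTs (ms) and trial coding for one participant, one condition.
rts = [512, 498, 530, 470, 505, 488]
probe_replaced_emotional_face = [True, False, True, True, False, False]
print(f"bias score = {attention_bias(rts, probe_replaced_emotional_face):.1f} ms")
```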

  18. Developmental differences in the neural mechanisms of facial emotion labeling.

    Science.gov (United States)

    Wiggins, Jillian Lee; Adleman, Nancy E; Kim, Pilyoung; Oakes, Allison H; Hsu, Derek; Reynolds, Richard C; Chen, Gang; Pine, Daniel S; Brotman, Melissa A; Leibenluft, Ellen

    2016-01-01

    Adolescence is a time of increased risk for the onset of psychological disorders associated with deficits in face emotion labeling. We used functional magnetic resonance imaging (fMRI) to examine age-related differences in brain activation while adolescents and adults labeled the emotion on fearful, happy and angry faces of varying intensities [0% (i.e. neutral), 50%, 75%, 100%]. Adolescents and adults did not differ on accuracy to label emotions. In the superior temporal sulcus, ventrolateral prefrontal cortex and middle temporal gyrus, adults show an inverted-U-shaped response to increasing intensities of fearful faces and a U-shaped response to increasing intensities of happy faces, whereas adolescents show the opposite patterns. In addition, adults, but not adolescents, show greater inferior occipital gyrus activation to negative (angry, fearful) vs positive (happy) emotions. In sum, when subjects classify subtly varying facial emotions, developmental differences manifest in several 'ventral stream' brain regions. Charting the typical developmental course of the brain mechanisms of socioemotional processes, such as facial emotion labeling, is an important focus for developmental psychopathology research. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  19. Automatic Emotional State Detection using Facial Expression Dynamic in Videos

    Directory of Open Access Journals (Sweden)

    Hongying Meng

    2014-11-01

    Full Text Available In this paper, an automatic emotion detection system is built that enables a computer or machine to detect emotional states from facial expressions in human-computer communication. First, dynamic motion features are extracted from facial expression videos; advanced machine learning methods for classification and regression are then used to predict the emotional states. The system is evaluated on two publicly available datasets, GEMEP_FERA and AVEC2013, and satisfactory performance is achieved in comparison with the provided baseline results. With this emotional state detection capability, a machine can automatically read the facial expressions of its user. This technique can be integrated into applications such as smart robots, interactive games and smart surveillance systems.
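    As a sketch of the kind of pipeline described (per-video motion features followed by supervised prediction of emotional state), the snippet below trains a support-vector classifier on precomputed feature vectors with scikit-learn. The feature matrix and labels are random placeholders standing in for descriptors extracted from GEMEP_FERA- or AVEC2013-style videos; this is not the authors' implementation.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: one row of motion-descriptor features per video clip,
# one emotion label per clip (real features would come from the videos).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))          # 200 clips x 64 motion features
y = rng.integers(0, 5, size=200)        # 5 hypothetical emotion classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Standardize the features, then fit an RBF-kernel SVM classifier.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")  # ~chance on noise
```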

  20. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Science.gov (United States)

    Kirkham, Alexander J; Hayes, Amy E; Pawling, Ralph; Tipper, Steven P

    2015-01-01

    This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  1. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Directory of Open Access Journals (Sweden)

    Alexander J Kirkham

    Full Text Available This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  2. Dissociation in Rating Negative Facial Emotions between Behavioral Variant Frontotemporal Dementia and Major Depressive Disorder.

    Science.gov (United States)

    Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    2016-11-01

    Features of behavioral variant frontotemporal dementia (bvFTD) such as executive dysfunction, apathy, and impaired empathic abilities are also observed in major depressive disorder (MDD). This may contribute to the reason why early-stage bvFTD is often misdiagnosed as MDD. New assessment tools are thus needed to improve early diagnosis of bvFTD. Although emotion processing is affected in bvFTD and MDD, growing evidence indicates that the pattern of emotion processing deficits varies between the two disorders. As such, emotion processing paradigms have substantial potential to distinguish bvFTD from MDD. The current study compared 25 patients with bvFTD, 21 patients with MDD, 21 patients with Alzheimer disease (AD) dementia, and 31 healthy participants on a novel facial emotion intensity rating task. Stimuli comprised morphed faces from the Ekman and Friesen stimulus set containing faces of each sex with two different degrees of emotion intensity for each of the six basic emotions. Analyses of covariance uncovered a significant dissociation between bvFTD and MDD patients in rating the intensity of negative emotions overall (i.e., bvFTD patients underrated negative emotions overall, whereas MDD patients overrated negative emotions overall compared with healthy participants). In contrast, AD dementia patients rated negative emotions similarly to healthy participants, suggesting no impact of cognitive deficits on rating facial emotions. By strongly differentiating bvFTD and MDD patients through negative facial emotions, this sensitive and short rating task might help improve the early diagnosis of bvFTD. Copyright © 2016 American Association for Geriatric Psychiatry. All rights reserved.

  3. Warsaw set of emotional facial expression pictures: a validation study of facial display photographs

    NARCIS (Netherlands)

    Olszanowski, M.; Pochwatko, G.; Kuklinski, K.; Scibor-Rylski, M.; Lewinski, P.; Ohme, R.K.

    2015-01-01

    Emotional facial expressions play a critical role in theories of emotion and figure prominently in research on almost every aspect of emotion. This article provides a background for a new database of basic emotional expressions. The goal in creating this set was to provide high quality photographs

  4. Facial emotion recognition in patients with focal and diffuse axonal injury.

    Science.gov (United States)

    Yassin, Walid; Callahan, Brandy L; Ubukata, Shiho; Sugihara, Genichi; Murai, Toshiya; Ueda, Keita

    2017-01-01

    Facial emotion recognition impairment has been well documented in patients with traumatic brain injury. Studies exploring the neural substrates involved in such deficits have implicated specific grey matter structures (e.g. orbitofrontal regions), as well as diffuse white matter damage. Our study aims to clarify whether different types of injuries (i.e. focal vs. diffuse) will lead to different types of impairments on facial emotion recognition tasks, as no study has directly compared these patients. The present study examined performance and response patterns on a facial emotion recognition task in 14 participants with diffuse axonal injury (DAI), 14 with focal injury (FI) and 22 healthy controls. We found that, overall, participants with FI and DAI performed more poorly than controls on the facial emotion recognition task. Further, we observed comparable emotion recognition performance in participants with FI and DAI, despite differences in the nature and distribution of their lesions. However, the rating response pattern between the patient groups was different. This is the first study to show that pure DAI, without gross focal lesions, can independently lead to facial emotion recognition deficits and that rating patterns differ depending on the type and location of trauma.

  5. Dissociation between facial and bodily expressions in emotion recognition: A case study.

    Science.gov (United States)

    Leiva, Samanta; Margulis, Laura; Micciulli, Andrea; Ferreres, Aldo

    2017-12-21

    Existing single-case studies have reported deficits in recognizing basic emotions through facial expressions alongside unaffected performance with body expressions, but not the opposite pattern. The aim of this paper is to present a case study with impaired emotion recognition through body expressions and intact performance with facial expressions. In this single-case study we assessed a 30-year-old patient with autism spectrum disorder, without intellectual disability, and a healthy control group (n = 30) with four tasks of basic and complex emotion recognition through face and body movements, and two non-emotional control tasks. To analyze the dissociation between facial and body expressions, we used Crawford and Garthwaite's operational criteria, and we compared the patient's performance with that of the control group using a modified one-tailed t-test designed specifically for single-case studies. There were no statistically significant differences between the patient's and the control group's performances on the non-emotional body movement task or the facial perception task. For both kinds of emotions (basic and complex), when the patient's performance was compared to the control group's, statistically significant differences were observed only for the recognition of body expressions. There were no significant differences between the patient's and the control group's correct answers for emotional facial stimuli. Our results showed a profile of impaired emotion recognition through body expressions and intact performance with facial expressions. This is the first case study that describes the existence of this kind of dissociation pattern between facial and body expressions of basic and complex emotions.
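    The modified one-tailed t-test referred to here follows the logic of Crawford and Howell's (1998) method for comparing a single case with a small control sample, which treats the control mean and standard deviation as sample estimates: t = (x - M) / (S * sqrt((N + 1) / N)) with N - 1 degrees of freedom. A minimal sketch, assuming only a patient score and a vector of control scores (the values shown are invented):

```python
import numpy as np
from scipy import stats

def crawford_howell_t(case_score, control_scores):
    """Crawford & Howell (1998) test of whether a single case's score falls
    below a control sample: t = (x - M) / (S * sqrt((N + 1) / N))."""
    controls = np.asarray(control_scores, dtype=float)
    n = controls.size
    m, s = controls.mean(), controls.std(ddof=1)
    t = (case_score - m) / (s * np.sqrt((n + 1) / n))
    p_one_tailed = stats.t.cdf(t, df=n - 1)   # deficit hypothesis: lower tail
    return t, p_one_tailed

# Hypothetical accuracy scores: one patient versus 30 controls.
rng = np.random.default_rng(1)
controls = rng.normal(loc=0.85, scale=0.05, size=30)
t, p = crawford_howell_t(0.60, controls)
print(f"t = {t:.2f}, one-tailed p = {p:.4f}")
```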

  6. The role of encoding and attention in facial emotion memory: an EEG investigation.

    Science.gov (United States)

    Brenner, Colleen A; Rumak, Samuel P; Burns, Amy M N; Kieffaber, Paul D

    2014-09-01

    Facial expressions are encoded via sensory mechanisms, but meaning extraction and salience of these expressions involve cognitive functions. We investigated the time course of sensory encoding and subsequent maintenance in memory via EEG. Twenty-nine healthy participants completed a facial emotion delayed match-to-sample task. P100, N170 and N250 ERPs were measured in response to the first stimulus, and evoked theta power (4-7Hz) was measured during the delay interval. Negative facial expressions produced larger N170 amplitudes and greater theta power early in the delay. N170 amplitude correlated with theta power, however larger N170 amplitude coupled with greater theta power only predicted behavioural performance for one emotion condition (very happy) out of six tested (see Supplemental Data). These findings indicate that the N170 ERP may be sensitive to emotional facial expressions when task demands require encoding and retention of this information. Furthermore, sustained theta activity may represent continued attentional processing that supports short-term memory, especially of negative facial stimuli. Further study is needed to investigate the potential influence of these measures, and their interaction, on behavioural performance. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
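    Evoked theta power of the kind measured here (4-7 Hz during the delay interval) can be approximated by averaging the epochs first (which is what makes the activity 'evoked' rather than induced), band-pass filtering the average, and taking the squared Hilbert envelope. The sketch below uses synthetic data; the sampling rate and filter order are assumptions, not the study's recording parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def evoked_theta_power(epochs, fs, band=(4.0, 7.0)):
    """Average epochs first (evoked activity), band-pass to theta,
    then return instantaneous power via the squared Hilbert envelope."""
    evoked = np.mean(epochs, axis=0)                 # trials x samples -> samples
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    theta = filtfilt(b, a, evoked)                   # zero-phase band-pass
    return np.abs(hilbert(theta)) ** 2               # power over time

# Synthetic data: 40 trials of a 1-s delay interval sampled at 250 Hz,
# containing a 5 Hz component plus noise.
fs, t = 250, np.arange(0, 1.0, 1 / 250)
rng = np.random.default_rng(2)
epochs = np.sin(2 * np.pi * 5 * t) + rng.normal(scale=1.0, size=(40, t.size))
print(evoked_theta_power(epochs, fs).mean())
```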

  7. Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder

    Science.gov (United States)

    Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.

    2015-01-01

    Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum…

  8. The impact of high trait social anxiety on neural processing of facial emotion expressions in females.

    Science.gov (United States)

    Felmingham, Kim L; Stewart, Laura F; Kemp, Andrew H; Carr, Andrea R

    2016-05-01

    A cognitive model of social anxiety predicts that an early attentional bias leads to greater cognitive processing of social threat signals, whereas the vigilance-avoidance model predicts a subsequent reduction in cognitive processing. This study tested these models by examining neural responses to social threat stimuli using event-related potentials (ERPs). Nineteen women with high trait social anxiety and 19 women with low trait social anxiety viewed emotional expressions (angry, disgusted, happy and neutral) in a passive viewing task whilst ERP responses were recorded. The HSA group revealed greater automatic attention, or hypervigilance, to all facial expressions, as indexed by greater N1 amplitude compared to the LSA group. They also showed greater sustained attention and elaborative processing of all facial expressions, indexed by significantly increased P2 and P3 amplitudes compared to the LSA group. These results support cognitive models of social anxiety but are not consistent with the predictions of the vigilance-avoidance model. Copyright © 2016. Published by Elsevier B.V.

  9. Age and gender modulate the neural circuitry supporting facial emotion processing in adults with major depressive disorder.

    Science.gov (United States)

    Briceño, Emily M; Rapport, Lisa J; Kassel, Michelle T; Bieliauskas, Linas A; Zubieta, Jon-Kar; Weisenbach, Sara L; Langenecker, Scott A

    2015-03-01

    Emotion processing, supported by frontolimbic circuitry known to be sensitive to the effects of aging, is a relatively understudied cognitive-emotional domain in geriatric depression. Some evidence suggests that the neurophysiological disruption observed in emotion processing among adults with major depressive disorder (MDD) may be modulated by both gender and age. Therefore, the present study investigated the effects of gender and age on the neural circuitry supporting emotion processing in MDD. Cross-sectional comparison of fMRI signal during performance of an emotion processing task. Outpatient university setting. One hundred adults recruited by MDD status, gender, and age. Participants underwent fMRI while completing the Facial Emotion Perception Test. They viewed photographs of faces and categorized the emotion perceived. The fMRI contrast was face perception minus animal identification blocks. Effects of depression were observed in the precuneus and effects of age in a number of frontolimbic regions. Three-way interactions were present between MDD status, gender, and age in regions pertinent to emotion processing, including frontal, limbic, and basal ganglia regions. Young women with MDD and older men with MDD exhibited hyperactivation in these regions compared with their respective same-gender healthy comparison (HC) counterparts. In contrast, older women and younger men with MDD exhibited hypoactivation compared to their respective same-gender HC counterparts. This is the first study to report gender- and age-specific differences in emotion processing circuitry in MDD. Gender-differential mechanisms may underlie cognitive-emotional disruption in older adults with MDD. The present findings have implications for improved probes into the heterogeneity of the MDD syndrome. Copyright © 2015 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  10. The Emotional Modulation of Facial Mimicry: A Kinematic Study

    Directory of Open Access Journals (Sweden)

    Antonella Tramacere

    2018-01-01

    Full Text Available It is well-established that the observation of emotional facial expression induces facial mimicry responses in the observers. However, how the interaction between emotional and motor components of facial expressions can modulate the motor behavior of the perceiver is still unknown. We have developed a kinematic experiment to evaluate the effect of different oro-facial expressions on perceiver's face movements. Participants were asked to perform two movements, i.e., lip stretching and lip protrusion, in response to the observation of four meaningful (i.e., smile, angry-mouth, kiss, and spit) and two meaningless mouth gestures. All the stimuli were characterized by different motor patterns (mouth aperture or mouth closure). Response Times and kinematics parameters of the movements (amplitude, duration, and mean velocity) were recorded and analyzed. Results evidenced a dissociated effect on reaction times and movement kinematics. We found shorter reaction time when a mouth movement was preceded by the observation of a meaningful and motorically congruent oro-facial gesture, in line with facial mimicry effect. On the contrary, during execution, the perception of smile was associated with the facilitation, in terms of shorter duration and higher velocity of the incongruent movement, i.e., lip protrusion. The same effect resulted in response to kiss and spit that significantly facilitated the execution of lip stretching. We called this phenomenon facial mimicry reversal effect, intended as the overturning of the effect normally observed during facial mimicry. In general, the findings show that both motor features and types of emotional oro-facial gestures (conveying positive or negative valence) affect the kinematics of subsequent mouth movements at different levels: while congruent motor features facilitate a general motor response, motor execution could be speeded by gestures that are motorically incongruent with the observed one. Moreover, valence

  11. The Emotional Modulation of Facial Mimicry: A Kinematic Study.

    Science.gov (United States)

    Tramacere, Antonella; Ferrari, Pier F; Gentilucci, Maurizio; Giuffrida, Valeria; De Marco, Doriana

    2017-01-01

    It is well-established that the observation of emotional facial expression induces facial mimicry responses in the observers. However, how the interaction between emotional and motor components of facial expressions can modulate the motor behavior of the perceiver is still unknown. We have developed a kinematic experiment to evaluate the effect of different oro-facial expressions on perceiver's face movements. Participants were asked to perform two movements, i.e., lip stretching and lip protrusion, in response to the observation of four meaningful (i.e., smile, angry-mouth, kiss, and spit) and two meaningless mouth gestures. All the stimuli were characterized by different motor patterns (mouth aperture or mouth closure). Response Times and kinematics parameters of the movements (amplitude, duration, and mean velocity) were recorded and analyzed. Results evidenced a dissociated effect on reaction times and movement kinematics. We found shorter reaction time when a mouth movement was preceded by the observation of a meaningful and motorically congruent oro-facial gesture, in line with facial mimicry effect. On the contrary, during execution, the perception of smile was associated with the facilitation, in terms of shorter duration and higher velocity of the incongruent movement, i.e., lip protrusion. The same effect resulted in response to kiss and spit that significantly facilitated the execution of lip stretching. We called this phenomenon facial mimicry reversal effect, intended as the overturning of the effect normally observed during facial mimicry. In general, the findings show that both motor features and types of emotional oro-facial gestures (conveying positive or negative valence) affect the kinematics of subsequent mouth movements at different levels: while congruent motor features facilitate a general motor response, motor execution could be speeded by gestures that are motorically incongruent with the observed one. Moreover, valence effect depends on

  12. Gaze Dynamics in the Recognition of Facial Expressions of Emotion.

    Science.gov (United States)

    Barabanschikov, Vladimir A

    2015-01-01

    We studied which parts and features of the human face are preferentially fixated during the recognition of facial expressions of emotion. Photographs of facial expressions were used. Participants categorized these as basic emotions while their eye movements were recorded. Variation in the intensity of an expression was mirrored in the accuracy of emotion recognition; it was also reflected in several indices of oculomotor function: the duration of inspection of certain areas of the face (its upper and lower parts, right and left sides), the location, number and duration of fixations, and the viewing trajectory. In particular, for low-intensity expressions the right side of the face was attended to predominantly (right-side dominance); this right-side dominance effect was, however, absent for expressions of high intensity. For both low- and high-intensity expressions the upper part of the face was predominantly fixated, with greater fixation for high-intensity expressions. The majority of trials (70%), in line with findings of previous studies, revealed a V-shaped inspection trajectory. No relationship was found, however, between the accuracy of recognition of emotional expressions and either the location and duration of fixations or the pattern of gaze direction across the face. © The Author(s) 2015.
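    Indices such as the duration of inspection of particular face regions reduce to aggregating fixations over areas of interest (AOIs). A minimal sketch, assuming rectangular AOIs and fixation records as (x, y, duration) tuples; the coordinates and regions are hypothetical:

```python
from collections import defaultdict

# Hypothetical rectangular areas of interest on a face image:
# name -> (x_min, y_min, x_max, y_max) in pixels.
AOIS = {
    "upper_face": (100, 50, 300, 180),
    "lower_face": (100, 180, 300, 320),
}

def dwell_times(fixations, aois=AOIS):
    """Sum fixation durations and counts per AOI.
    fixations: iterable of (x, y, duration_ms) tuples."""
    totals = defaultdict(lambda: {"duration_ms": 0, "count": 0})
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name]["duration_ms"] += dur
                totals[name]["count"] += 1
    return dict(totals)

# Toy fixation sequence for one trial.
fix = [(150, 100, 220), (210, 120, 180), (180, 250, 260)]
print(dwell_times(fix))
```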

  13. Common impairments of emotional facial expression recognition in schizophrenia across French and Japanese cultures

    Directory of Open Access Journals (Sweden)

    Takashi eOkada

    2015-07-01

    Full Text Available To address whether the recognition of emotional facial expressions is impaired in schizophrenia across different cultures, patients with schizophrenia and age-matched normal controls in France and Japan were tested with a labeling task of emotional facial expressions and a matching task of unfamiliar faces. Schizophrenia patients in both France and Japan were less accurate in labeling fearful facial expressions. There was no correlation between the scores of facial emotion labeling and face matching. These results suggest that the impaired recognition of emotional facial expressions in schizophrenia is common across different cultures.

  14. Perceptions of variability in facial emotion influence beliefs about the stability of psychological characteristics.

    Science.gov (United States)

    Weisbuch, Max; Grunberg, Rebecca L; Slepian, Michael L; Ambady, Nalini

    2016-10-01

    Beliefs about the malleability versus stability of traits (incremental vs. entity lay theories) have a profound impact on social cognition and self-regulation, shaping phenomena that range from the fundamental attribution error and group-based stereotyping to academic motivation and achievement. Less is known about the causes than the effects of these lay theories, and in the current work the authors examine the perception of facial emotion as a causal influence on lay theories. Specifically, they hypothesized that (a) within-person variability in facial emotion signals within-person variability in traits and (b) social environments replete with within-person variability in facial emotion encourage perceivers to endorse incremental lay theories. Consistent with Hypothesis 1, Study 1 participants were more likely to attribute dynamic (vs. stable) traits to a person who exhibited several different facial emotions than to a person who exhibited a single facial emotion across multiple images. Hypothesis 2 suggests that social environments support incremental lay theories to the extent that they include many people who exhibit within-person variability in facial emotion. Consistent with Hypothesis 2, participants in Studies 2-4 were more likely to endorse incremental theories of personality, intelligence, and morality after exposure to multiple individuals exhibiting within-person variability in facial emotion than after exposure to multiple individuals exhibiting a single emotion several times. Perceptions of within-person variability in facial emotion-rather than perceptions of simple diversity in facial emotion-were responsible for these effects. Discussion focuses on how social ecologies shape lay theories. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  15. Do you see what I see? Sex differences in the discrimination of facial emotions during adolescence.

    Science.gov (United States)

    Lee, Nikki C; Krabbendam, Lydia; White, Thomas P; Meeter, Martijn; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Büchel, Christian; Conrod, Patricia; Flor, Herta; Frouin, Vincent; Heinz, Andreas; Garavan, Hugh; Gowland, Penny; Ittermann, Bernd; Mann, Karl; Paillère Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Rietschel, Marcella; Robbins, Trevor; Fauth-Bühler, Mira; Smolka, Michael N; Gallinat, Juergen; Schumann, Gunther; Shergill, Sukhi S

    2013-12-01

    During adolescence social relationships become increasingly important. Establishing and maintaining these relationships requires understanding of emotional stimuli, such as facial emotions. A failure to adequately interpret emotional facial expressions has previously been associated with various mental disorders that emerge during adolescence. The current study examined sex differences in emotional face processing during adolescence. Participants were adolescents (n = 1951) with a target age of 14, who completed a forced-choice emotion discrimination task. The stimuli used comprised morphed faces that contained a blend of two emotions in varying intensities (11 stimuli per set of emotions). Adolescent girls showed faster and more sensitive perception of facial emotions than boys. However, both adolescent boys and girls were most sensitive to variations in emotion intensity in faces combining happiness and sadness, and least sensitive to changes in faces comprising fear and anger. Furthermore, both sexes overidentified happiness and anger. However, the overidentification of happiness was stronger in boys. These findings were not influenced by individual differences in the level of pubertal maturation. These results indicate that male and female adolescents differ in their ability to identify emotions in morphed faces containing emotional blends. The findings provide information for clinical studies examining whether sex differences in emotional processing are related to sex differences in the prevalence of psychiatric disorders within this age group.

  16. Gender differences in emotion experience perception under different facial muscle manipulations.

    Science.gov (United States)

    Wang, Yufeng; Zhang, Dongjun; Zou, Feng; Li, Hao; Luo, Yanyan; Zhang, Meng; Liu, Yijun

    2016-04-01

    According to embodied emotion theory, facial manipulations should modulate and initiate particular emotions. However, it is not clear whether there are gender differences in the perception of emotional experience under different facial muscle manipulations. We therefore conducted two behavioral experiments to examine gender differences in emotional perception in response to facial expressions (sad, neutral, and happy) under three conditions: (1) holding a pen using only the teeth (HPT), which facilitates the muscles typically associated with smiling; (2) holding a pen using only the lips (HPL), which inhibits the muscles typically associated with smiling; and (3) a control condition of holding no pen (HNP). We found that HPT made emotional ratings more positive, and that the change in females' ratings of sad facial expressions between conditions (HPL to HPT) was larger than males'. These results suggest that cognition can be affected by the interaction between stimulus and body, especially in females. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Enhanced embodied response following ambiguous emotional processing.

    Science.gov (United States)

    Beffara, Brice; Ouellet, Marc; Vermeulen, Nicolas; Basu, Anamitra; Morisseau, Tiffany; Mermillod, Martial

    2012-08-01

    It has generally been assumed that high-level cognitive and emotional processes are based on amodal conceptual information. In contrast, however, "embodied simulation" theory states that the perception of an emotional signal can trigger a simulation of the related state in the motor, somatosensory, and affective systems. To study the effect of social context on the mimicry effect predicted by the "embodied simulation" theory, we recorded the electromyographic (EMG) activity of participants when looking at emotional facial expressions. We observed an increase in embodied responses when the participants were exposed to a context involving social valence before seeing the emotional facial expressions. An examination of the dynamic EMG activity induced by two socially relevant emotional expressions (namely joy and anger) revealed enhanced EMG responses of the facial muscles associated with the related social prime (either positive or negative). These results are discussed within the general framework of embodiment theory.

  18. Gender and the capacity to identify facial emotional expressions

    Directory of Open Access Journals (Sweden)

    Carolina Baptista Menezes

    Full Text Available Recognizing emotional expressions is enabled by a fundamental sociocognitive mechanism of human nature. This study compared 114 women and 104 men on the identification of basic emotions in a recognition task that used faces culturally adapted and validated for the Brazilian context. It was also investigated whether gender differences in emotion recognition would vary according to different exposure times. Women were generally better at detecting facial expressions, but an interaction suggested that the female superiority was particularly pronounced for anger, disgust, and surprise; results did not change according to age or exposure time. However, regardless of sex, total accuracy improved as presentation times increased, although only fear and anger significantly differed between the presentation times. Hence, in addition to supporting the evolutionary hypothesis of female superiority in detecting facial expressions of emotion, the results show that recognition of facial expressions also depends on the time available to correctly identify an expression.

  19. Realistic prediction of individual facial emotion expressions for craniofacial surgery simulations

    Science.gov (United States)

    Gladilin, Evgeny; Zachow, Stefan; Deuflhard, Peter; Hege, Hans-Christian

    2003-05-01

    In addition to static soft tissue prediction, the estimation of individual facial emotion expressions is an important criterion for evaluating craniofacial surgery planning. In this paper, we present an approach for estimating individual facial emotion expressions on the basis of geometrical models of human anatomy derived from tomographic data and finite element modeling of facial tissue biomechanics.

  20. Unobtrusive multimodal emotion detection in adaptive interfaces: speech and facial expressions

    NARCIS (Netherlands)

    Truong, K.P.; Leeuwen, D.A. van; Neerincx, M.A.

    2007-01-01

    Two unobtrusive modalities for automatic emotion recognition are discussed: speech and facial expressions. First, an overview is given of emotion recognition studies based on a combination of speech and facial expressions. We will identify difficulties concerning data collection, data fusion, system

  1. Emotion and Object Processing in Parkinson's Disease

    Science.gov (United States)

    Cohen, Henri; Gagne, Marie-Helene; Hess, Ursula; Pourcher, Emmanuelle

    2010-01-01

    The neuropsychological literature on the processing of emotions in Parkinson's disease (PD) reveals conflicting evidence about the role of the basal ganglia in the recognition of facial emotions. Hence, the present study had two objectives. One was to determine the extent to which the visual processing of emotions and objects differs in PD. The…

  2. Capturing Physiology of Emotion along Facial Muscles: A Method of Distinguishing Feigned from Involuntary Expressions

    Science.gov (United States)

    Khan, Masood Mehmood; Ward, Robert D.; Ingleby, Michael

    The ability to distinguish feigned from involuntary expressions of emotions could help in the investigation and treatment of neuropsychiatric and affective disorders and in the detection of malingering. This work investigates differences in emotion-specific patterns of thermal variations along the major facial muscles. Using experimental data extracted from 156 images, we attempted to classify patterns of emotion-specific thermal variations into neutral, and voluntary and involuntary expressions of positive and negative emotive states. Initial results suggest (i) each facial muscle exhibits a unique thermal response to various emotive states; (ii) the pattern of thermal variances along the facial muscles may assist in classifying voluntary and involuntary facial expressions; and (iii) facial skin temperature measurements along the major facial muscles may be used in automated emotion assessment.

  3. Face processing in chronic alcoholism: a specific deficit for emotional features.

    Science.gov (United States)

    Maurage, P; Campanella, S; Philippot, P; Martin, S; de Timary, P

    2008-04-01

    It is well established that chronic alcoholism is associated with a deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specifically for emotions or due to a more general impairment in visual or facial processing. This study was designed to clarify this issue using multiple control tasks and the subtraction method. Eighteen patients suffering from chronic alcoholism and 18 matched healthy control subjects were asked to perform several tasks evaluating (1) Basic visuo-spatial and facial identity processing; (2) Simple reaction times; (3) Complex facial features identification (namely age, emotion, gender, and race). Accuracy and reaction times were recorded. Alcoholic patients had a preserved performance for visuo-spatial and facial identity processing, but their performance was impaired for visuo-motor abilities and for the detection of complex facial aspects. More importantly, the subtraction method showed that alcoholism is associated with a specific EFE decoding deficit, still present when visuo-motor slowing down is controlled for. These results offer a post hoc confirmation of earlier data showing an EFE decoding deficit in alcoholism by strongly suggesting a specificity of this deficit for emotions. This may have implications for clinical situations, where emotional impairments are frequently observed among alcoholic subjects.
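
    The subtraction method used here isolates the emotion-specific deficit by removing the general visuo-motor slowing measured in control tasks. A minimal Python sketch of this logic, assuming per-participant mean reaction times are already available (all values below are illustrative):

```python
import numpy as np
from scipy import stats

def efe_specific_cost(emotion_rt, control_rt):
    """Subtraction method: remove general visuo-motor slowing.

    emotion_rt, control_rt : per-participant mean reaction times (ms)
    for the emotion-decoding task and for a control task sharing its
    perceptual and motor demands; the residual is attributed to
    emotional facial expression (EFE) decoding itself.
    """
    return np.asarray(emotion_rt, float) - np.asarray(control_rt, float)

# Compare the residual cost between groups (illustrative numbers):
patients = efe_specific_cost([980, 1040, 1110], [620, 650, 700])
controls = efe_specific_cost([760, 810, 790], [600, 640, 615])
t, p = stats.ttest_ind(patients, controls)
```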

  4. Common cues to emotion in the dynamic facial expressions of speech and song.

    Science.gov (United States)

    Livingstone, Steven R; Thompson, William F; Wanderley, Marcelo M; Palmer, Caroline

    2015-01-01

    Speech and song are universal forms of vocalization that may share aspects of emotional expression. Research has focused on parallels in acoustic features, overlooking facial cues to emotion. In three experiments, we compared moving facial expressions in speech and song. In Experiment 1, vocalists spoke and sang statements, each with five emotions. Vocalists exhibited emotion-dependent movements of the eyebrows and lip corners that transcended speech-song differences. Vocalists' jaw movements were coupled to their acoustic intensity, exhibiting differences across emotion and speech-song. Vocalists' emotional movements extended beyond vocal sound to include large sustained expressions, suggesting a communicative function. In Experiment 2, viewers judged silent videos of vocalists' facial expressions prior to, during, and following vocalization. Emotional intentions were identified accurately for movements during and after vocalization, suggesting that these movements support the acoustic message. Experiment 3 compared emotional identification in voice-only, face-only, and face-and-voice recordings. Emotions were poorly identified from voice-only singing, yet accurately identified in all other conditions, confirming that facial expressions conveyed emotion more accurately than the voice in song, yet equivalently in speech. Collectively, these findings highlight broad commonalities in the facial cues to emotion in speech and song, yet highlight differences in perception and acoustic-motor production.

  5. Coherence explored between emotion components: evidence from event-related potentials and facial electromyography.

    Science.gov (United States)

    Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R

    2014-04-01

    Componential theories assume that emotion episodes consist of emergent and dynamic response changes to relevant events in different components, such as appraisal, physiology, motivation, expression, and subjective feeling. In particular, Scherer's Component Process Model hypothesizes that subjective feeling emerges when the synchronization (or coherence) of appraisal-driven changes between emotion components has reached a critical threshold. We examined the prerequisite of this synchronization hypothesis for appraisal-driven response changes in facial expression. The appraisal process was manipulated by using feedback stimuli, presented in a gambling task. Participants' responses to the feedback were investigated in concurrently recorded brain activity related to appraisal (event-related potentials, ERP) and facial muscle activity (electromyography, EMG). Using principal component analysis, the prediction of appraisal-driven response changes in facial EMG was examined. Results support this prediction: early cognitive processes (related to the feedback-related negativity) seem to primarily affect the upper face, whereas processes that modulate P300 amplitudes tend to predominantly drive cheek region responses. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Development and validation of an Argentine set of facial expressions of emotion.

    Science.gov (United States)

    Vaiman, Marcelo; Wagner, Mónica Anna; Caicedo, Estefanía; Pereno, Germán Leandro

    2017-02-01

    Pictures of facial expressions of emotion are used in a wide range of experiments. The last decade has seen an increase in the number of studies presenting local sets of emotion stimuli. However, only a few existing sets contain pictures of Latin Americans, despite the growing attention emotion research is receiving in this region. Here we present the development and validation of the Universidad Nacional de Cordoba, Expresiones de Emociones Faciales (UNCEEF), a Facial Action Coding System (FACS)-verified set of pictures of Argentineans expressing the six basic emotions, plus neutral expressions. FACS scores, recognition rates, Hu scores, and discrimination indices are reported. Evidence of convergent validity was obtained using the Pictures of Facial Affect in an Argentine sample. However, recognition accuracy was greater for UNCEEF. The importance of local sets of emotion pictures is discussed.
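
    The Hu scores reported for stimulus validation are usually Wagner's (1993) unbiased hit rates, which correct raw recognition rates for response biases. A short Python sketch of that computation from a confusion matrix, assuming rows index the intended emotion and columns the label chosen by raters:

```python
import numpy as np

def unbiased_hit_rates(confusion):
    """Wagner's (1993) unbiased hit rate (Hu) per emotion category.

    confusion : (k, k) matrix; rows = intended emotion of the picture,
    columns = emotion label chosen by raters. Hu squares the hits and
    divides by the product of the row and column totals, so both miss
    rates and response biases lower the score.
    """
    confusion = np.asarray(confusion, dtype=float)
    hits = np.diag(confusion)
    row_totals = confusion.sum(axis=1)   # pictures shown per category
    col_totals = confusion.sum(axis=0)   # responses given per category
    return hits ** 2 / (row_totals * col_totals)
```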

  7. Feature Fusion Algorithm for Multimodal Emotion Recognition from Speech and Facial Expression Signal

    Directory of Open Access Journals (Sweden)

    Han Zhiyan

    2016-01-01

    Full Text Available To overcome the limitations of single-mode emotion recognition, this paper describes a novel multimodal emotion recognition algorithm that takes the speech signal and the facial expression signal as its research subjects. First, the speech and facial expression features are fused, sample sets are obtained by sampling with replacement, and classifiers are then obtained by training BP neural networks (BPNN). Second, the difference between two classifiers is measured with a double error difference selection strategy. Finally, the final recognition result is obtained by the majority voting rule. Experiments show that the method improves the accuracy of emotion recognition by giving full play to the advantages of decision-level and feature-level fusion, bringing the whole fusion process closer to human emotion recognition, with a recognition rate of 90.4%.
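
    A minimal Python sketch of the pipeline described above, using scikit-learn's MLPClassifier as a stand-in for the BP neural network; the double error difference selection step is omitted, and all names and settings are illustrative rather than the paper's exact configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_fusion_ensemble(speech_feats, face_feats, labels,
                          n_classifiers=5, seed=0):
    """Feature-level fusion followed by a bagged BP-network ensemble."""
    rng = np.random.default_rng(seed)
    X = np.hstack([speech_feats, face_feats])       # feature-level fusion
    y = np.asarray(labels)                          # integer-coded emotions
    ensemble = []
    for _ in range(n_classifiers):
        idx = rng.integers(0, len(X), size=len(X))  # sampling with replacement
        clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)
        clf.fit(X[idx], y[idx])
        ensemble.append(clf)
    return ensemble

def predict_majority(ensemble, X):
    """Decision-level fusion: majority vote over ensemble predictions."""
    votes = np.stack([clf.predict(X) for clf in ensemble])
    return np.array([np.bincount(col).argmax() for col in votes.T])
```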

  8. Emotional Verbalization and Identification of Facial Expressions in Teenagers’ Communication

    Directory of Open Access Journals (Sweden)

    I. S. Ivanova

    2013-01-01

    Full Text Available The paper emphasizes the need for studying the subjective effectiveness criteria of interpersonal communication and the importance of effective communication for personality development in adolescence. The problem of the underdeveloped representation of positive emotions in the communication process is discussed. Both the identification and verbalization of emotions are regarded by the author as basic communication skills. Experimental data on longitudinal and age-level patterns are described, and gender differences in the identification and verbalization of emotions are considered. The outcomes of the experimental study demonstrate that the accuracy of identifying facial emotional expressions changes at different rates in teenage boys and girls. The prospects of defining age norms for the identification and verbalization of emotions are analyzed.

  9. Automatic recognition of emotions from facial expressions

    Science.gov (United States)

    Xue, Henry; Gertner, Izidor

    2014-06-01

    In the human-computer interaction (HCI) process it is desirable to have an artificial intelligent (AI) system that can identify and categorize human emotions from facial expressions. Such systems can be used in security, in entertainment industries, and also to study visual perception, social interactions and disorders (e.g. schizophrenia and autism). In this work we survey and compare the performance of different feature extraction algorithms and classification schemes. We introduce a faster feature extraction method that resizes and applies a set of filters to the data images without sacrificing the accuracy. In addition, we have enhanced SVM to multiple dimensions while retaining the high accuracy rate of SVM. The algorithms were tested using the Japanese Female Facial Expression (JAFFE) Database and the Database of Faces (AT&T Faces).
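
    The faster feature extraction described here (resizing the images and applying a set of filters before SVM classification) can be sketched as follows; the filter bank, image size, and kernel are assumptions, not the authors' settings.

```python
import numpy as np
from skimage.transform import resize
from skimage.filters import gabor
from sklearn.svm import SVC

def extract_features(images, size=(48, 48), frequencies=(0.1, 0.2, 0.3)):
    """Downsample each grayscale face image and pool a small Gabor bank."""
    feats = []
    for img in images:                          # 2-D grayscale arrays
        small = resize(img, size, anti_aliasing=True)
        pooled = [np.abs(gabor(small, frequency=f)[0]).mean()
                  for f in frequencies]         # mean response per filter
        feats.append(np.concatenate([small.ravel(), pooled]))
    return np.array(feats)

# Train an SVM on the extracted features (labels are emotion codes):
# clf = SVC(kernel="rbf").fit(extract_features(train_imgs), train_labels)
```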

  10. Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time.

    Science.gov (United States)

    Jack, Rachael E; Garrod, Oliver G B; Schyns, Philippe G

    2014-01-20

    Designed by biological and social evolutionary pressures, facial expressions of emotion comprise specific facial movements to support a near-optimal system of signaling and decoding. Although facial expressions are highly dynamic, little is known about the form and function of their temporal dynamics. Do facial expressions transmit diagnostic signals simultaneously to optimize categorization of the six classic emotions, or sequentially to support a more complex communication system of successive categorizations over time? Our data support the latter. Using a combination of perceptual expectation modeling, information theory, and Bayesian classifiers, we show that dynamic facial expressions of emotion transmit an evolving hierarchy of "biologically basic to socially specific" information over time. Early in the signaling dynamics, facial expressions systematically transmit few, biologically rooted face signals supporting the categorization of fewer elementary categories (e.g., approach/avoidance). Later transmissions comprise more complex signals that support categorization of a larger number of socially specific categories (i.e., the six classic emotions). Here, we show that dynamic facial expressions of emotion provide a sophisticated signaling system, questioning the widely accepted notion that emotion communication is comprised of six basic (i.e., psychologically irreducible) categories, and instead suggesting four. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Emotional Representation in Facial Expression and Script: A Comparison between Normal and Autistic Children

    Science.gov (United States)

    Balconi, Michela; Carrera, Alba

    2007-01-01

    The paper explored conceptual and lexical skills with regard to emotional correlates of facial stimuli and scripts. In two different experimental phases normal and autistic children observed six facial expressions of emotions (happiness, anger, fear, sadness, surprise, and disgust) and six emotional scripts (contextualized facial expressions). In…

  12. Facial emotion recognition in Chinese with schizophrenia at early and chronic stages of illness.

    Science.gov (United States)

    Leung, Joey Shuk-Yan; Lee, Tatia M C; Lee, Chi-Chiu

    2011-12-30

    Deficits in facial emotion recognition have been recognised in Chinese patients diagnosed with schizophrenia. This study examined the relationship between chronicity of illness and performance in facial emotion recognition in Chinese with schizophrenia. There were altogether four groups of subjects matched for age and gender composition. The first and second groups comprised medically stable outpatients with first-episode schizophrenia (n=50) and their healthy controls (n=26). The third and fourth groups were patients with chronic schizophrenic illness (n=51) and their controls (n=28). The ability to recognise the six prototypical facial emotions was examined using locally validated coloured photographs from the Japanese and Caucasian Facial Expressions of Emotion. Chinese patients with schizophrenia, in both the first-episode and chronic stages, performed significantly worse than their control counterparts on overall facial emotion recognition. This deficit did not appear to have worsened over the course of disease progression, suggesting that recognition of facial emotion is a rather stable trait of the illness. The emotion-specific deficit may have implications for understanding the social difficulties in schizophrenia. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Facial emotion perception in Chinese patients with schizophrenia and non-psychotic first-degree relatives.

    Science.gov (United States)

    Li, Huijie; Chan, Raymond C K; Zhao, Qing; Hong, Xiaohong; Gong, Qi-Yong

    2010-03-17

    Although there is a consensus that patients with schizophrenia have certain deficits in perceiving and expressing facial emotions, previous studies of facial emotion perception in schizophrenia do not present consistent results. The objective of this study was to explore facial emotion perception deficits in Chinese patients with schizophrenia and their non-psychotic first-degree relatives. Sixty-nine patients with schizophrenia, 56 of their first-degree relatives (33 parents and 23 siblings), and 92 healthy controls (67 younger healthy controls matched to the patients and siblings, and 25 older healthy controls matched to the parents) completed a set of facial emotion perception tasks, including facial emotion discrimination, identification, intensity, valence, and corresponding face identification tasks. The results demonstrated that patients with schizophrenia performed significantly worse than their siblings and younger healthy controls in accuracy on a variety of facial emotion perception tasks, whereas the siblings of the patients performed as well as the corresponding younger healthy controls on all of the facial emotion perception tasks. Patients with schizophrenia were also significantly slower than younger healthy controls, while the siblings did not differ significantly in speed from either the patients or the younger healthy controls. Meanwhile, we also found that the parents of the schizophrenia patients performed significantly worse than the corresponding older healthy controls in accuracy on facial emotion identification, valence, and the composite index of the facial discrimination, identification, intensity and valence tasks. Moreover, no significant differences were found between the parents of patients and older healthy controls in speed after controlling for years of education and IQ. Taken together, the results suggest that facial emotion perception deficits may serve as potential endophenotypes for schizophrenia.

  14. 5-HTTLPR modulates the recognition accuracy and exploration of emotional facial expressions

    Directory of Open Access Journals (Sweden)

    Sabrina eBoll

    2014-07-01

    Full Text Available Individual genetic differences in the serotonin transporter-linked polymorphic region (5-HTTLPR) have been associated with variations in the sensitivity to social and emotional cues as well as altered amygdala reactivity to facial expressions of emotion. Amygdala activation has further been shown to trigger gaze changes towards diagnostically relevant facial features. The current study examined whether altered socio-emotional reactivity in variants of the 5-HTTLPR promoter polymorphism reflects individual differences in attending to diagnostic features of facial expressions. For this purpose, visual exploration of emotional facial expressions was compared between a low (n=39) and a high (n=40) 5-HTT expressing group of healthy human volunteers in an eye tracking paradigm. Emotional faces were presented while manipulating the initial fixation such that saccadic changes towards the eyes and towards the mouth could be identified. We found that the low versus the high 5-HTT group demonstrated greater accuracy with regard to emotion classifications, particularly when faces were presented for a longer duration. No group differences in gaze orientation towards diagnostic facial features could be observed. However, participants in the low 5-HTT group exhibited more and faster fixation changes for certain emotions when faces were presented for a longer duration, and overall face fixation times were reduced for this genotype group. These results suggest that the 5-HTT gene influences social perception by modulating the general vigilance to social cues rather than selectively affecting the pre-attentive detection of diagnostic facial features.

  15. Face puzzle—two new video-based tasks for measuring explicit and implicit aspects of facial emotion recognition

    Science.gov (United States)

    Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel

    2013-01-01

    Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual-process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology, the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in the respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures were further in favor of the tasks' external validity. Between-group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between-group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive

  16. Singing emotionally: A study of pre-production, production, and post-production facial expressions

    Directory of Open Access Journals (Sweden)

    Lena Rachel Quinto

    2014-04-01

    Full Text Available Singing involves vocal production accompanied by a dynamic and meaningful use of facial expressions, which may serve as ancillary gestures that complement, disambiguate, or reinforce the acoustic signal. In this investigation, we examined the use of facial movements to communicate emotion, focusing on movements arising in three epochs: before vocalisation (pre-production), during vocalisation (production), and immediately after vocalisation (post-production). The stimuli were recordings of seven vocalists' facial movements as they sang short (14-syllable) melodic phrases with the intention of communicating happiness, sadness, irritation, or no emotion. Facial movements were presented as point-light displays to 16 observers who judged the emotion conveyed. Experiment 1 revealed that the accuracy of emotional judgement varied with singer, emotion, and epoch. Accuracy was highest in the production epoch; however, happiness was well communicated in the pre-production epoch. In Experiment 2, observers judged point-light displays of exaggerated movements. The ratings suggested that the extent of facial and head movements is largely perceived as a gauge of emotional arousal. In Experiment 3, observers rated point-light displays of scrambled movements. Configural information was removed in these stimuli but velocity and acceleration were retained. Exaggerated scrambled movements were likely to be associated with happiness or irritation, whereas unexaggerated scrambled movements were more likely to be identified as neutral. An analysis of the singers' motions revealed systematic changes in facial movement as a function of their emotional intentions. The findings confirm the central role of facial expressions in vocal emotional communication, and highlight individual differences between singers in the amount and intelligibility of facial movements made before, during, and after vocalization.
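
    The scrambled displays of Experiment 3 remove configural information while retaining velocity and acceleration. One plausible way to construct such stimuli (a sketch, not the authors' code) is to give every point-light marker a random starting position while preserving its original frame-to-frame displacements:

```python
import numpy as np

def scramble_point_light(traj, seed=None):
    """Spatially scramble a point-light display, preserving motion.

    traj : (n_frames, n_points, 2) marker coordinates. Each point keeps
    its own frame-to-frame displacements (velocity and acceleration are
    retained) but starts from a random position, which destroys the
    facial configuration.
    """
    rng = np.random.default_rng(seed)
    steps = np.diff(traj, axis=0)                       # per-frame displacements
    starts = rng.uniform(traj.min(), traj.max(), size=traj[0].shape)
    moved = starts[None] + np.cumsum(steps, axis=0)
    return np.concatenate([starts[None], moved])
```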

  17. What emotion does the "facial expression of disgust" express?

    Science.gov (United States)

    Pochedly, Joseph T; Widen, Sherri C; Russell, James A

    2012-12-01

    The emotion attributed to the prototypical "facial expression of disgust" (a nose scrunch) depended on what facial expressions preceded it. In two studies, the majority of 120 children (5-14 years) and 135 adults (16-58 years) judged the nose scrunch as expressing disgust when the preceding set included an anger scowl, but as angry when the anger scowl was omitted. An even greater proportion of observers judged the nose scrunch as angry when the preceding set also included a facial expression of someone about to be sick. The emotion attributed to the nose scrunch therefore varies with experimental context. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  18. Americans and Palestinians judge spontaneous facial expressions of emotion.

    Science.gov (United States)

    Kayyal, Mary H; Russell, James A

    2013-10-01

    The claim that certain emotions are universally recognized from facial expressions is based primarily on the study of expressions that were posed. The current study was of spontaneous facial expressions shown by aborigines in Papua New Guinea (Ekman, 1980); 17 faces claimed to convey one (or, in the case of blends, two) basic emotions and five faces claimed to show other universal feelings. For each face, participants rated the degree to which each of the 12 predicted emotions or feelings was conveyed. The modal choice for English-speaking Americans (n = 60), English-speaking Palestinians (n = 60), and Arabic-speaking Palestinians (n = 44) was the predicted label for only 4, 5, and 4, respectively, of the 17 faces for basic emotions, and for only 2, 2, and 2, respectively, of the 5 faces for other feelings. Observers endorsed the predicted emotion or feeling moderately often (65%, 55%, and 44%), but also denied it moderately often (35%, 45%, and 56%). They also endorsed more than one (or, for blends, two) label(s) per face: on average, 2.3, 2.3, and 1.5 of the basic emotions and 2.6, 2.2, and 1.5 of the other feelings. There were both similarities and differences across culture and language, but the emotional meaning of a facial expression is not well captured by the predicted label(s) or, indeed, by any single label.

  19. Performance-driven facial animation: basic research on human judgments of emotional state in facial avatars.

    Science.gov (United States)

    Rizzo, A A; Neumann, U; Enciso, R; Fidaleo, D; Noh, J Y

    2001-08-01

    Virtual reality is rapidly evolving into a pragmatically usable technology for mental health (MH) applications. As the underlying enabling technologies continue to evolve and allow us to design more useful and usable structural virtual environments (VEs), the next important challenge will involve populating these environments with virtual representations of humans (avatars). This will be vital to create mental health VEs that leverage the use of avatars for applications that require human-human interaction and communication. As Alessi et al. [1] pointed out at the 8th Annual Medicine Meets Virtual Reality Conference (MMVR8), virtual humans have mainly appeared in MH applications to "serve the role of props, rather than humans." More believable avatars inhabiting VEs would open up possibilities for MH applications that address social interaction, communication, instruction, assessment, and rehabilitation issues. They could also serve to enhance realism that might in turn promote the experience of presence in VR. Additionally, it will soon be possible to use computer-generated avatars that serve to provide believable dynamic facial and bodily representations of individuals communicating from a distance in real time. This could support the delivery, in shared virtual environments, of more natural human interaction styles, similar to what is used in real life between people. These techniques could enhance communication and interaction by leveraging our natural sensing and perceiving capabilities and offer the potential to model human-computer-human interaction after human-human interaction. To enhance the authenticity of virtual human representations, advances in the rendering of facial and gestural behaviors that support implicit communication will be needed. In this regard, the current paper presents data from a study that compared human raters' judgments of emotional expression between actual video clips of facial expressions and identical expressions rendered on a

  20. Dynamic Displays Enhance the Ability to Discriminate Genuine and Posed Facial Expressions of Emotion

    Science.gov (United States)

    Namba, Shushi; Kabir, Russell S.; Miyatani, Makoto; Nakao, Takashi

    2018-01-01

    Accurately gauging the emotional experience of another person is important for navigating interpersonal interactions. This study investigated whether perceivers are capable of distinguishing between unintentionally expressed (genuine) and intentionally manipulated (posed) facial expressions attributed to four major emotions: amusement, disgust, sadness, and surprise. Sensitivity to this discrimination was explored by comparing unstaged dynamic and static facial stimuli and analyzing the results with signal detection theory. Participants indicated whether facial stimuli presented on a screen depicted a person showing a given emotion and whether that person was feeling a given emotion. The results showed that genuine displays were evaluated more as felt expressions than posed displays for all target emotions presented. In addition, sensitivity to the perception of emotional experience, or discriminability, was enhanced in dynamic facial displays, but was less pronounced in the case of static displays. This finding indicates that dynamic information in facial displays contributes to the ability to accurately infer the emotional experiences of another person. PMID:29896135
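
    Sensitivity in this genuine-versus-posed discrimination is the standard signal detection quantity d', computed from "felt" responses to genuine displays (hits) and to posed displays (false alarms). A minimal Python sketch with a log-linear correction for extreme rates; the counts in the example are illustrative only.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') for calling genuine displays 'felt'.

    A log-linear correction (+0.5 to each cell) keeps the z-scores
    finite when a hit or false-alarm rate would otherwise be 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Example: 40 'felt' / 10 'not felt' for genuine, 25 / 25 for posed:
# d_prime(40, 10, 25, 25) -> about 0.82
```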

  1. Facial Expression Enhances Emotion Perception Compared to Vocal Prosody: Behavioral and fMRI Studies.

    Science.gov (United States)

    Zhang, Heming; Chen, Xuhai; Chen, Shengdong; Li, Yansong; Chen, Changming; Long, Quanshan; Yuan, Jiajin

    2018-05-09

    Facial and vocal expressions are essential modalities mediating the perception of emotion and social communication. Nonetheless, currently little is known about how emotion perception and its neural substrates differ across facial expression and vocal prosody. To clarify this issue, functional MRI scans were acquired in Study 1, in which participants were asked to discriminate the valence of emotional expression (angry, happy or neutral) from facial, vocal, or bimodal stimuli. In Study 2, we used an affective priming task (unimodal materials as primers and bimodal materials as target) and participants were asked to rate the intensity, valence, and arousal of the targets. Study 1 showed higher accuracy and shorter response latencies in the facial than in the vocal modality for a happy expression. Whole-brain analysis showed enhanced activation during facial compared to vocal emotions in the inferior temporal-occipital regions. Region of interest analysis showed a higher percentage signal change for facial than for vocal anger in the superior temporal sulcus. Study 2 showed that facial relative to vocal priming of anger had a greater influence on perceived emotion for bimodal targets, irrespective of the target valence. These findings suggest that facial expression is associated with enhanced emotion perception compared to equivalent vocal prosodies.

  2. Relationship between individual differences in functional connectivity and facial-emotion recognition abilities in adults with traumatic brain injury.

    Science.gov (United States)

    Rigon, A; Voss, M W; Turkstra, L S; Mutlu, B; Duff, M C

    2017-01-01

    Although several studies have demonstrated that facial-affect recognition impairment is common following moderate-severe traumatic brain injury (TBI), and that there are diffuse alterations in large-scale functional brain networks in TBI populations, little is known about the relationship between the two. Here, in a sample of 26 participants with TBI and 20 healthy comparison participants (HC) we measured facial-affect recognition abilities and resting-state functional connectivity (rs-FC) using fMRI. We then used network-based statistics to examine (A) the presence of rs-FC differences between individuals with TBI and HC within the facial-affect processing network, and (B) the association between inter-individual differences in emotion recognition skills and rs-FC within the facial-affect processing network. We found that participants with TBI showed significantly lower rs-FC in a component comprising homotopic and within-hemisphere, anterior-posterior connections within the facial-affect processing network. In addition, within the TBI group, participants with higher emotion-labeling skills showed stronger rs-FC within a network comprised of intra- and inter-hemispheric bilateral connections. Findings indicate that the ability to successfully recognize facial-affect after TBI is related to rs-FC within components of facial-affective networks, and provide new evidence that further our understanding of the mechanisms underlying emotion recognition impairment in TBI.
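
    Resting-state functional connectivity of the kind analyzed here is commonly estimated as pairwise Pearson correlations between regional BOLD time series, Fisher z-transformed before network-based statistics. A minimal sketch under that common convention (not necessarily the authors' exact pipeline):

```python
import numpy as np

def rs_fc_matrix(bold):
    """Fisher z-transformed correlation matrix of regional time series.

    bold : (n_timepoints, n_regions) array of preprocessed BOLD signals,
    one column per node of the facial-affect processing network.
    """
    r = np.corrcoef(bold, rowvar=False)   # (n_regions, n_regions)
    np.fill_diagonal(r, 0.0)              # drop self-connections before z
    return np.arctanh(r)                  # Fisher r-to-z transform
```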

  3. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    Science.gov (United States)

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top + happy bottom), incongruent composite configurations (e.g., angry top + happy bottom), and isolated features (e.g., happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part but ignored information in the other, non-target part. Results indicate that the recognition of happy and anger expressions is neither strictly holistic nor analytic. Both routes were involved, but with a different role for analytic and holistic information depending on the emotion type, and different weights of local features between happy and anger expressions. Dissociable neural pathways were engaged depending on the emotional face configurations. In particular, regions within the face processing network differed in their sensitivity to holistic expression information, which predominantly activated fusiform and inferior occipital areas and the amygdala when internal features were congruent (i.e., template matching), whereas more local analysis of independent features preferentially engaged STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and pulvinar when seen in isolated parts. Collectively, these findings suggest that facial emotion recognition recruits separate but interactive dorsal and ventral routes within the face processing networks, whose engagement may be shaped by

  4. Facial Expressions in Context: Contributions to Infant Emotion Theory.

    Science.gov (United States)

    Camras, Linda A.

    To make the point that infant emotions are more dynamic than suggested by Differential Emotions Theory, which maintains that infants show the same prototypical facial expressions for emotions as adults do, this paper explores two questions: (1) when infants experience an emotion, do they always show the corresponding prototypical facial…

  5. A selective emotional decision-making bias elicited by facial expressions.

    Directory of Open Access Journals (Sweden)

    Nicholas Furl

    Full Text Available Emotional and social information can sway otherwise rational decisions. For example, when participants decide between two faces that are probabilistically rewarded, they make biased choices that favor smiling relative to angry faces. This bias may arise because facial expressions evoke positive and negative emotional responses, which in turn may motivate social approach and avoidance. We tested a wide range of pictures that evoke emotions or convey social information, including animals, words, foods, a variety of scenes, and faces differing in trustworthiness or attractiveness, but we found only facial expressions biased decisions. Our results extend brain imaging and pharmacological findings, which suggest that a brain mechanism supporting social interaction may be involved. Facial expressions appear to exert special influence over this social interaction mechanism, one capable of biasing otherwise rational choices. These results illustrate that only specific types of emotional experiences can best sway our choices.

  6. A Selective Emotional Decision-Making Bias Elicited by Facial Expressions

    Science.gov (United States)

    Furl, Nicholas; Gallagher, Shannon; Averbeck, Bruno B.

    2012-01-01

    Emotional and social information can sway otherwise rational decisions. For example, when participants decide between two faces that are probabilistically rewarded, they make biased choices that favor smiling relative to angry faces. This bias may arise because facial expressions evoke positive and negative emotional responses, which in turn may motivate social approach and avoidance. We tested a wide range of pictures that evoke emotions or convey social information, including animals, words, foods, a variety of scenes, and faces differing in trustworthiness or attractiveness, but we found only facial expressions biased decisions. Our results extend brain imaging and pharmacological findings, which suggest that a brain mechanism supporting social interaction may be involved. Facial expressions appear to exert special influence over this social interaction mechanism, one capable of biasing otherwise rational choices. These results illustrate that only specific types of emotional experiences can best sway our choices. PMID:22438936

  8. Facial expressions of emotion and psychopathology in adolescent boys.

    Science.gov (United States)

    Keltner, D; Moffitt, T E; Stouthamer-Loeber, M

    1995-11-01

    On the basis of the widespread belief that emotions underpin psychological adjustment, the authors tested 3 predicted relations between externalizing problems and anger, internalizing problems and fear and sadness, and the absence of externalizing problems and social-moral emotion (embarrassment). Seventy adolescent boys were classified into 1 of 4 comparison groups on the basis of teacher reports using a behavior problem checklist: internalizers, externalizers, mixed (both internalizers and externalizers), and nondisordered boys. The authors coded the facial expressions of emotion shown by the boys during a structured social interaction. Results supported the 3 hypotheses: (a) Externalizing adolescents showed increased facial expressions of anger, (b) on 1 measure internalizing adolescents showed increased facial expressions of fear, and (c) the absence of externalizing problems (or nondisordered classification) was related to increased displays of embarrassment. Discussion focused on the relations of these findings to hypotheses concerning the role of impulse control in antisocial behavior.

  9. Effect of positive emotion on consolidation of memory for faces: the modulation of facial valence and facial gender.

    Science.gov (United States)

    Wang, Bo

    2013-01-01

    Studies have shown that emotion elicited after learning enhances memory consolidation. However, no prior studies have used facial photos as stimuli. This study examined the effect of post-learning positive emotion on the consolidation of memory for faces. During the learning phase, participants viewed neutral, positive, or negative faces. They were then assigned to watch either a 9-minute positive video clip or a 9-minute neutral video. Thirty minutes after learning, participants took a surprise memory test in which they made "remember", "know", and "new" judgements. The findings are: (1) positive emotion enhanced consolidation of recognition for negative male faces, but impaired consolidation of recognition for negative female faces; (2) for male faces, recognition of negative faces was equivalent to that of positive faces; for female faces, recognition of negative faces was better than that of positive faces. Our study provides important evidence that the effect of post-learning emotion on memory consolidation extends to facial stimuli and that this effect can be modulated by facial valence and facial gender. The findings may shed light on establishing models concerning the influence of emotion on memory consolidation.

  10. Predicting Emotions in Facial Expressions from the Annotations in Naturally Occurring First Encounters

    DEFF Research Database (Denmark)

    Navarretta, Costanza

    2014-01-01

    This paper deals with the automatic identification of emotions from the manual annotations of the shape and functions of facial expressions in a Danish corpus of video-recorded, naturally occurring first encounters. More specifically, a support vector classifier is trained on the corpus annotations to identify emotions in facial expressions. In the classification experiments, we test to what extent emotions expressed in naturally occurring conversations can be identified automatically by a classifier trained on the manual annotations of the shape of facial expressions and co-occurring speech tokens. We also investigate the relation between emotions and the communicative functions of facial expressions. Both emotion labels and their values in a three-dimensional space are identified. The three dimensions are Pleasure, Arousal and Dominance. The results of our experiments indicate that the classifiers…
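
    A support vector classifier over manual annotations can be sketched as below, with categorical annotation features one-hot encoded; the feature names and values are hypothetical rather than the corpus's actual annotation scheme, and separate regressors stand in for the three PAD dimensions.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.svm import SVC, SVR

# Each sample: manual annotations of one facial expression plus a
# co-occurring speech token (features here are invented for illustration).
annotations = [
    {"eyebrows": "raised", "mouth": "up-corners", "token": "ja"},
    {"eyebrows": "frown",  "mouth": "open",       "token": "nej"},
]
emotion_labels = ["happy", "surprised"]
pleasure_values = [0.8, -0.2]            # Pleasure scores for the samples

vec = DictVectorizer()
X = vec.fit_transform(annotations)       # one-hot encode the annotations

label_clf = SVC(kernel="linear").fit(X, emotion_labels)  # emotion labels
pleasure_reg = SVR().fit(X, pleasure_values)             # Pleasure dimension
# Train analogous SVR models for the Arousal and Dominance dimensions.
```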

  11. Virtual facial expressions of emotions: An initial concomitant and construct validity study.

    Directory of Open Access Journals (Sweden)

    Christian eJoyal

    2014-09-01

    Full Text Available Abstract. Background. Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities, both for fundamental and clinical research. For instance, virtual faces allow real-time human-computer feedback loops between physiological measures and the virtual agent. Objectives. The goal of this study was an initial assessment of the concomitant and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, or disgust). Recognition rates, facial electromyography (zygomatic major and corrugator supercilii muscles), and regional gaze fixation latencies (eye and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Results. Emotions expressed by each set of stimuli were similarly recognized, both by men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times on eye regions from male and female participants. Conclusion. Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-computer interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

  12. Facial emotional recognition in schizophrenia: preliminary results of the virtual reality program for facial emotional recognition

    Directory of Open Access Journals (Sweden)

    Teresa Souto

    2013-01-01

    Full Text Available BACKGROUND: Significant deficits in emotional recognition and social perception characterize patients with schizophrenia and have a direct negative impact both on interpersonal relationships and on social functioning. Virtual reality, as a methodological resource, has high potential for the assessment and training of skills in people suffering from mental illness. OBJECTIVES: To present preliminary results of a facial emotional recognition assessment designed for patients with schizophrenia, using 3D avatars and virtual reality. METHODS: Presentation of 3D avatars which reproduce images developed with the FaceGen® software and integrated in a three-dimensional virtual environment. Each avatar was presented to a group of 12 patients with schizophrenia and a reference group of 12 subjects without psychiatric pathology. RESULTS: The results show that the facial emotions of happiness and anger are better recognized by both groups and that the major difficulties arise in fear and disgust recognition. Frontal alpha electroencephalography variations were found during the presentation of anger and disgust stimuli among patients with schizophrenia. DISCUSSION: The evaluation module of the developed program can be of added value both for patient and therapist, allowing task execution in a non-anxiogenic environment that is nevertheless similar to the actual experience.

  13. The Mysterious Noh Mask: Contribution of Multiple Facial Parts to the Recognition of Emotional Expressions

    Science.gov (United States)

    Miyata, Hiromitsu; Nishimura, Ritsuko; Okanoya, Kazuo; Kawai, Nobuyuki

    2012-01-01

    Background: A Noh mask, worn by expert actors when performing in a Japanese traditional Noh drama, is suggested to convey countless different facial expressions according to different angles of head/body orientation. The present study addressed the question of how different facial parts of a Noh mask, including the eyebrows, the eyes, and the mouth, may contribute to different emotional expressions. Both experimental situations of active creation and passive recognition of emotional facial expressions were introduced. Methodology/Principal Findings: In Experiment 1, participants either created happy or sad facial expressions, or imitated a face that looked up or down, by actively changing each facial part of a Noh mask image presented on a computer screen. For an upward-tilted mask, the eyebrows and the mouth shared common features with sad expressions, whereas the eyes shared features with happy expressions. This contingency tended to be reversed for a downward-tilted mask. Experiment 2 further examined which facial parts of a Noh mask are crucial in determining emotional expressions. Participants were exposed to synthesized Noh mask images with different facial parts expressing different emotions. Results clearly revealed that participants primarily used the shape of the mouth in judging emotions. The facial images having the mouth of an upward/downward-tilted Noh mask strongly tended to be evaluated as sad/happy, respectively. Conclusions/Significance: The results suggest that Noh masks express chimeric emotional patterns, with different facial parts conveying different emotions. This appears consistent with the principles of Noh, which highly appreciate subtle and composite emotional expressions, as well as with the mysterious facial expressions observed in Western art. It was further demonstrated that the mouth serves as a diagnostic feature in characterizing emotional expressions. This indicates the superiority of biologically-driven factors over the traditionally

  14. The Change in Facial Emotion Recognition Ability in Inpatients with Treatment Resistant Schizophrenia After Electroconvulsive Therapy.

    Science.gov (United States)

    Dalkıran, Mihriban; Tasdemir, Akif; Salihoglu, Tamer; Emul, Murat; Duran, Alaattin; Ugur, Mufit; Yavuz, Ruhi

    2017-09-01

    People with schizophrenia have impairments in emotion recognition along with other social cognitive deficits. In the current study, we aimed to investigate the immediate benefits of ECT on facial emotion recognition ability. Thirty-two treatment-resistant patients with schizophrenia for whom ECT was indicated were enrolled in the study. Facial emotion stimuli were a set of 56 photographs that depicted seven basic emotions: sadness, anger, happiness, disgust, surprise, fear, and neutral faces. The average age of the participants was 33.4 ± 10.5 years. The rate of recognizing the disgusted facial expression increased significantly after ECT (p < 0.05), whereas no significant change was found for the other facial expressions (p > 0.05). After ECT, response times to fearful and happy facial expressions were significantly shorter (p < 0.05). Facial emotion recognition ability is an important social cognitive skill for social harmony, proper relationships and independent living. At the least, ECT sessions do not seem to affect facial emotion recognition ability negatively, and they appear to improve identification of the disgusted facial emotion, which is related to dopamine-rich regions of the brain.

  15. Neural correlates of the perception of dynamic versus static facial expressions of emotion.

    Science.gov (United States)

    Kessler, Henrik; Doyen-Waldecker, Cornelia; Hofer, Christian; Hoffmann, Holger; Traue, Harald C; Abler, Birgit

    2011-04-20

    This study investigated brain areas involved in the perception of dynamic facial expressions of emotion. A group of 30 healthy subjects was measured with fMRI while passively viewing prototypical facial expressions of fear, disgust, sadness and happiness. Using morphing techniques, all faces were displayed as still images and also dynamically as a film clip with the expressions evolving from neutral to emotional. Irrespective of a specific emotion, dynamic stimuli selectively activated bilateral superior temporal sulcus, visual area V5, fusiform gyrus, thalamus and other frontal and parietal areas. Interaction effects of emotion and mode of presentation (static/dynamic) were only found for the expression of happiness, where static faces evoked greater activity in the medial prefrontal cortex. Our results confirm previous findings on neural correlates of the perception of dynamic facial expressions and are in line with studies showing the importance of the superior temporal sulcus and V5 in the perception of biological motion. Differential activation in the fusiform gyrus for dynamic stimuli stands in contrast to classical models of face perception but is consistent with new findings arguing for a more general role of the fusiform gyrus in the processing of socially relevant stimuli.
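
    The morphing step described above — still frames turned into clips in which an expression evolves from neutral to emotional — can be approximated in a few lines of Python. The sketch below simply cross-fades two aligned, same-size face photographs; published stimuli are usually built with landmark-based warping rather than a plain pixel blend, and the file names here are hypothetical.

      # Minimal sketch: a neutral-to-emotional frame sequence via linear
      # cross-fading of two aligned face images (a simplification of the
      # landmark-based morphing typically used for such stimuli).
      import numpy as np
      from PIL import Image

      def make_morph_sequence(neutral_path, emotional_path, n_frames=25):
          neutral = np.asarray(Image.open(neutral_path), dtype=np.float32)
          emotional = np.asarray(Image.open(emotional_path), dtype=np.float32)
          frames = []
          for i in range(n_frames):
              alpha = i / (n_frames - 1)  # 0.0 = neutral, 1.0 = full emotion
              blend = (1.0 - alpha) * neutral + alpha * emotional
              frames.append(Image.fromarray(blend.astype(np.uint8)))
          return frames

      # Hypothetical usage: write out frames to assemble a short clip.
      for idx, frame in enumerate(make_morph_sequence("neutral.png", "happy.png")):
          frame.save(f"morph_{idx:02d}.png")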

  16. Acute alcohol effects on facial expressions of emotions in social drinkers: a systematic review

    Science.gov (United States)

    Capito, Eva Susanne; Lautenbacher, Stefan; Horn-Hofmann, Claudia

    2017-01-01

    Background As known from everyday experience and experimental research, alcohol modulates emotions. Particularly regarding social interaction, the effects of alcohol on the facial expression of emotion might be of relevance. However, these effects have not been systematically studied. We performed a systematic review on acute alcohol effects on social drinkers’ facial expressions of induced positive and negative emotions. Materials and methods With a predefined algorithm, we searched three electronic databases (PubMed, PsycInfo, and Web of Science) for studies conducted on social drinkers that used acute alcohol administration, emotion induction, and standardized methods to record facial expressions. We excluded those studies that failed common quality standards, and finally selected 13 investigations for this review. Results Overall, alcohol exerted effects on facial expressions of emotions in social drinkers. These effects were not generally disinhibiting, but varied depending on the valence of emotion and on social interaction. When consumed within social groups, alcohol mostly influenced facial expressions of emotions in a socially desirable way, thus underscoring the view of alcohol as a social lubricant. However, methodological differences in alcohol administration between the studies complicated comparability. Conclusion Our review highlighted the relevance of emotional valence and social-context factors for acute alcohol effects on social drinkers’ facial expressions of emotions. Future research should investigate how these alcohol effects influence the development of problematic drinking behavior in social drinkers. PMID:29255375

  17. Development of Facial Emotion Recognition in Childhood: Age-related Differences in a Shortened Version of the Facial Expression of Emotions - Stimuli and Tests. Data from an ongoing study.

    NARCIS (Netherlands)

    Coenen, Maraike; Aarnoudse, Ceciel; Braams, O.; Veenstra, Wencke S.

    2014-01-01

    OBJECTIVE: Facial emotion recognition is a crucial aspect of social cognition, and deficits have been shown to be related to psychiatric disorders in adults and children. However, the development of facial emotion recognition is less clear (Herba & Phillips, 2004), and an appropriate instrument to…

  18. Lonely adolescents exhibit heightened sensitivity for facial cues of emotion.

    Science.gov (United States)

    Vanhalst, Janne; Gibb, Brandon E; Prinstein, Mitchell J

    2017-02-01

    Contradicting evidence exists regarding the link between loneliness and sensitivity to facial cues of emotion, as loneliness has been related to better but also to worse performance on facial emotion recognition tasks. This study aims to contribute to this debate and extends previous work by (a) focusing on both accuracy and sensitivity in detecting positive and negative expressions, (b) controlling for depressive symptoms and social anxiety, and (c) using an advanced emotion recognition task with videos of neutral adolescent faces gradually morphing into full-intensity expressions. Participants were 170 adolescents (49% boys; mean age = 13.65 years) from rural, low-income schools. Results showed that loneliness was associated with increased sensitivity to happy, sad, and fearful faces. When controlling for depressive symptoms and social anxiety, loneliness remained significantly associated with sensitivity to sad and fearful faces. Together, these results suggest that lonely adolescents are vigilant to negative facial cues of emotion.

  19. Direction of Amygdala-Neocortex Interaction During Dynamic Facial Expression Processing.

    Science.gov (United States)

    Sato, Wataru; Kochiyama, Takanori; Uono, Shota; Yoshikawa, Sakiko; Toichi, Motomi

    2017-03-01

    Dynamic facial expressions of emotion strongly elicit multifaceted emotional, perceptual, cognitive, and motor responses. Neuroimaging studies have revealed that some subcortical (e.g., amygdala) and neocortical (e.g., superior temporal sulcus and inferior frontal gyrus) brain regions, and their functional interaction, are involved in processing dynamic facial expressions. However, the direction of the functional interaction between the amygdala and the neocortex remains unknown. To investigate this issue, we re-analyzed functional magnetic resonance imaging (fMRI) data from 2 studies and magnetoencephalography (MEG) data from 1 study. First, a psychophysiological interaction analysis of the fMRI data confirmed the functional interaction between the amygdala and neocortical regions. Then, dynamic causal modeling analysis was used to compare models with forward, backward, or bidirectional effective connectivity between the amygdala and neocortical networks in the fMRI and MEG data. The results consistently supported the model of effective connectivity from the amygdala to the neocortex. A further analysis of the MEG data with increasing time windows demonstrated that this model was valid from 200 ms after stimulus onset. These data suggest that emotional processing in the amygdala rapidly modulates some neocortical processing, such as perception, recognition, and motor mimicry, when observing dynamic facial expressions of emotion. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. The role of the cannabinoid receptor in adolescents' processing of facial expressions.

    Science.gov (United States)

    Ewald, Anais; Becker, Susanne; Heinrich, Angela; Banaschewski, Tobias; Poustka, Luise; Bokde, Arun; Büchel, Christian; Bromberg, Uli; Cattrell, Anna; Conrod, Patricia; Desrivières, Sylvane; Frouin, Vincent; Papadopoulos-Orfanos, Dimitri; Gallinat, Jürgen; Garavan, Hugh; Heinz, Andreas; Walter, Henrik; Ittermann, Bernd; Gowland, Penny; Paus, Tomáš; Martinot, Jean-Luc; Paillère Martinot, Marie-Laure; Smolka, Michael N; Vetter, Nora; Whelan, Rob; Schumann, Gunter; Flor, Herta; Nees, Frauke

    2016-01-01

    The processing of emotional faces is an important prerequisite for adequate social interactions in daily life, and might thus be specifically altered in adolescence, a period marked by significant changes in social emotional processing. Previous research has shown that the cannabinoid receptor CB1R is associated with longer gaze duration and increased brain responses in the striatum to happy faces in adults; yet, for adolescents, it is not clear whether an association between CB1R and face processing exists. In the present study we investigated genetic effects of two CB1R polymorphisms, rs1049353 and rs806377, on the processing of emotional faces in healthy adolescents. They participated in functional magnetic resonance imaging during a Faces Task, watching blocks of video clips with angry and neutral facial expressions, and completed a Morphed Faces Task in the laboratory where they looked at different facial expressions that switched from anger to fear or sadness or from happiness to fear or sadness, and labelled them according to these four emotional expressions. A-allele carriers, compared with GG-carriers, of rs1049353 displayed earlier recognition of facial expressions changing from anger to sadness or fear, but not of expressions changing from happiness to sadness or fear, and higher brain responses to angry, but not neutral, faces in the amygdala and insula. For rs806377 no significant effects emerged. This suggests that rs1049353 is involved in the processing of negative facial expressions relating to anger in adolescence. These findings add to our understanding of social emotion-related mechanisms in this period of life. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  1. Deficits in recognition, identification, and discrimination of facial emotions in patients with bipolar disorder.

    Science.gov (United States)

    Benito, Adolfo; Lahera, Guillermo; Herrera, Sara; Muncharaz, Ramón; Benito, Guillermo; Fernández-Liria, Alberto; Montes, José Manuel

    2013-01-01

    To analyze the recognition, identification, and discrimination of facial emotions in a sample of outpatients with bipolar disorder (BD). Forty-four outpatients with diagnosis of BD and 48 matched control subjects were selected. Both groups were assessed with tests for recognition (Emotion Recognition-40 - ER40), identification (Facial Emotion Identification Test - FEIT), and discrimination (Facial Emotion Discrimination Test - FEDT) of facial emotions, as well as a theory of mind (ToM) verbal test (Hinting Task). Differences between groups were analyzed, controlling for the influence of mild depressive and manic symptoms. Patients with BD scored significantly lower than controls on recognition (ER40), identification (FEIT), and discrimination (FEDT) of emotions. Regarding the verbal measure of ToM, a lower score was also observed in patients compared to controls. Patients with subsyndromal depressive symptoms obtained outcomes similar to patients in euthymia. A significant correlation between FEDT scores and global functioning (measured by the Functioning Assessment Short Test, FAST) was found. These results suggest that, even in euthymia, patients with BD experience deficits in recognition, identification, and discrimination of facial emotions, with potential functional implications.

  2. Processing emotional body expressions: state-of-the-art.

    Science.gov (United States)

    Enea, Violeta; Iancu, Sorina

    2016-10-01

    Processing of emotional body expressions has recently become an important topic in affective and social neuroscience, along with the investigation of facial expressions. The objective of this study is to review the literature on emotional body expressions in order to discuss the current state of knowledge on this topic and identify directions for future research. The following electronic databases were searched: PsycINFO, Ebsco, ERIC, ProQuest, Sagepub, and SCOPUS, using terms such as "body," "bodily expression," "body perception," "emotions," "posture," "body recognition" and combinations of them. The synthesis revealed several research questions that were addressed in neuroimaging, electrophysiological and behavioral studies. Among them, one important question targeted the neural mechanisms of emotional processing of body expressions, with specific subquestions concerning the time course of the integration of emotional signals from face and body, as well as the role of context in the perception of emotional signals. Processing of bodily expressions of emotion is similar to processing of facial expressions, and holistic processing is extended to the whole person. The current state of the art in processing emotional body expressions may lead to a better understanding of the underlying neural mechanisms of social behavior. At the end of the review, suggestions for future research directions are presented.

  3. Cocaine users manifest impaired prosodic and cross-modal emotion processing

    Directory of Open Access Journals (Sweden)

    Lea M Hulka

    2013-09-01

    Background: A small number of previous studies have provided evidence that cocaine users exhibit impairments in complex social cognition tasks, while more basic facial emotion recognition is widely unaffected. However, prosodic and cross-modal emotion processing have not been systematically investigated in cocaine users so far. Therefore, the aim of the present study was to assess complex multisensory emotion processing in cocaine users in comparison to controls and to examine a potential association with drug use patterns. Method: The abbreviated version of the Comprehensive Affect Testing System (CATS-A) was used to measure emotion perception across the three channels of facial affect, prosody, and semantic content in 58 cocaine users and 48 healthy control subjects who were matched for age, sex, verbal intelligence, and years of education. Results: Cocaine users had significantly lower scores than controls on the quotient scales of Emotion Recognition and Prosody Recognition and the subtests Conflicting Prosody/Meaning – Attend to Prosody and Match Emotional Prosody to Emotional Face, which require either attending to prosody or integrating cross-modal information. In contrast, no group difference emerged for the Affect Recognition Quotient. Cumulative cocaine doses and duration of cocaine use correlated negatively with emotion processing. Conclusion: Cocaine users show impaired cross-modal integration of different emotion processing channels, particularly with regard to prosody, whereas more basic aspects of emotion processing such as facial affect perception are comparable to the performance of healthy controls.

  4. Age, gender and puberty influence the development of facial emotion recognition

    Directory of Open Access Journals (Sweden)

    Kate eLawrence

    2015-06-01

    Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age-range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children’s ability to recognise simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6-16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modelled these cross-sectional data in terms of competence in accurate recognition of the six emotions studied, when the positive correlation between emotion recognition and IQ was controlled. Significant linear trends were seen in children’s ability to recognise facial expressions of happiness, surprise, fear and disgust; there was improvement with increasing age. In contrast, for sad and angry expressions there is little or no change in accuracy over the age range 6-16 years; near-adult levels of competence are established by middle-childhood. In a sampled subset, pubertal status influenced the ability to recognize facial expressions of disgust and anger; there was an increase in competence from mid to late puberty, which occurred independently of age. A small female advantage was found in the recognition of some facial expressions. The normative data provided in this study will aid clinicians and researchers in assessing the emotion recognition abilities of children and will facilitate the identification of abnormalities in a skill that is often impaired in neurodevelopmental disorders. If emotion recognition abilities are a good model with which to understand adolescent development, then these results could have implications for the education, mental health provision and legal treatment of teenagers.

  5. Age, gender, and puberty influence the development of facial emotion recognition.

    Science.gov (United States)

    Lawrence, Kate; Campbell, Ruth; Skuse, David

    2015-01-01

    Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age-range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children's ability to recognize simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6-16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modeled these cross-sectional data in terms of competence in accurate recognition of the six emotions studied, when the positive correlation between emotion recognition and IQ was controlled. Significant linear trends were seen in children's ability to recognize facial expressions of happiness, surprise, fear, and disgust; there was improvement with increasing age. In contrast, for sad and angry expressions there is little or no change in accuracy over the age range 6-16 years; near-adult levels of competence are established by middle-childhood. In a sampled subset, pubertal status influenced the ability to recognize facial expressions of disgust and anger; there was an increase in competence from mid to late puberty, which occurred independently of age. A small female advantage was found in the recognition of some facial expressions. The normative data provided in this study will aid clinicians and researchers in assessing the emotion recognition abilities of children and will facilitate the identification of abnormalities in a skill that is often impaired in neurodevelopmental disorders. If emotion recognition abilities are a good model with which to understand adolescent development, then these results could have implications for the education, mental health provision and legal treatment of teenagers.

  6. Age, gender, and puberty influence the development of facial emotion recognition

    Science.gov (United States)

    Lawrence, Kate; Campbell, Ruth; Skuse, David

    2015-01-01

    Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age-range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children’s ability to recognize simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6–16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modeled these cross-sectional data in terms of competence in accurate recognition of the six emotions studied, when the positive correlation between emotion recognition and IQ was controlled. Significant linear trends were seen in children’s ability to recognize facial expressions of happiness, surprise, fear, and disgust; there was improvement with increasing age. In contrast, for sad and angry expressions there is little or no change in accuracy over the age range 6–16 years; near-adult levels of competence are established by middle-childhood. In a sampled subset, pubertal status influenced the ability to recognize facial expressions of disgust and anger; there was an increase in competence from mid to late puberty, which occurred independently of age. A small female advantage was found in the recognition of some facial expressions. The normative data provided in this study will aid clinicians and researchers in assessing the emotion recognition abilities of children and will facilitate the identification of abnormalities in a skill that is often impaired in neurodevelopmental disorders. If emotion recognition abilities are a good model with which to understand adolescent development, then these results could have implications for the education, mental health provision and legal treatment of teenagers. PMID:26136697
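
    The modelling approach described in these three records — testing linear age trends in recognition accuracy while controlling for the correlation with IQ — corresponds to an ordinary least-squares regression. The following is an illustrative sketch on simulated data, not the authors' analysis; the variable names and effect sizes are invented.

      # Hedged sketch: linear age trend in emotion recognition accuracy,
      # controlling for IQ, fitted on simulated (not real) data.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 478  # sample size borrowed from the abstract above
      df = pd.DataFrame({
          "age": rng.uniform(6, 16, n),
          "iq": rng.normal(100, 15, n),
      })
      # Simulated accuracy containing an age trend plus an IQ contribution.
      df["accuracy"] = (0.5 + 0.02 * df["age"] + 0.001 * df["iq"]
                        + rng.normal(0, 0.05, n))

      model = smf.ols("accuracy ~ age + iq", data=df).fit()
      print(model.summary())  # the 'age' coefficient tests the linear trend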

  7. Facial and semantic emotional interference: A pilot study on the behavioral and cortical responses to the dual valence association task

    Directory of Open Access Journals (Sweden)

    Petroni Agustín

    2011-04-01

    Background: Integration of compatible or incompatible emotional valence and semantic information is an essential aspect of complex social interactions. A modified version of the Implicit Association Test (IAT), called the Dual Valence Association Task (DVAT), was designed in order to measure conflict resolution processing arising from the compatibility/incompatibility of semantic and facial valence. The DVAT involves two emotional valence evaluative tasks which elicit two forms of emotionally compatible/incompatible associations (facial and semantic). Methods: Behavioural measures and event-related potentials were recorded while participants performed the DVAT. Results: Behavioural data showed a robust effect that distinguished compatible from incompatible tasks. The effects of valence and contextual association (between facial and semantic stimuli) showed early discrimination in the N170 to faces. The LPP component was modulated by the compatibility of the DVAT. Conclusions: Results suggest that the DVAT is a robust paradigm for studying the emotional interference effect in the processing of simultaneous information from semantic and facial stimuli.
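
    Effects such as the N170 and LPP modulations reported here are commonly quantified as mean amplitudes within a-priori time windows. The sketch below shows that computation on a generic trials-by-samples array; the electrode, windows, sampling rate, and random data are placeholders rather than this study's actual parameters.

      # Hedged sketch: mean ERP amplitude per trial within a time window,
      # compared between compatible and incompatible conditions.
      import numpy as np

      def mean_window_amplitude(epochs, times, t_start, t_end):
          """epochs: (n_trials, n_samples) in µV at one electrode;
          times: (n_samples,) in seconds."""
          window = (times >= t_start) & (times <= t_end)
          return epochs[:, window].mean(axis=1)  # one value per trial

      sfreq = 500.0  # hypothetical sampling rate
      times = np.arange(-0.2, 0.8, 1.0 / sfreq)
      compatible = np.random.randn(60, times.size)    # stand-ins for data
      incompatible = np.random.randn(60, times.size)

      # Approximate N170 (~130-200 ms) and LPP (~400-700 ms) windows:
      for name, (t0, t1) in {"N170": (0.13, 0.20), "LPP": (0.40, 0.70)}.items():
          diff = (mean_window_amplitude(incompatible, times, t0, t1).mean()
                  - mean_window_amplitude(compatible, times, t0, t1).mean())
          print(f"{name}: incompatible - compatible = {diff:.2f} µV")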

  8. Recognition of schematic facial displays of emotion in parents of children with autism.

    Science.gov (United States)

    Palermo, Mark T; Pasqualetti, Patrizio; Barbati, Giulia; Intelligente, Fabio; Rossini, Paolo Maria

    2006-07-01

    Performance on an emotional labeling task in response to schematic facial patterns representing five basic emotions without the concurrent presentation of a verbal category was investigated in 40 parents of children with autism and 40 matched controls. 'Autism fathers' performed worse than 'autism mothers', who performed worse than controls in decoding displays representing sadness or disgust. This indicates the need to include facial expression decoding tasks in genetic research of autism. In addition, emotional expression interactions between parents and their children with autism, particularly through play, where affect and prosody are 'physiologically' exaggerated, may stimulate development of social competence. Future studies could benefit from a combination of stimuli including photographs and schematic drawings, with and without associated verbal categories. This may allow the subdivision of patients and relatives on the basis of the amount of information needed to understand and process social-emotionally relevant information.

  9. Recognizing facial expressions of emotion in infancy: A replication and extension.

    Science.gov (United States)

    Safar, Kristina; Moulson, Margaret C

    2017-05-01

    Infants may recognize facial expressions of emotion more readily when familiar faces express the emotions. Studies 1 and 2 investigated whether familiarity influences two metrics of emotion processing: Categorization and spontaneous preference. In Study 1 (n = 32), we replicated previous findings showing an asymmetrical pattern of categorization of happy and fearful faces in 6.5-month-old infants, and extended these findings by demonstrating that infants' categorization did not differ when emotions were expressed by familiar (i.e., caregiver) faces. In Study 2 (n = 34), we replicated the spontaneous preference for fearful over happy expressions in 6.5-month-old infants, and extended these findings by demonstrating that the spontaneous preference for fear was also present for familiar faces. Thus, infants' performance on two metrics of emotion processing did not differ depending on face familiarity. © 2017 Wiley Periodicals, Inc.

  10. Unconscious Processing of Facial Expressions in Individuals with Internet Gaming Disorder

    Directory of Open Access Journals (Sweden)

    Xiaozhe Peng

    2017-06-01

    Internet Gaming Disorder (IGD) is characterized by impairments in social communication and the avoidance of social contact. Facial expression processing is the basis of social communication. However, few studies have investigated how individuals with IGD process facial expressions, and whether they have deficits in emotional facial processing remains unclear. The aim of the present study was to explore these two issues by investigating the time course of emotional facial processing in individuals with IGD. A backward masking task was used to investigate the differences between individuals with IGD and normal controls (NC) in the processing of subliminally presented facial expressions (sad, happy, and neutral) with event-related potentials (ERPs). The behavioral results showed that individuals with IGD are slower than NC in response to both sad and neutral expressions in the sad–neutral context. The ERP results showed that individuals with IGD exhibit decreased amplitudes in the ERP component N170 (an index of early face processing) in response to neutral expressions compared to happy expressions in the happy–neutral expressions context, which might be due to their expectancies for positive emotional content. The NC, on the other hand, exhibited comparable N170 amplitudes in response to both happy and neutral expressions in the happy–neutral expressions context, as well as to sad and neutral expressions in the sad–neutral expressions context. Both individuals with IGD and NC showed comparable ERP amplitudes during the processing of sad expressions and neutral expressions. The present study revealed that individuals with IGD have different unconscious neutral facial processing patterns compared with normal individuals and suggested that individuals with IGD may expect more positive emotion in the happy–neutral expressions context. Highlights: • The present study investigated whether the unconscious processing of facial expressions is influenced by…

  11. Unconscious Processing of Facial Expressions in Individuals with Internet Gaming Disorder.

    Science.gov (United States)

    Peng, Xiaozhe; Cui, Fang; Wang, Ting; Jiao, Can

    2017-01-01

    Internet Gaming Disorder (IGD) is characterized by impairments in social communication and the avoidance of social contact. Facial expression processing is the basis of social communication. However, few studies have investigated how individuals with IGD process facial expressions, and whether they have deficits in emotional facial processing remains unclear. The aim of the present study was to explore these two issues by investigating the time course of emotional facial processing in individuals with IGD. A backward masking task was used to investigate the differences between individuals with IGD and normal controls (NC) in the processing of subliminally presented facial expressions (sad, happy, and neutral) with event-related potentials (ERPs). The behavioral results showed that individuals with IGD are slower than NC in response to both sad and neutral expressions in the sad-neutral context. The ERP results showed that individuals with IGD exhibit decreased amplitudes in ERP component N170 (an index of early face processing) in response to neutral expressions compared to happy expressions in the happy-neutral expressions context, which might be due to their expectancies for positive emotional content. The NC, on the other hand, exhibited comparable N170 amplitudes in response to both happy and neutral expressions in the happy-neutral expressions context, as well as sad and neutral expressions in the sad-neutral expressions context. Both individuals with IGD and NC showed comparable ERP amplitudes during the processing of sad expressions and neutral expressions. The present study revealed that individuals with IGD have different unconscious neutral facial processing patterns compared with normal individuals and suggested that individuals with IGD may expect more positive emotion in the happy-neutral expressions context. • The present study investigated whether the unconscious processing of facial expressions is influenced by excessive online gaming. A validated…
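
    The backward masking procedure used in these two records — a subliminal expression immediately replaced by a neutral mask — hinges on frame-accurate timing. Below is a hedged PsychoPy sketch of one such trial (not the authors' script): on a 60 Hz display one frame lasts about 16.7 ms; the image files, durations, and response keys are hypothetical.

      # Hedged sketch of a backward-masking trial in PsychoPy.
      from psychopy import visual, core, event

      win = visual.Window(size=(1024, 768), color="grey", fullscr=False)
      target = visual.ImageStim(win, image="sad_face.png")    # hypothetical file
      mask = visual.ImageStim(win, image="neutral_mask.png")  # hypothetical file

      def run_trial(target_frames=1, mask_frames=30):
          for _ in range(target_frames):  # 1 frame ≈ 16.7 ms at 60 Hz
              target.draw()
              win.flip()
          for _ in range(mask_frames):    # ~500 ms neutral backward mask
              mask.draw()
              win.flip()
          win.flip()                      # clear the screen
          # Wait for a response; returns [[key, time-since-clock-start]].
          return event.waitKeys(keyList=["f", "j"], timeStamped=core.Clock())

      print(run_trial())
      win.close()
      core.quit()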

  12. Judgments of subtle facial expressions of emotion.

    Science.gov (United States)

    Matsumoto, David; Hwang, Hyisung C

    2014-04-01

    Most studies on judgments of facial expressions of emotion have primarily utilized prototypical, high-intensity expressions. This paper examines judgments of subtle facial expressions of emotion, including not only low-intensity versions of full-face prototypes but also variants of those prototypes. A dynamic paradigm was used in which observers were shown a neutral expression followed by the target expression to judge, and then the neutral expression again, allowing for a simulation of the emergence of the expression from and then return to a baseline. We also examined how signal and intensity clarities of the expressions (explained more fully in the Introduction) were associated with judgment agreement levels. Low-intensity, full-face prototypical expressions of emotion were judged as the intended emotion at rates significantly greater than chance. A number of the proposed variants were also judged as the intended emotions. Both signal and intensity clarities were individually associated with agreement rates; when their interrelationships were taken into account, signal clarity independently predicted agreement rates but intensity clarity did not. The presence or absence of specific muscles appeared to be more important to agreement rates than their intensity levels, with the exception of the intensity of zygomatic major, which was positively correlated with agreement rates for judgments of joy.

  13. Deliberately generated and imitated facial expressions of emotions in people with eating disorders.

    Science.gov (United States)

    Dapelo, Marcela Marin; Bodas, Sergio; Morris, Robin; Tchanturia, Kate

    2016-02-01

    People with eating disorders have difficulties in socio-emotional functioning that could contribute to maintaining the functional consequences of the disorder. This study aimed to explore the ability to deliberately generate (i.e., pose) and imitate facial expressions of emotions in women with anorexia (AN) and bulimia nervosa (BN), compared to healthy controls (HC). One hundred and three participants (36 AN, 25 BN, and 42 HC) were asked to pose and imitate facial expressions of anger, disgust, fear, happiness, and sadness. Their facial expressions were recorded and coded. Participants with eating disorders (both AN and BN) were less accurate than HC when posing facial expressions of emotions. Participants with AN were less accurate compared to HC when imitating facial expressions, whilst BN participants had a middle-range performance. All results remained significant after controlling for anxiety, depression and autistic features. A limitation is the relatively small number of BN participants recruited for this study. The study findings suggest that people with eating disorders, particularly those with AN, have difficulties posing and imitating facial expressions of emotions. These difficulties could have an impact on social communication and social functioning. This is the first study to investigate the ability to pose and imitate facial expressions of emotions in people with eating disorders, and the findings suggest this area should be further explored in future studies. Copyright © 2015. Published by Elsevier B.V.

  14. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    Directory of Open Access Journals (Sweden)

    Letizia Palumbo

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  15. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    Science.gov (United States)

    Palumbo, Letizia; Jellema, Tjeerd

    2013-01-01

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  16. Categorical Perception of Emotional Facial Expressions in Preschoolers

    Science.gov (United States)

    Cheal, Jenna L.; Rutherford, M. D.

    2011-01-01

    Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and tested preschoolers' discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum "felt the…
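
    Categorical perception over a morphed continuum, as tested here, is classically demonstrated by fitting a sigmoid identification function and locating the category boundary, where discrimination should peak. Below is a hedged sketch with made-up response proportions; the continuum endpoints and values are illustrative, not this study's data.

      # Hedged sketch: fit a logistic identification curve over a morph
      # continuum and read off the category boundary.
      import numpy as np
      from scipy.optimize import curve_fit

      morph_level = np.linspace(0.0, 1.0, 9)  # 0 = happy ... 1 = sad (illustrative)
      p_sad = np.array([0.02, 0.05, 0.08, 0.20, 0.55, 0.85, 0.93, 0.97, 0.99])

      def logistic(x, boundary, slope):
          return 1.0 / (1.0 + np.exp(-slope * (x - boundary)))

      (boundary, slope), _ = curve_fit(logistic, morph_level, p_sad, p0=[0.5, 10.0])
      print(f"category boundary at morph level {boundary:.2f}, slope {slope:.1f}")
      # A steep slope, with discrimination best near the boundary, is the
      # classic signature of categorical perception.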

  17. Attention to Facial Emotion Expressions in Children with Autism

    Science.gov (United States)

    Begeer, Sander; Rieffe, Carolien; Terwogt, Mark Meerum; Stockmann, Lex

    2006-01-01

    High-functioning children in the autism spectrum are frequently noted for their impaired attention to facial expressions of emotions. In this study, we examined whether attention to emotion cues in others could be enhanced in children with autism, by varying the relevance of children's attention to emotion expressions. Twenty-eight…

  18. Are you looking at me? The influence of facial orientation and cultural focus salience on the perception of emotion expressions

    Directory of Open Access Journals (Sweden)

    Konstantinos Kafetsios

    2015-12-01

    Full Text Available We examined the influence of cultural orientation salience on the emotion perception process in a contextualized emotion recognition task. We primed individual and collective focus in participants who later rated the emotion expressions of a central character (target showing a happy, sad, angry, or neutral facial expression in a group setting. Facial orientation of a group of four other persons towards the target person was manipulated so that they faced either “inwards,” towards the central character, or “outwards,” towards the observer. Priming a collectivistic mind-set resulted in the perception of more intense emotions in the “inwards” facial orientation condition when the target showed angry, happy, or neutral expressions. Individualist focus influenced emotion perception in the “outwards” facial orientation condition in few cases. The findings highlight the significance of perceivers’ cultural orientation and social elements of the situation for emotion perception in line with the “culture as situated cognition” model.

  19. The Differential Effects of Thalamus and Basal Ganglia on Facial Emotion Recognition

    Science.gov (United States)

    Cheung, Crystal C. Y.; Lee, Tatia M. C.; Yip, James T. H.; King, Kristin E.; Li, Leonard S. W.

    2006-01-01

    This study examined if subcortical stroke was associated with impaired facial emotion recognition. Furthermore, the lateralization of the impairment and the differential profiles of facial emotion recognition deficits with localized thalamic or basal ganglia damage were also studied. Thirty-eight patients with subcortical strokes and 19 matched…

  20. The not face: A grammaticalization of facial expressions of emotion.

    Science.gov (United States)

    Benitez-Quiroz, C Fabian; Wilbur, Ronnie B; Martinez, Aleix M

    2016-05-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3-8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Children's Recognition of Emotional Facial Expressions Through Photographs and Drawings.

    Science.gov (United States)

    Brechet, Claire

    2017-01-01

    The author's purpose was to examine children's recognition of emotional facial expressions, by comparing two types of stimulus: photographs and drawings. The author aimed to investigate whether drawings could be considered a more evocative material than photographs, as a function of age and emotion. Five- and 7-year-old children were presented with photographs and drawings displaying facial expressions of 4 basic emotions (i.e., happiness, sadness, anger, and fear) and were asked to perform a matching task by pointing to the face corresponding to the target emotion labeled by the experimenter. The photographs were selected from the Radboud Faces Database, and the drawings were designed on the basis of both the facial components involved in the expression of these emotions and the graphic cues children tend to use when asked to depict these emotions in their own drawings. Our results show that drawings are better recognized than photographs for sadness, anger, and fear (with no difference for happiness, due to a ceiling effect), and that the difference between the 2 types of stimuli tends to be more pronounced for 5-year-olds than for 7-year-olds. These results are discussed in view of their implications, both for future research and for practical application.

  2. Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions.

    Science.gov (United States)

    Oberman, Lindsay M; Winkielman, Piotr; Ramachandran, Vilayanur S

    2007-01-01

    People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations: (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated the assessed muscles, whereas the chew manipulation activated them only intermittently. Further, expressing happiness generated the most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.

  3. Comparing Facial Emotional Recognition in Patients with Borderline Personality Disorder and Patients with Schizotypal Personality Disorder with a Normal Group.

    Science.gov (United States)

    Farsham, Aida; Abbaslou, Tahereh; Bidaki, Reza; Bozorg, Bonnie

    2017-04-01

    Objective: No research has been conducted on facial emotion recognition in patients with borderline personality disorder (BPD) and schizotypal personality disorder (SPD). The present study aimed at comparing facial emotion recognition in these patients with that of the general population. The neurocognitive processing of emotions can reveal the pathological style of these 2 disorders. Method: Twenty BPD patients, 16 SPD patients, and 20 healthy individuals were selected through convenience sampling. The Structured Clinical Interview for Axis II, the Millon Personality Inventory, the Beck Depression Inventory and a Facial Emotion Recognition Test were administered to all participants. Discussion: The results of one-way ANOVA and Scheffé's post hoc test revealed significant differences in the neuropsychological assessment of facial emotion recognition between BPD and SPD patients and the normal group (p = 0.001). A significant difference was found in recognition of the emotion of fear between the BPD group and the normal population (p = 0.008). A significant difference was observed between SPD patients and the control group in recognition of the emotion of wonder (p = 0.04). The results indicated a deficit in negative emotion recognition, especially of disgust, suggesting that these patients share a similar neurocognitive profile in the emotion domain.
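
    The group comparison named in this record — a one-way ANOVA followed by a post hoc test — looks roughly like the sketch below. The scores are simulated, and Tukey's HSD stands in for Scheffé's test, which neither scipy nor statsmodels ships directly.

      # Hedged sketch: one-way ANOVA across three groups plus a post hoc
      # comparison, on simulated recognition scores.
      import numpy as np
      from scipy.stats import f_oneway
      from statsmodels.stats.multicomp import pairwise_tukeyhsd

      rng = np.random.default_rng(1)
      bpd = rng.normal(60, 8, 20)      # hypothetical recognition scores
      spd = rng.normal(62, 8, 16)
      control = rng.normal(70, 8, 20)

      f_stat, p_value = f_oneway(bpd, spd, control)
      print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

      scores = np.concatenate([bpd, spd, control])
      groups = ["BPD"] * 20 + ["SPD"] * 16 + ["control"] * 20
      print(pairwise_tukeyhsd(scores, groups))  # pairwise post hoc contrasts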

  4. Cultural Dialects of Real and Synthetic Emotional Facial Expressions

    NARCIS (Netherlands)

    Ruttkay, Z.M.

    2009-01-01

    In this paper we discuss the aspects of designing facial expressions for Virtual Humans with a specific culture. First we explore the notion of cultures and its relevance for applications with a Virtual Human. Then we give a general scheme of designing emotional facial expressions, and identify the…

  5. Identification of emotional facial expressions among behaviorally inhibited adolescents with lifetime anxiety disorders

    Science.gov (United States)

    Reeb-Sutherland, Bethany C.; Williams, Lela Rankin; Degnan, Kathryn A.; Pérez-Edgar, Koraly; Chronis-Tuscano, Andrea; Leibenluft, Ellen; Pine, Daniel S.; Pollak, Seth D.; Fox, Nathan A.

    2014-01-01

    The current study examined differences in emotion expression identification between adolescents characterized with behavioral inhibition (BI) in childhood with and without a lifetime history of anxiety disorder. Participants were originally assessed for behavioral inhibition during toddlerhood and for social reticence during childhood. During adolescence, participants returned to the laboratory and completed a facial-emotion identification task and a clinical psychiatric interview. Results revealed that behaviorally inhibited adolescents with a lifetime history of anxiety disorder displayed a lower threshold for identifying fear relative to anger emotion expressions compared to non-anxious behaviorally inhibited adolescents and non-inhibited adolescents with or without anxiety. These findings were specific to behaviorally inhibited adolescents with a lifetime history of social anxiety disorder. Thus, adolescents with a history of both BI and anxiety, specifically social anxiety, are more likely to differ from other adolescents in their identification of fearful facial expressions. This offers further evidence that perturbations in the processing of emotional stimuli may underlie the etiology of anxiety disorders. PMID:24800906

  6. Psychopathic traits in adolescents and recognition of emotion in facial expressions

    Directory of Open Access Journals (Sweden)

    Silvio José Lemos Vasconcellos

    2014-12-01

    Recent studies have investigated the ability of adult psychopaths and children with psychopathic traits to identify specific facial expressions of emotion. Conclusive results have not yet been found regarding whether psychopathic traits are associated with a specific deficit in the ability to identify negative emotions such as fear and sadness. This study compared 20 adolescents with psychopathic traits and 21 adolescents without these traits in terms of their ability to recognize facial expressions of emotion, using facial stimuli presented for 200 ms, 500 ms, and 1 s. Analyses indicated significant differences between the two groups' performances only for fear, and only at the 200 ms exposure. This finding is consistent with findings from other studies in the field and suggests that controlling the duration of exposure to affective stimuli in future studies may help to clarify the mechanisms underlying the facial affect recognition deficits of individuals with psychopathic traits.

  7. Effects of facial emotion recognition remediation on visual scanning of novel face stimuli.

    Science.gov (United States)

    Marsh, Pamela J; Luckett, Gemma; Russell, Tamara; Coltheart, Max; Green, Melissa J

    2012-11-01

    Previous research shows that emotion recognition in schizophrenia can be improved with targeted remediation that draws attention to important facial features (eyes, nose, mouth). Moreover, the effects of training have been shown to last for up to one month after training. The aim of this study was to investigate whether improved emotion recognition of novel faces is associated with concomitant changes in visual scanning of these same novel facial expressions. Thirty-nine participants with schizophrenia received emotion recognition training using Ekman's Micro-Expression Training Tool (METT), with emotion recognition and visual scanpath (VSP) recordings to face stimuli collected simultaneously. Baseline ratings of interpersonal and cognitive functioning were also collected from all participants. Post-METT training, participants showed changes in foveal attention to the features of facial expressions of emotion not used in METT training, which were generally consistent with the information about important features from the METT. In particular, there were changes in how participants looked at the features of facial expressions of emotion surprise, disgust, fear, happiness, and neutral, demonstrating that improved emotion recognition is paralleled by changes in the way participants with schizophrenia viewed novel facial expressions of emotion. However, there were overall decreases in foveal attention to sad and neutral faces that indicate more intensive instruction might be needed for these faces during training. Most importantly, the evidence shows that participant gender may affect training outcomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Distinct facial processing in schizophrenia and schizoaffective disorders

    Science.gov (United States)

    Chen, Yue; Cataldo, Andrea; Norton, Daniel J; Ongur, Dost

    2011-01-01

    Although schizophrenia and schizoaffective disorders have both similar and differing clinical features, it is not well understood whether similar or differing pathophysiological processes mediate patients’ cognitive functions. Using psychophysical methods, this study compared the performances of schizophrenia (SZ) patients, patients with schizoaffective disorder (SA), and a healthy control group in two face-related cognitive tasks: emotion discrimination, which tested perception of facial affect, and identity discrimination, which tested perception of non-affective facial features. Compared to healthy controls, SZ patients, but not SA patients, exhibited deficient performance in both fear and happiness discrimination, as well as identity discrimination. SZ patients, but not SA patients, also showed impaired performance in a theory-of-mind task for which emotional expressions are identified based upon the eye regions of face images. This pattern of results suggests distinct processing of face information in schizophrenia and schizoaffective disorders. PMID:21868199

  9. Perceiving emotions: Cueing social categorization processes and attentional control through facial expressions.

    Science.gov (United States)

    Cañadas, Elena; Lupiáñez, Juan; Kawakami, Kerry; Niedenthal, Paula M; Rodríguez-Bailón, Rosa

    2016-09-01

    Individuals spontaneously categorise other people on the basis of their gender, ethnicity and age. But what about the emotions they express? In two studies we tested the hypothesis that facial expressions are similar to other social categories in that they can function as contextual cues to control attention. In Experiment 1 we associated expressions of anger and happiness with specific proportions of congruent/incongruent flanker trials. We also created consistent and inconsistent category members within each of these two general contexts. The results demonstrated that participants exhibited a larger congruency effect when presented with faces in the emotional group associated with a high proportion of congruent trials. Notably, this effect transferred to inconsistent members of the group. In Experiment 2 we replicated the effects with faces depicting true and false smiles. Together these findings provide consistent evidence that individuals spontaneously utilise emotions to categorise others and that such categories determine the allocation of attentional control.

  10. Alexithymia is associated with attenuated automatic brain response to facial emotion in clinical depression.

    Science.gov (United States)

    Suslow, Thomas; Kugel, Harald; Rufer, Michael; Redlich, Ronny; Dohm, Katharina; Grotegerd, Dominik; Zaremba, Dario; Dannlowski, Udo

    2016-02-04

    Alexithymia is a clinically relevant personality trait related to difficulties in recognizing and describing emotions. Previous studies examining the neural correlates of alexithymia have shown mainly decreased response of several brain areas during emotion processing in healthy samples and patients suffering from autism or post-traumatic stress disorder. In the present study, we examined the effect of alexithymia on automatic brain reactivity to negative and positive facial expressions in clinical depression. Brain activation in response to sad, happy, neutral, and no facial expression (presented for 33 ms and masked by neutral faces) was measured by functional magnetic resonance imaging at 3 T in 26 alexithymic and 26 non-alexithymic patients with major depression. Alexithymic patients manifested less activation in response to masked sad and happy (compared to neutral) faces in right frontal regions and right caudate nuclei than non-alexithymic patients. Our neuroimaging study provides evidence that the personality trait alexithymia has a modulating effect on automatic emotion processing in clinical depression. Our findings support the idea that alexithymia could be associated with functional deficits of the right hemisphere. Future research on the neural substrates of emotion processing in depression should assess and control alexithymia in their analyses.

  11. Facial emotion recognition, socio-occupational functioning and expressed emotions in schizophrenia versus bipolar disorder.

    Science.gov (United States)

    Thonse, Umesh; Behere, Rishikesh V; Praharaj, Samir Kumar; Sharma, Podila Sathya Venkata Narasimha

    2018-06-01

    Facial emotion recognition deficits have been consistently demonstrated in patients with severe mental disorders. Expressed emotion is found to be an important predictor of relapse. However, the relationship between facial emotion recognition abilities and expressed emotions, and its influence on socio-occupational functioning, in schizophrenia versus bipolar disorder has not been studied. In this study we examined 91 patients with schizophrenia and 71 with bipolar disorder for psychopathology, socio-occupational functioning and emotion recognition abilities. Primary caregivers of 62 patients with schizophrenia and 49 with bipolar disorder were assessed on the Family Attitude Questionnaire to assess their expressed emotions. Patients with schizophrenia and bipolar disorder performed similarly on the emotion recognition task. Patients in the schizophrenia group received more critical comments and had poorer socio-occupational functioning than patients with bipolar disorder. Poorer socio-occupational functioning in patients with schizophrenia was significantly associated with greater dissatisfaction in their caregivers. In patients with bipolar disorder, poorer emotion recognition scores significantly correlated with poorer adaptive living skills and greater hostility and dissatisfaction in their caregivers. The findings of our study suggest that emotion recognition abilities in patients with bipolar disorder are associated with negative expressed emotions, leading to problems in adaptive living skills. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Face processing regions are sensitive to distinct aspects of temporal sequence in facial dynamics.

    Science.gov (United States)

    Reinl, Maren; Bartels, Andreas

    2014-11-15

    Facial movement conveys important information for social interactions, yet its neural processing is poorly understood. Computational models propose that shape-sensitive and temporal-sequence-sensitive mechanisms interact in the processing of dynamic faces. While face-processing regions are known to respond to facial movement, their sensitivity to particular temporal sequences has barely been studied. Here we used fMRI to examine the sensitivity of human face-processing regions to two aspects of directionality in facial movement trajectories. We presented genuine movie recordings of increasing and decreasing fear expressions, each of which was played in natural or reversed frame order. This two-by-two factorial design matched low-level visual properties, static content and motion energy within each factor, emotion-direction (increasing or decreasing emotion) and timeline (natural versus artificial). The results showed sensitivity to emotion-direction in the fusiform face area (FFA), which was timeline-dependent as it only occurred within the natural frame order, and sensitivity to timeline in the superior temporal sulcus (STS), which was emotion-direction-dependent as it only occurred for decreasing fear. The occipital face area (OFA) was sensitive to the factor timeline. These findings reveal interacting temporal-sequence-sensitive mechanisms that are responsive both to ecological meaning and to the prototypical unfolding of facial dynamics. These mechanisms are temporally directional, provide socially relevant information regarding emotional state or naturalness of behavior, and agree with predictions from modeling and predictive coding theory. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Facial emotion linked cooperation in patients with paranoid schizophrenia: a test on the Interpersonal Communication Model.

    Science.gov (United States)

    Tse, Wai S; Yan Lu; Bond, Alyson J; Chan, Raymond Ck; Tam, Danny W H

    2011-09-01

    Patients with schizophrenia consistently show deficits in facial affect perception and social behaviours, but it remains unclear whether the former cause the latter. The present research aimed to study how facial affect influences the ingratiation, cooperation and punishment behaviours of these patients. Forty outpatients with paranoid schizophrenia, 26 matched depressed patients and 46 healthy volunteers were recruited. After clinical symptoms and depression were measured, facial emotion recognition, neurocognitive functioning and facial-affect-dependent cooperative behaviour were assessed using a modified version of the Mixed-Motive Game. The depressed control group showed demographic characteristics, depression levels and neurocognitive functioning similar to the schizophrenia group. Patients with schizophrenia committed significantly more errors in neutral face identification than the other two groups and were significantly more punitive on the Mixed-Motive Game in the neutral face condition. Neutral face misidentification was thus a unique emotion-processing deficit in the schizophrenia group. Their increased punitive behaviour in the neutral face condition might confuse their family members and trigger more expressed emotion from them, thereby increasing the risk of relapse. Family members might display more happy faces to promote positive relationships with patients.

  14. Neural bases of different cognitive strategies for facial affect processing in schizophrenia.

    Science.gov (United States)

    Fakra, Eric; Salgado-Pineda, Pilar; Delaveau, Pauline; Hariri, Ahmad R; Blin, Olivier

    2008-03-01

    To examine the neural basis and dynamics of facial affect processing in schizophrenic patients as compared to healthy controls. Fourteen schizophrenic patients and fourteen matched controls performed a facial affect identification task during fMRI acquisition. The emotional task included an intuitive emotional condition (matching emotional faces) and a more cognitively demanding condition (labeling emotional faces). Individual analyses for each emotional condition, and second-level t-tests examining both within- and between-group differences, were carried out using a random effects approach. Psychophysiological interactions (PPI) were tested for variations in functional connectivity between the amygdala and other brain regions as a function of changes in experimental conditions (labeling versus matching). During the labeling condition, both groups engaged similar networks. During the matching condition, schizophrenic patients failed to activate regions of the limbic system implicated in the automatic processing of emotions. PPI revealed an inverse functional connectivity between prefrontal regions and the left amygdala in healthy volunteers, but no such change in patients. Furthermore, during the matching condition, and compared to controls, patients showed decreased activation of regions involved in holistic face processing (fusiform gyrus) and increased activation of regions associated with feature analysis (inferior parietal cortex, left middle temporal lobe, right precuneus). Our findings suggest that schizophrenic patients invariably adopt a cognitive approach when identifying facial affect. The distributed neocortical network observed during the intuitive condition indicates that patients may resort to feature-based, rather than configuration-based, processing, which may constitute a compensatory strategy for limbic dysfunction.
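
    A minimal sketch of the PPI logic referenced above: regress a target region's signal on the task regressor, the seed time series, and their product, and read condition-dependent coupling off the interaction coefficient. The synthetic data, variable names, and the omission of HRF deconvolution are simplifications for illustration, not details from the study.

    ```python
    import numpy as np

    # Psychophysiological interaction (PPI), in miniature: does seed-target
    # coupling differ between two task conditions?
    rng = np.random.default_rng(0)
    n_scans = 200

    task = np.repeat([0.0, 1.0], n_scans // 2)   # hypothetical coding: 0 = matching, 1 = labeling
    task_c = task - task.mean()                  # mean-center the psychological regressor
    seed = rng.standard_normal(n_scans)          # seed (e.g., amygdala) time series
    ppi = task_c * seed                          # the interaction regressor

    # Synthetic "prefrontal" signal whose coupling with the seed flips with condition.
    target = 0.4 * seed - 0.5 * ppi + rng.standard_normal(n_scans)

    # GLM: target ~ intercept + task + seed + task*seed
    X = np.column_stack([np.ones(n_scans), task_c, seed, ppi])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    print(f"interaction (PPI) beta: {beta[3]:.2f}")  # nonzero => condition-dependent connectivity
    ```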

  15. Mapping correspondence between facial mimicry and emotion recognition in healthy subjects.

    Science.gov (United States)

    Ponari, Marta; Conson, Massimiliano; D'Amico, Nunzia Pina; Grossi, Dario; Trojano, Luigi

    2012-12-01

    We aimed at verifying the hypothesis that facial mimicry is causally and selectively involved in emotion recognition. For this purpose, in Experiment 1, we explored the effect of tonic contraction of muscles in the upper or lower half of participants' faces on their ability to recognize emotional facial expressions. We found that the "lower" manipulation specifically impaired recognition of happiness and disgust, the "upper" manipulation impaired recognition of anger, and both manipulations affected recognition of fear; recognition of surprise and sadness was not affected by either blocking manipulation. In Experiment 2, we verified whether emotion recognition is hampered by stimuli in which an upper or lower half-face showing an emotional expression is combined with a neutral half-face. We found that the neutral lower half-face interfered with recognition of happiness and disgust, whereas the neutral upper half impaired recognition of anger; recognition of fear and sadness was impaired by both manipulations, whereas recognition of surprise was not affected by either manipulation. Taken together, the present findings support simulation models of emotion recognition and provide insight into the role of mimicry in comprehension of others' emotional facial expressions. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  16. Facial expressions recognition with an emotion expressive robotic head

    Science.gov (United States)

    Doroftei, I.; Adascalitei, F.; Lefeber, D.; Vanderborght, B.; Doroftei, I. A.

    2016-08-01

    The purpose of this study is to present the preliminary steps in facial expression recognition with a new version of an expressive social robotic head. In a first phase, our main goal was to reach a minimum level of emotional expressiveness, enabling nonverbal communication between robot and human through six basic facial expressions. To evaluate the facial expressions, the robot was used in preliminary user studies with children and adults.

  17. Poor sleep quality predicts deficient emotion information processing over time in early adolescence.

    Science.gov (United States)

    Soffer-Dudek, Nirit; Sadeh, Avi; Dahl, Ronald E; Rosenblat-Stein, Shiran

    2011-11-01

    There is a deepening understanding of the effects of sleep on emotional information processing. Emotion information processing is a key aspect of social competence, which undergoes important maturational and developmental changes in adolescence; however, most research in this area has focused on adults. Our aim was to test the links between sleep and emotion information processing during early adolescence. Sleep and facial information processing were assessed objectively during 3 assessment waves, separated by 1-year lags. Data were obtained in natural environments: sleep was assessed in home settings, and facial information processing was assessed at school. Participants were 94 healthy children (53 girls, 41 boys), aged 10 years at Time 1. Facial information processing was tested under neutral (gender identification) and emotional (emotional expression identification) conditions. Sleep was assessed in home settings using actigraphy for 7 nights at each assessment wave. Waking > 5 min was considered a night awakening. Using multilevel modeling, elevated night awakenings and decreased sleep efficiency significantly predicted poor performance only in the emotional information processing condition (e.g., b = -1.79, SD = 0.52, confidence interval: lower boundary = -2.82, upper boundary = -0.76, t(416.94) = -3.42, P = 0.001). Poor sleep quality is associated with compromised emotional information processing during early adolescence, a sensitive period in socio-emotional development.
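
    To make the analysis concrete, the sketch below fits the kind of multilevel (mixed-effects) model described above, with repeated waves nested within children; the column names and synthetic data are assumptions for illustration, not the authors' dataset or exact model specification.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in for the design: 94 children, 3 annual assessment waves.
    rng = np.random.default_rng(1)
    n_children, n_waves = 94, 3
    df = pd.DataFrame({
        "child_id": np.repeat(np.arange(n_children), n_waves),
        "night_awakenings": rng.poisson(3, n_children * n_waves).astype(float),
        "sleep_efficiency": rng.normal(90.0, 5.0, n_children * n_waves),
    })
    # Build the reported direction of effect into the fake outcome.
    df["emotion_accuracy"] = (70.0 - 1.8 * df["night_awakenings"]
                              + 0.3 * (df["sleep_efficiency"] - 90.0)
                              + rng.normal(0.0, 5.0, len(df)))

    # Random intercept per child; fixed effects for the sleep predictors.
    model = smf.mixedlm("emotion_accuracy ~ night_awakenings + sleep_efficiency",
                        df, groups=df["child_id"])
    print(model.fit().summary())
    ```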

  18. Recognition of emotional facial expressions in adolescents with anorexia nervosa and adolescents with major depression.

    Science.gov (United States)

    Sfärlea, Anca; Greimel, Ellen; Platt, Belinda; Dieler, Alica C; Schulte-Körne, Gerd

    2018-04-01

    Anorexia nervosa (AN) has been suggested to be associated with abnormalities in facial emotion recognition. Most prior studies on facial emotion recognition in AN have investigated adult samples, despite the onset of AN being particularly often during adolescence. In addition, few studies have examined whether impairments in facial emotion recognition are specific to AN or might be explained by frequent comorbid conditions that are also associated with deficits in emotion recognition, such as depression. The present study addressed these gaps by investigating recognition of emotional facial expressions in adolescent girls with AN (n = 26) compared to girls with major depression (MD; n = 26) and healthy girls (HC; n = 37). Participants completed one task requiring identification of emotions (happy, sad, afraid, angry, neutral) in faces and two control tasks. Neither of the clinical groups showed impairments. The AN group was more accurate than the HC group in recognising afraid facial expressions and more accurate than the MD group in recognising happy, sad, and afraid expressions. Misclassification analyses identified subtle group differences in the types of errors made. The results suggest that the deficits in facial emotion recognition found in adult AN samples are not present in adolescent patients. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Facial expression recognition and emotional regulation in narcolepsy with cataplexy.

    Science.gov (United States)

    Bayard, Sophie; Croisier Langenier, Muriel; Dauvilliers, Yves

    2013-04-01

    Cataplexy is pathognomonic of narcolepsy with cataplexy, and is defined as a transient loss of muscle tone triggered by strong emotions. Recent research suggests abnormal amygdala function in narcolepsy with cataplexy. Emotion processing and emotional regulation strategies are complex functions involving cortical and limbic structures, such as the amygdala. As the amygdala has been shown to play a role in facial emotion recognition, we tested the hypothesis that patients with narcolepsy with cataplexy would show impaired recognition of facial emotional expressions compared with patients affected by central hypersomnia without cataplexy and healthy controls. We also aimed to determine whether cataplexy modulates emotional regulation strategies. Emotional intensity, arousal and valence ratings on Ekman faces displaying happiness, surprise, fear, anger, disgust, sadness and neutral expressions of 21 drug-free patients with narcolepsy with cataplexy were compared with those of 23 drug-free sex-, age- and intellectual-level-matched adult patients with hypersomnia without cataplexy and 21 healthy controls. All participants underwent polysomnography recording and multiple sleep latency tests, and completed depression, anxiety and emotional regulation questionnaires. Performance of patients with narcolepsy with cataplexy did not differ from that of patients with hypersomnia without cataplexy or healthy controls, either on intensity rating of each emotion on its prototypical label or on mean ratings for valence and arousal. Moreover, patients with narcolepsy with cataplexy did not use different emotional regulation strategies, and their levels of depressive and anxious symptoms did not differ from those of the other groups. Our results demonstrate that patients with narcolepsy with cataplexy accurately perceive and discriminate facial emotions, and regulate their emotions normally. The absence of alteration of perceived affective valence remains of major clinical interest in narcolepsy with cataplexy.

  20. Maternal Personality and Infants' Neural and Visual Responsivity to Facial Expressions of Emotion

    Science.gov (United States)

    De Haan, Michelle; Belsky, Jay; Reid, Vincent; Volein, Agnes; Johnson, Mark H.

    2004-01-01

    Background: Recent investigations suggest that experience plays an important role in the development of face processing. The aim of this study was to investigate the potential role of experience in the development of the ability to process facial expressions of emotion. Method: We examined the potential role of experience indirectly by…

  1. Temporal lobe structures and facial emotion recognition in schizophrenia patients and nonpsychotic relatives.

    Science.gov (United States)

    Goghari, Vina M; Macdonald, Angus W; Sponheim, Scott R

    2011-11-01

    Temporal lobe abnormalities and emotion recognition deficits are prominent features of schizophrenia and appear related to the diathesis of the disorder. This study investigated whether temporal lobe structural abnormalities were associated with facial emotion recognition deficits in schizophrenia and related to genetic liability for the disorder. Twenty-seven schizophrenia patients, 23 biological family members, and 36 controls participated. Several temporal lobe regions (fusiform, superior temporal, middle temporal, amygdala, and hippocampus) previously associated with face recognition in normative samples and found to be abnormal in schizophrenia were evaluated using volumetric analyses. Participants completed a facial emotion recognition task and an age recognition control task under time-limited and self-paced conditions. Temporal lobe volumes were tested for associations with task performance. Group status explained 23% of the variance in temporal lobe volume. Left fusiform gray matter volume was decreased by 11% in patients and 7% in relatives compared with controls. Schizophrenia patients additionally exhibited smaller hippocampal and middle temporal volumes. Patients were unable to improve facial emotion recognition performance with unlimited time to make a judgment but were able to improve age recognition performance. Patients additionally showed a relationship between reduced temporal lobe gray matter and poor facial emotion recognition. For the middle temporal lobe region, the relationship between greater volume and better task performance was specific to facial emotion recognition and not age recognition. Because schizophrenia patients exhibited a specific deficit in emotion recognition not attributable to a generalized impairment in face perception, impaired emotion recognition may serve as a target for interventions.

  2. Continuous emotion detection using EEG signals and facial expressions

    NARCIS (Netherlands)

    Soleymani, Mohammad; Asghari-Esfeden, Sadjad; Pantic, Maja; Fu, Yun

    Emotions play an important role in how we select and consume multimedia. Recent advances in affect detection have focused on detecting emotions continuously. In this paper, for the first time, we continuously detect valence from electroencephalogram (EEG) signals and facial expressions in response to

  3. Internal representations reveal cultural diversity in expectations of facial expressions of emotion.

    Science.gov (United States)

    Jack, Rachael E; Caldara, Roberto; Schyns, Philippe G

    2012-02-01

    Facial expressions have long been considered the "universal language of emotion." Yet consistent cultural differences in the recognition of facial expressions contradict such notions (e.g., R. E. Jack, C. Blais, C. Scheepers, P. G. Schyns, & R. Caldara, 2009). Rather, culture, as an intricate system of social concepts and beliefs, could generate different expectations (i.e., internal representations) of facial expression signals. To investigate, we used a powerful psychophysical technique (reverse correlation) to estimate the observer-specific internal representations of the 6 basic facial expressions of emotion (i.e., happy, surprise, fear, disgust, anger, and sad) in two culturally distinct groups (i.e., Western Caucasian [WC] and East Asian [EA]). Using complementary statistical image analyses, cultural specificity was directly revealed in these representations. Specifically, whereas WC internal representations predominantly featured the eyebrows and mouth, EA internal representations showed a preference for expressive information in the eye region. Closer inspection of the EA observer preference revealed a surprising feature: changes of gaze direction, shown primarily among the EA group. For the first time, it is revealed directly that culture can finely shape the internal representations of common facial expressions of emotion, challenging notions of a biologically hardwired "universal language of emotion."

  4. Comparing Facial Emotional Recognition in Patients with Borderline Personality Disorder and Patients with Schizotypal Personality Disorder with a Normal Group

    Directory of Open Access Journals (Sweden)

    Aida Farsham

    2017-04-01

    Objective: No research has compared facial emotion recognition in patients with borderline personality disorder (BPD) and patients with schizotypal personality disorder (SPD). The present study aimed to compare facial emotion recognition in these patients with that of the general population, since the neurocognitive processing of emotions can reveal the pathological style of these two disorders. Method: Twenty BPD patients, 16 SPD patients, and 20 healthy individuals were selected through convenience sampling. The Structured Clinical Interview for Axis II, the Millon Personality Inventory, the Beck Depression Inventory and a Facial Emotional Recognition Test were administered to all participants. Results: One-way ANOVA with Scheffé's post hoc tests revealed significant differences in facial emotion recognition between the BPD and SPD patients and the normal group (p = 0.001). A significant difference was found in recognition of fear between the BPD group and the normal group (p = 0.008), and between the SPD patients and the control group in recognition of surprise (p = 0.04). The results indicated a deficit in the recognition of negative emotions, especially disgust; it can thus be concluded that these patients share a similar neurocognitive profile in the emotion domain.

  5. Oxytocin Promotes Facial Emotion Recognition and Amygdala Reactivity in Adults with Asperger Syndrome

    Science.gov (United States)

    Domes, Gregor; Kumbier, Ekkehardt; Heinrichs, Markus; Herpertz, Sabine C

    2014-01-01

    The neuropeptide oxytocin has recently been shown to enhance eye gaze and emotion recognition in healthy men. Here, we report a randomized double-blind, placebo-controlled trial that examined the neural and behavioral effects of a single dose of intranasal oxytocin on emotion recognition in individuals with Asperger syndrome (AS), a clinical condition characterized by impaired eye gaze and facial emotion recognition. Using functional magnetic resonance imaging, we examined whether oxytocin would enhance emotion recognition from facial sections of the eye vs the mouth region and modulate regional activity in brain areas associated with face perception in both adults with AS, and a neurotypical control group. Intranasal administration of the neuropeptide oxytocin improved performance in a facial emotion recognition task in individuals with AS. This was linked to increased left amygdala reactivity in response to facial stimuli and increased activity in the neural network involved in social cognition. Our data suggest that the amygdala, together with functionally associated cortical areas mediate the positive effect of oxytocin on social cognitive functioning in AS. PMID:24067301

  6. Seeing mixed emotions: The specificity of emotion perception from static and dynamic facial expressions across cultures

    NARCIS (Netherlands)

    Fang, X.; Sauter, D.A.; van Kleef, G.A.

    2018-01-01

    Although perceivers often agree about the primary emotion that is conveyed by a particular expression, observers may concurrently perceive several additional emotions from a given facial expression. In the present research, we compared the perception of two types of nonintended emotions in Chinese

  7. Comparing the Recognition of Emotional Facial Expressions in Patients with Obsessive-Compulsive Disorder and Major Depressive Disorder

    Directory of Open Access Journals (Sweden)

    Abdollah Ghasempour

    2014-05-01

    Background: Recognition of emotional facial expressions is one of the psychological factors involved in obsessive-compulsive disorder (OCD) and major depressive disorder (MDD). The aim of the present study was to compare the ability to recognize emotional facial expressions in patients with OCD and patients with MDD. Materials and Methods: The present study is a cross-sectional, ex-post-facto investigation (causal-comparative method). Forty participants (20 patients with OCD, 20 patients with MDD) were selected through convenience sampling from clients referred to the Tabriz Bozorgmehr clinic. Data were collected through the Structured Clinical Interview and the Recognition of Emotional Facial States test, and analyzed using MANOVA. Results: The obtained results showed no significant difference between groups in the mean scores for recognizing surprise, sadness, happiness and fear, but the groups differed significantly in the mean scores for recognizing disgust and anger (p<0.05). Conclusion: Patients suffering from OCD and those suffering from MDD show equal ability to recognize surprise, sadness, happiness and fear. However, the former are less competent in recognizing disgust and anger than the latter.

  8. Effects of cultural characteristics on building an emotion classifier through facial expression analysis

    Science.gov (United States)

    da Silva, Flávio Altinier Maximiano; Pedrini, Helio

    2015-03-01

    Facial expressions are an important outward display of human moods and emotions. Algorithms capable of recognizing facial expressions and associating them with emotions were developed and employed to compare the expressions that different cultural groups use to show their emotions. Static pictures of predominantly occidental and oriental subjects from public datasets were used to train machine learning algorithms; local binary patterns, histograms of oriented gradients (HOG), and Gabor filters were employed to describe the facial expressions for six different basic emotions. The most consistent combination, the HOG descriptor with a support vector machine classifier, was then used to classify the other cultural group: accuracy dropped sharply, meaning that the subtle differences in how each culture expresses emotion affected the classifier's performance. Finally, a classifier was trained with images from both occidental and oriental subjects; its accuracy was higher on multicultural data, evidencing the need for a multicultural training set to build an efficient classifier.
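
    A minimal sketch of the kind of pipeline that performed best above, pairing HOG descriptors with a support vector machine; the image size, HOG parameters, and random placeholder faces are assumptions for illustration, not settings from the paper. Training on one cultural group and testing on the other would expose the reported accuracy drop.

    ```python
    import numpy as np
    from skimage.feature import hog
    from sklearn.svm import SVC

    def hog_features(images):
        """Describe each grayscale face crop with a HOG feature vector."""
        return np.array([hog(img, orientations=9, pixels_per_cell=(8, 8),
                             cells_per_block=(2, 2)) for img in images])

    # Placeholder data: aligned 64x64 grayscale face crops, six emotion labels.
    rng = np.random.default_rng(2)
    faces_train, y_train = rng.random((60, 64, 64)), rng.integers(0, 6, 60)
    faces_test, y_test = rng.random((20, 64, 64)), rng.integers(0, 6, 20)

    clf = SVC(kernel="linear").fit(hog_features(faces_train), y_train)
    print("accuracy:", clf.score(hog_features(faces_test), y_test))
    ```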

  9. Empathy, but not mimicry restriction, influences the recognition of change in emotional facial expressions.

    Science.gov (United States)

    Kosonogov, Vladimir; Titova, Alisa; Vorobyeva, Elena

    2015-01-01

    The current study addressed the hypothesis that empathy and the restriction of observers' facial muscles can influence recognition of emotional facial expressions. A sample of 74 participants identified the subjective onset of emotional facial expressions (anger, disgust, fear, happiness, sadness, surprise, and neutral) in a series of morphed face photographs showing a gradual change (frame by frame) from one expression to another. High-empathy participants (as measured by the Empathy Quotient) recognized the emotional facial expressions earlier in the series than did low-empathy participants, but there was no difference in exploration time. Restricting observers' facial muscles (with adhesive plasters and a stick held in the mouth) did not influence the responses. We discuss these findings in the context of the embodied simulation theory and previous data on empathy.

  10. Mimicking emotions: how 3-12-month-old infants use the facial expressions and eyes of a model.

    Science.gov (United States)

    Soussignan, Robert; Dollion, Nicolas; Schaal, Benoist; Durand, Karine; Reissland, Nadja; Baudouin, Jean-Yves

    2018-06-01

    While there is an extensive literature on the tendency to mimic emotional expressions in adults, it is unclear how this skill emerges and develops over time. Specifically, it is unclear whether infants mimic discrete emotion-related facial actions, whether their facial displays are moderated by contextual cues and whether infants' emotional mimicry is constrained by developmental changes in the ability to discriminate emotions. We therefore investigate these questions using Baby-FACS to code infants' facial displays and eye-movement tracking to examine infants' looking times at facial expressions. Three-, 7-, and 12-month-old participants were exposed to dynamic facial expressions (joy, anger, fear, disgust, sadness) of a virtual model which either looked at the infant or had an averted gaze. Infants did not match emotion-specific facial actions shown by the model, but they produced valence-congruent facial responses to the distinct expressions. Furthermore, only the 7- and 12-month-olds displayed negative responses to the model's negative expressions and they looked more at areas of the face recruiting facial actions involved in specific expressions. Our results suggest that valence-congruent expressions emerge in infancy during a period where the decoding of facial expressions becomes increasingly sensitive to the social signal value of emotions.

  11. Mapping structural covariance networks of facial emotion recognition in early psychosis: A pilot study.

    Science.gov (United States)

    Buchy, Lisa; Barbato, Mariapaola; Makowski, Carolina; Bray, Signe; MacMaster, Frank P; Deighton, Stephanie; Addington, Jean

    2017-11-01

    People with psychosis show deficits in recognizing facial emotions and disrupted activation in the underlying neural circuitry. We evaluated associations between facial emotion recognition and cortical thickness using a correlation-based approach to map structural covariance networks across the brain. Fifteen people with early psychosis provided magnetic resonance scans and completed the Penn Emotion Recognition and Differentiation tasks. Fifteen historical controls provided magnetic resonance scans. Cortical thickness was computed using CIVET and analyzed with linear models. Seed-based structural covariance analysis was performed using the 'mapping anatomical correlations across the cerebral cortex' methodology. To map structural covariance networks involved in facial emotion recognition, the right somatosensory cortex and bilateral fusiform face areas were selected as seeds. Statistics were run in SurfStat. Findings showed greater cortical covariance between the right fusiform face area seed and the right orbitofrontal cortex in controls than in early psychosis subjects. Facial emotion recognition scores were not significantly associated with thickness in any region. A negative effect of Penn Differentiation scores on cortical covariance was seen between the left fusiform face area seed and the right superior parietal lobule in early psychosis subjects. Results suggest that facial emotion recognition ability is related to covariance in a temporal-parietal network in early psychosis. Copyright © 2017 Elsevier B.V. All rights reserved.
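
    The core of seed-based structural covariance is simple: correlate thickness at a seed region with thickness everywhere else across subjects, then compare the resulting maps between groups. Below is a toy numpy version of that step; the array shapes and the seed index are hypothetical, not the study's CIVET/SurfStat pipeline.

    ```python
    import numpy as np

    # Toy seed-based structural covariance: subjects x regions thickness matrix.
    rng = np.random.default_rng(3)
    n_subjects, n_regions = 15, 100
    thickness = rng.normal(3.0, 0.3, (n_subjects, n_regions))
    seed_idx = 0  # hypothetical right fusiform face area seed

    # Pearson correlation of the seed with every region, across subjects.
    z = (thickness - thickness.mean(0)) / thickness.std(0)
    covariance_map = z.T @ z[:, seed_idx] / n_subjects

    # covariance_map[seed_idx] is 1 by construction; the rest is the network.
    print("strongest covariance partner:", np.argsort(covariance_map)[-2])
    ```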

  12. Who do you trust? The impact of facial emotion and behaviour on decision making.

    Science.gov (United States)

    Campellone, Timothy R; Kring, Ann M

    2013-01-01

    During social interactions, we use available information to guide our decisions, including behaviour and emotional displays. In some situations, behaviour and emotional displays may be incongruent, complicating decision making. This study had two main aims: first, to investigate the independent contributions of behaviour and facial displays of emotion to decisions to trust, and, second, to examine what happens when the information signalled by a facial display is incongruent with behaviour. Participants played a modified version of the Trust Game in which they learned simulated players' behaviour with or without concurrent displays of facial emotion. Results indicated that displays of anger, but not happiness, influenced decisions to trust during initial encounters. Over the course of repeated interactions, however, emotional displays consistent with an established pattern of behaviour made independent contributions to decision making, strengthening decisions to trust. When facial display and behaviour were incongruent, participants used current behaviour to inform decision making.

  13. Development of Emotional Facial Recognition in Late Childhood and Adolescence

    Science.gov (United States)

    Thomas, Laura A.; De Bellis, Michael D.; Graham, Reiko; Labar, Kevin S.

    2007-01-01

    The ability to interpret emotions in facial expressions is crucial for social functioning across the lifespan. Facial expression recognition develops rapidly during infancy and improves with age during the preschool years. However, the developmental trajectory from late childhood to adulthood is less clear. We tested older children, adolescents…

  14. The Right Place at the Right Time: Priming Facial Expressions with Emotional Face Components in Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-01-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG’s impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face’s emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG’s performance was strongly influenced by the diagnosticity of the components: His emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. PMID:22349446

  15. The right place at the right time: priming facial expressions with emotional face components in developmental visual agnosia.

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-04-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Robust representation and recognition of facial emotions using extreme sparse learning.

    Science.gov (United States)

    Shojaeilangari, Seyedehsamaneh; Yau, Wei-Yun; Nandakumar, Karthik; Li, Jun; Teoh, Eam Khwang

    2015-07-01

    Recognition of natural emotions from human faces is an interesting topic with a wide range of potential applications, such as human-computer interaction, automated tutoring systems, image and video retrieval, smart environments, and driver warning systems. Traditionally, facial emotion recognition systems have been evaluated on laboratory-controlled data, which is not representative of the environment faced in real-world applications. To robustly recognize facial emotions in real-world natural situations, this paper proposes an approach called extreme sparse learning, which has the ability to jointly learn a dictionary (a set of basis vectors) and a nonlinear classification model. The proposed approach combines the discriminative power of the extreme learning machine with the reconstruction property of sparse representation to enable accurate classification when presented with noisy signals and imperfect data recorded in natural settings. In addition, this paper presents a new local spatio-temporal descriptor that is distinctive and pose-invariant. The proposed framework is able to achieve state-of-the-art recognition accuracy on both acted and spontaneous facial emotion databases.
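
    The paper's extreme sparse learning optimizes the dictionary and a nonlinear (extreme learning machine) classifier jointly; as a rough two-stage stand-in only, the sketch below learns a sparse dictionary over descriptors and then classifies the sparse codes. The descriptor dimensionality and placeholder data are assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import DictionaryLearning
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Placeholder spatio-temporal descriptors and six emotion labels.
    rng = np.random.default_rng(4)
    X, y = rng.standard_normal((120, 50)), rng.integers(0, 6, 120)

    # Stage 1: sparse codes over a learned dictionary; stage 2: a classifier on
    # the codes (unlike the paper, the two stages are not optimized jointly).
    model = make_pipeline(
        DictionaryLearning(n_components=20, transform_algorithm="lasso_lars",
                           max_iter=20, random_state=0),
        LogisticRegression(max_iter=1000),
    )
    model.fit(X, y)
    print("training accuracy:", model.score(X, y))
    ```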

  17. Right Hemispheric Dominance in Processing of Unconscious Negative Emotion

    Science.gov (United States)

    Sato, Wataru; Aoki, Satoshi

    2006-01-01

    Right hemispheric dominance in unconscious emotional processing has been suggested, but remains controversial. This issue was investigated using the subliminal affective priming paradigm combined with unilateral visual presentation in 40 normal subjects. In either left or right visual fields, angry facial expressions, happy facial expressions, or…

  18. Optimal Geometrical Set for Automated Marker Placement to Virtualized Real-Time Facial Emotions.

    Directory of Open Access Journals (Sweden)

    Vasanthan Maruthapillai

    In recent years, real-time face recognition has been a major topic of interest in developing intelligent human-machine interaction systems. Over the past several decades, researchers have proposed different algorithms for facial expression recognition, but there has been little focus on detection in real-time scenarios. The present work proposes a new algorithmic method of automated marker placement used to classify six facial expressions: happiness, sadness, anger, fear, disgust, and surprise. Emotional facial expressions were captured using a webcam, while the proposed algorithm placed a set of eight virtual markers on each subject's face. Facial feature extraction methods, including marker distance (distance between each marker to the center of the face) and change in marker distance (change in distance between the original and new marker positions), were used to extract three statistical features (mean, variance, and root mean square) from the real-time video sequence. The initial position of each marker was subjected to the optical flow algorithm for marker tracking with each emotional facial expression. Finally, the extracted statistical features were mapped into corresponding emotional facial expressions using two simple non-linear classifiers, K-nearest neighbor and probabilistic neural network. The results indicate that the proposed automated marker placement algorithm effectively placed eight virtual markers on each subject's face and gave a maximum mean emotion classification rate of 96.94% using the probabilistic neural network.

  19. Optimal Geometrical Set for Automated Marker Placement to Virtualized Real-Time Facial Emotions.

    Science.gov (United States)

    Maruthapillai, Vasanthan; Murugappan, Murugappan

    2016-01-01

    In recent years, real-time face recognition has been a major topic of interest in developing intelligent human-machine interaction systems. Over the past several decades, researchers have proposed different algorithms for facial expression recognition, but there has been little focus on detection in real-time scenarios. The present work proposes a new algorithmic method of automated marker placement used to classify six facial expressions: happiness, sadness, anger, fear, disgust, and surprise. Emotional facial expressions were captured using a webcam, while the proposed algorithm placed a set of eight virtual markers on each subject's face. Facial feature extraction methods, including marker distance (distance between each marker to the center of the face) and change in marker distance (change in distance between the original and new marker positions), were used to extract three statistical features (mean, variance, and root mean square) from the real-time video sequence. The initial position of each marker was subjected to the optical flow algorithm for marker tracking with each emotional facial expression. Finally, the extracted statistical features were mapped into corresponding emotional facial expressions using two simple non-linear classifiers, K-nearest neighbor and probabilistic neural network. The results indicate that the proposed automated marker placement algorithm effectively placed eight virtual markers on each subject's face and gave a maximum mean emotion classification rate of 96.94% using the probabilistic neural network.
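
    The feature recipe above translates directly into code: track eight markers, convert each frame's positions into two distance series (marker to face centre, and marker displacement from its initial position), and summarize each series by its mean, variance, and root mean square before classification. The sketch below follows that recipe with random placeholder tracks and one of the two named classifiers (K-nearest neighbor); the array shapes are assumptions.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def marker_features(track):
        """track: (n_frames, 8, 2) virtual-marker coordinates (e.g., from optical flow).
        Returns mean, variance and RMS of two distance series per marker (48 values)."""
        centre = track.mean(axis=1, keepdims=True)          # face centre per frame
        dist = np.linalg.norm(track - centre, axis=2)       # marker-to-centre distance
        change = np.linalg.norm(track - track[:1], axis=2)  # displacement from frame 0
        feats = []
        for series in (dist, change):
            feats += [series.mean(0), series.var(0), np.sqrt((series ** 2).mean(0))]
        return np.concatenate(feats)

    # Placeholder data: 30 clips of 40 frames each, six emotion labels.
    rng = np.random.default_rng(5)
    tracks, labels = rng.random((30, 40, 8, 2)), rng.integers(0, 6, 30)

    X = np.array([marker_features(t) for t in tracks])
    clf = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
    print("training accuracy:", clf.score(X, labels))
    ```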

  20. Violent video game players and non-players differ on facial emotion recognition.

    Science.gov (United States)

    Diaz, Ruth L; Wong, Ulric; Hodgins, David C; Chiu, Carina G; Goghari, Vina M

    2016-01-01

    Violent video game playing has been associated with both positive and negative effects on cognition. We examined whether playing two or more hours of violent video games a day, compared to not playing video games, was associated with a different pattern of recognition of five facial emotions, while controlling for general perceptual and cognitive differences that might also occur. Undergraduate students were categorized as violent video game players (n = 83) or non-gamers (n = 69) and completed a facial recognition task consisting of an emotion recognition condition and a control condition of gender recognition. Additionally, participants completed questionnaires assessing their video game and media consumption, aggression, and mood. Violent video game players recognized fearful faces both more accurately and more quickly, and disgusted faces less accurately, than non-gamers. Desensitization to violence, constant exposure to fear and anxiety during game playing, and habituation to unpleasant stimuli are possible mechanisms that could explain these results. Future research should evaluate the effects of violent video game playing on emotion processing and social cognition more broadly. © 2015 Wiley Periodicals, Inc.

  1. Do Dynamic Facial Expressions Convey Emotions to Children Better than Do Static Ones?

    Science.gov (United States)

    Widen, Sherri C.; Russell, James A.

    2015-01-01

    Past research has shown that children recognize emotions from facial expressions poorly and improve only gradually with age, but the stimuli in such studies have been static faces. Because dynamic faces include more information, it may well be that children more readily recognize emotions from dynamic facial expressions. The current study of…

  2. Preschoolers' Faces in Spontaneous Emotional Contexts--How Well Do They Match Adult Facial Expression Prototypes?

    Science.gov (United States)

    Gaspar, Augusta; Esteves, Francisco G.

    2012-01-01

    Prototypical facial expressions of emotion, also known as universal facial expressions, are the underpinnings of most research concerning recognition of emotions in both adults and children. Data on natural occurrences of these prototypes in natural emotional contexts are rare and difficult to obtain in adults. By recording naturalistic…

  3. Children's understanding of facial expression of emotion: II. Drawing of emotion-faces.

    Science.gov (United States)

    Missaghi-Lakshman, M; Whissell, C

    1991-06-01

    Sixty-seven children from Grades 2, 4, and 7 drew faces representing the emotional expressions of fear, anger, surprise, disgust, happiness, and sadness. The children themselves and 29 adults later decoded the drawings in an emotion-recognition task. Children were the more accurate decoders, and their accuracy and the accuracy of adults increased significantly for judgments of 7th-grade drawings. The emotions happy and sad were most accurately decoded. There were no significant differences associated with sex. In their drawings, children utilized a symbol system that seems to be based on a highlighting or exaggeration of features of the innately governed facial expression of emotion.

  4. Ventromedial prefrontal cortex mediates visual attention during facial emotion recognition.

    Science.gov (United States)

    Wolf, Richard C; Philippi, Carissa L; Motzkin, Julian C; Baskaya, Mustafa K; Koenigs, Michael

    2014-06-01

    The ventromedial prefrontal cortex is known to play a crucial role in regulating human social and emotional behaviour, yet the precise mechanisms by which it subserves this broad function remain unclear. Whereas previous neuropsychological studies have largely focused on the role of the ventromedial prefrontal cortex in higher-order deliberative processes related to valuation and decision-making, here we test whether the ventromedial prefrontal cortex may also be critical for more basic aspects of orienting attention to socially and emotionally meaningful stimuli. Using eye tracking during a test of facial emotion recognition in a sample of lesion patients, we show that bilateral ventromedial prefrontal cortex damage impairs visual attention to the eye regions of faces, particularly for fearful faces. This finding demonstrates a heretofore unrecognized function of the ventromedial prefrontal cortex: the basic attentional process of controlling eye movements to faces expressing emotion. © The Author (2014). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Facial Emotion Recognition Performance Differentiates Between Behavioral Variant Frontotemporal Dementia and Major Depressive Disorder.

    Science.gov (United States)

    Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Kressig, Reto W; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    Misdiagnosis of early behavioral variant frontotemporal dementia (bvFTD) as major depressive disorder (MDD) is not uncommon due to overlapping symptoms. The aim of this study was to improve the discrimination between these disorders using a novel facial emotion perception task. In this prospective cohort study (July 2013-March 2016), we compared 25 patients meeting Rascovsky diagnostic criteria for bvFTD, 20 patients meeting DSM-IV criteria for MDD, 21 patients meeting McKhann diagnostic criteria for Alzheimer's disease dementia, and 31 healthy participants on a novel emotion intensity rating task comprising morphed low-intensity facial stimuli. Participants were asked to rate the intensity of morphed faces on the congruent basic emotion (e.g., rating sadness when a sad face is shown) and on the 5 incongruent basic emotions (e.g., rating each of the other basic emotions when a sad face is shown). Whereas bvFTD patients underrated congruent emotions, Alzheimer's disease dementia patients perceived emotions similarly to healthy participants, indicating no impact of cognitive impairment on rating scores. Our congruent and incongruent facial emotion intensity rating task allows a detailed assessment of facial emotion perception in patient populations. By using this simple task, we achieved an almost complete discrimination between bvFTD and MDD, potentially helping improve diagnostic certainty in early bvFTD. © Copyright 2018 Physicians Postgraduate Press, Inc.

  6. Do different fairness contexts and facial emotions motivate 'irrational' social decision-making in major depression? An exploratory patient study.

    Science.gov (United States)

    Radke, Sina; Schäfer, Ina C; Müller, Bernhard W; de Bruijn, Ellen R A

    2013-12-15

    Although 'irrational' decision-making has been linked to depression, the contribution of biases in information processing to these findings remains unknown. To investigate the impact of cognitive biases and aberrant processing of facial emotions on social decision-making, we manipulated both context-related and emotion-related information in a modified Ultimatum Game. Unfair offers were (1) paired with different unselected alternatives, establishing the context in which an offer was made, and (2) accompanied by emotional facial expressions of proposers. Responder behavior was assessed in patients with major depressive disorder and healthy controls. In both groups alike, rejection rates were highest following unambiguous signals of unfairness, i.e. an angry proposer face or when an unfair distribution had deliberately been chosen over an equal split. However, depressed patients showed overall higher rejection rates than healthy volunteers, without exhibiting differential processing biases. This suggests that depressed patients were, as healthy individuals, basing their decisions on informative, salient features and differentiating between (i) fair and unfair offers, (ii) alternatives to unfair offers and (iii) proposers' facial emotions. Although more fundamental processes, e.g. reduced reward sensitivity, might underlie increased rejection in depression, the current study provides insight into mechanisms that shape fairness considerations in both depressed and healthy individuals. © 2013 Elsevier Ireland Ltd. All rights reserved.

  7. Sad or fearful? The influence of body posture on adults' and children's perception of facial displays of emotion.

    Science.gov (United States)

    Mondloch, Catherine J

    2012-02-01

    The current research investigated the influence of body posture on adults' and children's perception of facial displays of emotion. In each of two experiments, participants categorized facial expressions that were presented on a body posture that was congruent (e.g., a sad face on a body posing sadness) or incongruent (e.g., a sad face on a body posing fear). Adults and 8-year-olds made more errors and had longer reaction times on incongruent trials than on congruent trials when judging sad versus fearful facial expressions, an effect that was larger in 8-year-olds. The congruency effect was reduced when faces and bodies were misaligned, providing some evidence for holistic processing. Neither adults nor 8-year-olds were affected by congruency when judging sad versus happy expressions. Evidence that congruency effects vary with age and with similarity of emotional expressions is consistent with dimensional theories and "emotional seed" models of emotion perception. 2011 Elsevier Inc. All rights reserved.

  8. Inferring Emotion from Facial Expressions in Social Contexts : A Role of Self-Construal?

    Directory of Open Access Journals (Sweden)

    Ana Maria Draghici

    2010-01-01

    The study attempted to replicate the findings of Masuda and colleagues (2008), testing a modified hypothesis: when judging people's emotions from facial expressions, interdependence-primed participants, in contrast to independence-primed participants, incorporate information from the social context, i.e., the facial expressions of surrounding people. This was done in order to check whether self-construal could be the main variable behind the cultural differences in emotion perception documented by Masuda and colleagues. Participants viewed cartoon images depicting a central character with a happy, sad, or neutral facial expression, surrounded by other characters expressing graded congruent or incongruent facial expressions. The hypothesis was only partially confirmed, for emotional judgments of a neutral-expression target. Moreover, a closer look at the individual means indicated both assimilation and contrast effects, with no systematic way in which the background characters' facial expressions were incorporated into participants' judgments in either priming group. The results are discussed in terms of priming and priming success, and of possible moderators other than self-construal for the effect found by Masuda and colleagues.

  9. Facial emotion recognition in Williams syndrome and Down syndrome: A matching and developmental study.

    Science.gov (United States)

    Martínez-Castilla, Pastora; Burt, Michael; Borgatti, Renato; Gagliardi, Chiara

    2015-01-01

    In this study both the matching and developmental trajectories approaches were used to clarify questions that remain open in the literature on facial emotion recognition in Williams syndrome (WS) and Down syndrome (DS). The matching approach showed that individuals with WS or DS exhibit neither proficiency for the expression of happiness nor specific impairments for negative emotions. Instead, they present the same pattern of emotion recognition as typically developing (TD) individuals. Thus, the better performance on the recognition of positive compared to negative emotions usually reported in WS and DS is not specific of these populations but seems to represent a typical pattern. Prior studies based on the matching approach suggested that the development of facial emotion recognition is delayed in WS and atypical in DS. Nevertheless, and even though performance levels were lower in DS than in WS, the developmental trajectories approach used in this study evidenced that not only individuals with DS but also those with WS present atypical development in facial emotion recognition. Unlike in the TD participants, where developmental changes were observed along with age, in the WS and DS groups, the development of facial emotion recognition was static. Both individuals with WS and those with DS reached an early maximum developmental level due to cognitive constraints.

  10. Mathematical problems in the application of multilinear models to facial emotion processing experiments

    Science.gov (United States)

    Andersen, Anders H.; Rayens, William S.; Li, Ren-Cang; Blonder, Lee X.

    2000-10-01

    In this paper we describe the enormous potential that multilinear models hold for the analysis of data from neuroimaging experiments that rely on functional magnetic resonance imaging (fMRI) or other imaging modalities. A case is made for why one might fully expect that the successful introduction of these models to the neuroscience community could define the next generation of structure-seeking paradigms in the area. In spite of the potential for immediate application, there is much to do from the perspective of statistical science: although multilinear models have already been particularly successful in chemistry and psychology, relatively little is known about their statistical properties. To that end, our research group at the University of Kentucky has made significant progress. In particular, we are in the process of developing formal influence measures for multilinear methods as well as associated classification models and effective implementations. We believe that these problems will be among the most important and useful to the scientific community. Details are presented herein and an application is given in the context of facial emotion processing experiments.
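
    As a concrete instance of the multilinear models in question, the sketch below fits a CANDECOMP/PARAFAC (CP) decomposition to a 3-way array by alternating least squares; the voxels x time x subjects framing and the toy data are illustrative assumptions, not the authors' analysis.

    ```python
    import numpy as np

    def cp_als(X, rank, n_iter=100, seed=0):
        """Rank-R CP decomposition of a 3-way array by alternating least squares:
        X[i, j, k] ~ sum_r A[i, r] * B[j, r] * C[k, r]."""
        rng = np.random.default_rng(seed)
        A, B, C = (rng.standard_normal((n, rank)) for n in X.shape)
        for _ in range(n_iter):
            # Each update solves the least-squares problem for one factor,
            # holding the other two fixed (Khatri-Rao products via einsum).
            A = np.einsum('ijk,jr,kr->ir', X, B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
            B = np.einsum('ijk,ir,kr->jr', X, A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
            C = np.einsum('ijk,ir,jr->kr', X, A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
        return A, B, C

    # Toy fMRI-like array (voxels x time x subjects) with one planted component.
    rng = np.random.default_rng(6)
    a, b, c = rng.random(30), np.sin(np.linspace(0, 6, 20)), rng.random(10)
    X = np.einsum('i,j,k->ijk', a, b, c) + 0.01 * rng.standard_normal((30, 20, 10))

    A, B, C = cp_als(X, rank=1)
    Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
    print("relative error:", np.linalg.norm(X - Xhat) / np.linalg.norm(X))
    ```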

  11. Children Facial Expression Production: Influence of Age, Gender, Emotion Subtype, Elicitation Condition and Culture

    Directory of Open Access Journals (Sweden)

    Charline Grossard

    2018-04-01

    The production of facial expressions (FEs) is an important skill that allows children to share and adapt emotions with their relatives and peers during social interactions. These skills are impaired in children with Autism Spectrum Disorder. However, the way in which typical children develop and master their production of FEs has still not been clearly assessed. This study aimed to explore factors that could influence the production of FEs in childhood, such as age, gender, emotion subtype (sadness, anger, joy, and neutral), elicitation task (on request, imitation), area of recruitment (French Riviera and Paris) and emotion multimodality. A total of one hundred fifty-seven children aged 6-11 years were enrolled in Nice and Paris, France. We asked them to produce FEs in two different tasks: imitation with an avatar model and production on request without a model. Results from a multivariate analysis revealed that: (1) children performed better with age; (2) positive emotions were easier to produce than negative emotions; (3) children produced FEs better on request (as opposed to imitation); and (4) Riviera children performed better than Parisian children, suggesting regional influences on emotion production. We conclude that facial emotion production is a complex developmental process influenced by several factors that need to be acknowledged in future research.

  12. Instructions to mimic improve facial emotion recognition in people with sub-clinical autism traits.

    Science.gov (United States)

    Lewis, Michael B; Dunn, Emily

    2017-11-01

    People tend to mimic the facial expressions of others. It has been suggested that this helps provide social glue between affiliated people, but it could also aid recognition of emotions through embodied cognition. The degree of facial mimicry, however, varies between individuals and is limited in people with autism spectrum conditions (ASC). The present study sought to investigate the effect of promoting facial mimicry during a facial-emotion-recognition test. In two experiments, participants without an ASC diagnosis had their autism quotient (AQ) measured. Following a baseline test, they completed an emotion-recognition test again, but half of the participants were asked to mimic the target face they saw prior to making their responses. Mimicry improved emotion recognition, and further analysis revealed that the largest improvement was for participants who had higher autism-trait scores. In fact, recognition performance was best overall for people who had high AQ scores but also received the instruction to mimic. Implications for people with ASC are explored.

  13. 3D Face Model Dataset: Automatic Detection of Facial Expressions and Emotions for Educational Environments

    Science.gov (United States)

    Chickerur, Satyadhyan; Joshi, Kartik

    2015-01-01

    Emotion detection using facial images is a technique that researchers have been using for the last two decades to try to analyze a person's emotional state given his/her image. Detection of various kinds of emotion using facial expressions of students in educational environment is useful in providing insight into the effectiveness of tutoring…

  14. Oxytocin improves facial emotion recognition in young adults with antisocial personality disorder.

    Science.gov (United States)

    Timmermann, Marion; Jeung, Haang; Schmitt, Ruth; Boll, Sabrina; Freitag, Christine M; Bertsch, Katja; Herpertz, Sabine C

    2017-11-01

    Deficient facial emotion recognition has been suggested to underlie aggression in individuals with antisocial personality disorder (ASPD). As the neuropeptide oxytocin (OT) has been shown to improve facial emotion recognition, it might also exert beneficial effects in individuals whose behavior causes substantial harm to society. In a double-blind, randomized, placebo-controlled crossover trial, 22 individuals with ASPD and 29 healthy control (HC) subjects (matched for age, sex, intelligence, and education) were intranasally administered either OT (24 IU) or a placebo 45 min before participating in an emotion classification paradigm with fearful, angry, and happy faces. We assessed the number of correct classifications and reaction times as indicators of emotion recognition ability. Significant group × substance × emotion interactions were found in correct classifications and reaction times. Compared to HC, individuals with ASPD showed deficits in recognizing fearful and happy faces; these group differences were no longer observable under OT. Additionally, reaction times for angry faces differed significantly between the ASPD and HC groups in the placebo condition. This effect was mainly driven by longer reaction times in HC subjects after placebo administration compared to OT administration, whereas individuals with ASPD showed, descriptively, the opposite response pattern. Our data indicate an improvement in the recognition of fearful and happy facial expressions by OT in young adults with ASPD. The increased recognition of facial fear is of particular importance, since the correct perception of distress signals in others is thought to inhibit aggression. Beneficial effects of OT might be further mediated by improved recognition of facial happiness, probably reflecting increased social reward responsiveness. Copyright © 2017. Published by Elsevier Ltd.

  15. Abnormal Facial Emotion Recognition in Depression: Serial Testing in an Ultra-Rapid-Cycling Patient.

    Science.gov (United States)

    George, Mark S.; Huggins, Teresa; McDermut, Wilson; Parekh, Priti I.; Rubinow, David; Post, Robert M.

    1998-01-01

    Mood disorder subjects have a selective deficit in recognizing human facial emotion. Whether the facial emotion recognition errors persist during normal mood states (i.e., are state vs. trait dependent) was studied in one male bipolar II patient. Results of five sessions are presented and discussed. (Author/EMK)

  16. Diminished facial emotion expression and associated clinical characteristics in Anorexia Nervosa.

    Science.gov (United States)

    Lang, Katie; Larsson, Emma E C; Mavromara, Liza; Simic, Mima; Treasure, Janet; Tchanturia, Kate

    2016-02-28

    This study aimed to investigate emotion expression in a large group of children, adolescents and adults with Anorexia Nervosa (AN), and to investigate the associated clinical correlates. One hundred and forty-one participants (AN = 66, healthy controls (HC) = 75) were recruited, and positive and negative film clips were used to elicit emotion expressions. The Facial Activation Coding system (FACES) was used to code emotion expression, and subjective ratings of emotion were collected. Individuals with AN displayed fewer positive emotions during the positive film clip compared to HC. There was no significant difference between the groups on the Positive and Negative Affect Scale (PANAS). The AN group displayed emotional incongruence (reporting a different emotion from what would be expected given the stimuli, with limited facial affect to signal the emotion experienced), whereby they reported feeling significantly higher rates of negative emotion during the positive clip. There were no differences in emotion expression between the groups during the negative film clip. Despite this, individuals with AN reported feeling significantly higher levels of negative emotions during the negative clip. Diminished positive emotion expression was associated with more severe clinical symptoms, which could suggest that these individuals represent a group with serious social difficulties that may require specific attention in treatment. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  17. Are facial expressions of emotion produced by categorical affect programs or dynamically driven by appraisal?

    Science.gov (United States)

    Scherer, Klaus R; Ellgring, Heiner

    2007-02-01

    The different assumptions made by discrete and componential emotion theories about the nature of the facial expression of emotion and the underlying mechanisms are reviewed. Explicit and implicit predictions are derived from each model. It is argued that experimental expression-production paradigms rather than recognition studies are required to critically test these differential predictions. Data from a large-scale actor portrayal study are reported to demonstrate the utility of this approach. The frequencies with which 12 professional actors use major facial muscle actions individually and in combination to express 14 major emotions show little evidence for emotion-specific prototypical affect programs. Rather, the results encourage empirical investigation of componential emotion model predictions of dynamic configurations of appraisal-driven adaptive facial actions. (c) 2007 APA, all rights reserved.

  18. The integration of visual context information in facial emotion recognition in 5- to 15-year-olds.

    Science.gov (United States)

    Theurel, Anne; Witt, Arnaud; Malsert, Jennifer; Lejeune, Fleur; Fiorentini, Chiara; Barisnikov, Koviljka; Gentaz, Edouard

    2016-10-01

    The current study investigated the role of congruent visual context information in the recognition of facial emotional expression in 190 participants from 5 to 15 years of age. Children performed a matching task that presented pictures with different facial emotional expressions (anger, disgust, happiness, fear, and sadness) in two conditions: with and without a visual context. The results showed that emotions presented with visual context information were recognized more accurately than those presented in the absence of visual context. The context effect remained steady with age but varied according to the emotion presented and the gender of participants. The findings demonstrate for the first time that children from the age of 5 years are able to integrate facial expression and visual context information, and that this integration improves facial emotion recognition. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Recognition of Facial Expressions and Prosodic Cues with Graded Emotional Intensities in Adults with Asperger Syndrome

    Science.gov (United States)

    Doi, Hirokazu; Fujisawa, Takashi X.; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki

    2013-01-01

    This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group…

  1. When action meets emotions: how facial displays of emotion influence goal-related behavior.

    Science.gov (United States)

    Ferri, Francesca; Stoianov, Ivilin Peev; Gianelli, Claudia; D'Amico, Luigi; Borghi, Anna M; Gallese, Vittorio

    2010-10-01

    Many authors have proposed that facial expressions, by conveying the emotional states of the person we are interacting with, influence interaction behavior. We aimed to verify how specifically an individual's facial expressions of emotion (both their valence and their relevance/specificity for the purpose of the action) affect the way an action aimed at that individual is executed. In addition, we investigated whether and how the effects of emotions on action execution are modulated by participants' empathic attitudes. We used a kinematic approach to analyze the simulation of feeding others, which consisted of recording the "feeding trajectory" by using a computer mouse. Actors could express different highly arousing emotions, namely happiness, disgust, anger, or a neutral expression. Response time was sensitive to the interaction between the valence and the relevance/specificity of the emotion: disgust caused faster responses. In addition, happiness induced slower feeding times and longer times to peak velocity, but only in blocks where it alternated with expressions of disgust. The kinematic profiles described how the effect of the specificity of the emotional context for feeding, namely a modulation of accuracy requirements, occurs. An early acceleration in kinematic relative-to-neutral feeding profiles occurred when actors expressed positive emotions (happiness) in blocks with specific-to-feeding negative emotions (disgust). On the other hand, the end-part of the action was slower when feeding happy as opposed to neutral faces, confirming the increase in accuracy requirements and motor control. These kinematic effects were modulated by participants' empathic attitudes. In conclusion, the social dimension of emotions, that is, their ability to modulate others' action planning/execution, strictly depends on their relevance and specificity to the purpose of the action. This finding argues against a strict distinction between social and nonsocial…

  2. Effects of the BDNF Val66Met polymorphism on neural responses to facial emotion.

    Science.gov (United States)

    Mukherjee, Prerona; Whalley, Heather C; McKirdy, James W; McIntosh, Andrew M; Johnstone, Eve C; Lawrie, Stephen M; Hall, Jeremy

    2011-03-31

    The brain-derived neurotrophic factor (BDNF) Val66Met polymorphism has been associated with affective disorders, but its role in emotion processing has not been fully established. Due to the clinically heterogeneous nature of these disorders, studying the effect of genetic variation in the BDNF gene on a common attribute such as fear processing may elucidate how the BDNF Val66Met polymorphism impacts brain function. Here we use functional magnetic resonance imaging to examine the effect of the BDNF Val66Met genotype on neural activity during fear processing. Forty healthy participants performed an implicit fear task during scanning, in which subjects made gender judgments of facial images with neutral or fearful emotion. Subjects were tested for facial emotion recognition post-scan. Functional connectivity was investigated using psycho-physiological interactions. Subjects were genotyped for the BDNF Val66Met polymorphism and the measures compared between genotype groups. Met carriers showed overactivation in the anterior cingulate cortex (ACC), brainstem and insula bilaterally during fear processing, along with reduced functional connectivity from the ACC to the left hippocampus, and impaired fear recognition ability. The results show that during fear processing, Met allele carriers have an increased neural response in regions previously implicated in mediating autonomic arousal. Further, the Met carriers show decreased functional connectivity with the hippocampus, which may reflect differential retrieval of emotional associations. Together, these effects show significant differences in the neural substrate for fear processing with genetic variation in BDNF. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  3. Interpretation of time dependent facial expressions in terms of emotional stimuli

    NARCIS (Netherlands)

    Gorbunov, R.; Barakova, E.I.; Rauterberg, G.W.M.

    2012-01-01

    In this paper we demonstrate how genetic programming can be used to interpret time-dependent facial expressions in terms of emotional stimuli of different types and intensities. In our analysis we have used video records of facial expressions made during the Mars-500 experiment, in which six…

  4. Facial Expression Emotion Detection for Real-Time Embedded Systems

    Directory of Open Access Journals (Sweden)

    Saeed Turabzadeh

    2018-01-01

    Full Text Available Recently, real-time facial expression recognition has attracted growing research attention. In this study, an automatic real-time facial expression recognition system was built and tested. Firstly, the system and model were designed and tested in a MATLAB environment, followed by a MATLAB Simulink environment capable of recognizing continuous facial expressions in real time at a rate of 1 frame per second, implemented on a desktop PC. The system was evaluated on a public dataset, and the experimental results were promising. The dataset and labels used in this study were made from videos recorded twice from five participants while watching a video. Secondly, in order to run in real time at a faster frame rate, the facial expression recognition system was built on a field-programmable gate array (FPGA). The camera sensor used in this work was a Digilent VmodCAM stereo camera module, and the model was built on the Atlys™ Spartan-6 FPGA development board. The FPGA implementation can continuously perform emotional state recognition in real time at a frame rate of 30 frames per second. A graphical user interface was designed to display the participant's video and the two-dimensional predicted emotion labels in real time.
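
    The abstract does not spell out the feature extraction or classifier used, so the following is only a generic desktop analogue of such a capture-detect-classify loop, written in Python with OpenCV rather than on an FPGA; predict_emotion is a hypothetical stand-in for whatever trained model is plugged in.

        import cv2

        # Standard Haar cascade shipped with OpenCV for frontal-face detection.
        detector = cv2.CascadeClassifier(
            cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

        def predict_emotion(face_48x48):
            """Hypothetical placeholder: a real system would pass the normalized
            face patch to a trained classifier (k-NN, SVM, CNN, ...)."""
            return 'neutral'

        cap = cv2.VideoCapture(0)  # default webcam
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
                face = cv2.resize(gray[y:y+h, x:x+w], (48, 48))
                label = predict_emotion(face)
                cv2.rectangle(frame, (x, y), (x+w, y+h), (0, 255, 0), 2)
                cv2.putText(frame, label, (x, y-8),
                            cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
            cv2.imshow('emotion', frame)
            if cv2.waitKey(1) & 0xFF == ord('q'):  # press q to quit
                break
        cap.release()
        cv2.destroyAllWindows()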

  5. Altering sensorimotor feedback disrupts visual discrimination of facial expressions.

    Science.gov (United States)

    Wood, Adrienne; Lupyan, Gary; Sherrin, Steven; Niedenthal, Paula

    2016-08-01

    Looking at another person's facial expression of emotion can trigger the same neural processes involved in producing the expression, and such responses play a functional role in emotion recognition. Disrupting individuals' facial action, for example, interferes with verbal emotion recognition tasks. We tested the hypothesis that facial responses also play a functional role in the perceptual processing of emotional expressions. We altered the facial action of participants with a gel facemask while they performed a task that involved distinguishing target expressions from highly similar distractors. Relative to control participants, participants in the facemask condition demonstrated inferior perceptual discrimination of facial expressions, but not of nonface stimuli. The findings suggest that somatosensory/motor processes involving the face contribute to the visual perceptual (and not just conceptual) processing of facial expressions. More broadly, our study contributes to growing evidence for the fundamentally interactive nature of the perceptual inputs from different sensory modalities.

  6. Impaired Attribution of Emotion to Facial Expressions in Anxiety and Major Depression

    NARCIS (Netherlands)

    Demenescu, Liliana R.; Kortekaas, Rudie; den Boer, Johan A.; Aleman, Andre

    2010-01-01

    Background: Recognition of others' emotions is an important aspect of interpersonal communication. In major depression, a significant emotion recognition impairment has been reported. It remains unclear whether the ability to recognize emotion from facial expressions is also impaired in anxiety…

  7. Project PAVE (Personality And Vision Experimentation): Role of personal and interpersonal resilience in the perception of emotional facial expression.

    Directory of Open Access Journals (Sweden)

    Michal eTanzer

    2014-08-01

    Full Text Available The aim of the proposed theoretical model is to illuminate personal and interpersonal resilience by drawing from the field of emotional face perception. We suggest that perception/recognition of emotional facial expressions serves as a central link between subjective, self-related processes and the social context. Emotional face perception constitutes a salient social cue underlying interpersonal communication and behavior. Because problems in communication and interpersonal behavior underlie most, if not all, forms of psychopathology, it follows that perception/recognition of emotional facial expressions impacts psychopathology. The ability to accurately interpret another person's facial expression is crucial in subsequently deciding on an appropriate course of action. However, perception in general, and of emotional facial expressions in particular, is highly influenced by an individual's personality and self-concept. Herein we briefly outline well-established theories of personal and interpersonal resilience and link them to the neuro-cognitive basis of face perception. We then describe the findings of our ongoing program of research linking two well-established resilience factors, general self-efficacy (GSE) and perceived social support (PSS), with face perception. We conclude by pointing out avenues for future research focusing on possible genetic markers and patterns of brain connectivity associated with the proposed model. Implications of our integrative model for psychotherapy are discussed.

  8. Exploring the nature of facial affect processing deficits in schizophrenia

    NARCIS (Netherlands)

    Wout, Mascha van 't; Aleman, Andre; Kessels, Roy P. C.; Cahn, Wiepke; Haan, Edward H. F. de; Kahn, Rene S.

    2007-01-01

    Schizophrenia has been associated with deficits in facial affect processing, especially negative emotions. However, the exact nature of the deficit remains unclear. The aim of the present study was to investigate whether schizophrenia patients have problems in automatic allocation of attention as…

  10. When Age Matters: Differences in Facial Mimicry and Autonomic Responses to Peers' Emotions in Teenagers and Adults

    Science.gov (United States)

    Ardizzi, Martina; Sestito, Mariateresa; Martini, Francesca; Umiltà, Maria Alessandra; Ravera, Roberto; Gallese, Vittorio

    2014-01-01

    Age-group membership effects on the explicit recognition of emotional facial expressions have been widely demonstrated. In this study we investigated whether age-group membership could also affect implicit physiological responses, such as facial mimicry and autonomic regulation, to the observation of emotional facial expressions. To this aim, facial electromyography (EMG) and respiratory sinus arrhythmia (RSA) were recorded from teenage and adult participants during the observation of facial expressions performed by teenage and adult models. Results showed that teenagers exhibited greater facial EMG responses to peers' facial expressions, whereas adults showed higher RSA responses to adult facial expressions. The different physiological modalities through which teenagers and adults respond to peers' emotional expressions are likely to reflect two different ways of engaging in social interactions with people of their own age. Findings confirmed that age is an important and powerful social feature that modulates interpersonal interactions by influencing low-level physiological responses. PMID:25337916

  11. Facial affect processing and depression susceptibility: cognitive biases and cognitive neuroscience.

    Science.gov (United States)

    Bistricky, Steven L; Ingram, Rick E; Atchley, Ruth Ann

    2011-11-01

    Facial affect processing is essential to social development and functioning and is particularly relevant to models of depression. Although cognitive and interpersonal theories have long described different pathways to depression, cognitive-interpersonal and evolutionary social risk models of depression focus on the interrelation of interpersonal experience, cognition, and social behavior. We therefore review the burgeoning depressive facial affect processing literature and examine its potential for integrating disciplines, theories, and research. In particular, we evaluate studies in which information processing or cognitive neuroscience paradigms were used to assess facial affect processing in depressed and depression-susceptible populations. Most studies have assessed and supported cognitive models. This research suggests that depressed and depression-vulnerable groups show abnormal facial affect interpretation, attention, and memory, although findings vary based on depression severity, comorbid anxiety, or length of time faces are viewed. Facial affect processing biases appear to correspond with distinct neural activity patterns and increased depressive emotion and thought. Biases typically emerge in depressed moods but are occasionally found in the absence of such moods. Indirect evidence suggests that childhood neglect might cultivate abnormal facial affect processing, which can impede social functioning in ways consistent with cognitive-interpersonal and interpersonal models. However, reviewed studies provide mixed support for the social risk model prediction that depressive states prompt cognitive hypervigilance to social threat information. We recommend prospective interdisciplinary research examining whether facial affect processing abnormalities promote, or are promoted by, depressogenic attachment experiences, negative thinking, and social dysfunction.

  12. Do proposed facial expressions of contempt, shame, embarrassment, and compassion communicate the predicted emotion?

    Science.gov (United States)

    Widen, Sherri C; Christy, Anita M; Hewett, Kristen; Russell, James A

    2011-08-01

    Shame, embarrassment, compassion, and contempt have been considered candidates for the status of basic emotions on the grounds that each has a recognisable facial expression. In two studies (N=88, N=60) on recognition of these four facial expressions, observers showed moderate agreement on the predicted emotion when assessed with forced choice (58%; 42%), but low agreement when assessed with free labelling (18%; 16%). Thus, even though some observers endorsed the predicted emotion when it was presented in a list, over 80% spontaneously interpreted these faces in a way other than the predicted emotion.

  13. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan

    2015-01-01

    The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two… Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions, to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were…
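
    In an oddball design of this kind, the vMMN is typically computed by subtracting the trial-averaged ERP to frequent standard stimuli from the trial-averaged ERP to rare deviant stimuli. A minimal NumPy sketch of that subtraction, with hypothetical epoch counts, channel counts, and window indices, might be:

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical epoched EEG: trials x channels x time samples.
        standard_epochs = rng.standard_normal((180, 64, 300))  # frequent stimuli
        deviant_epochs = rng.standard_normal((20, 64, 300))    # rare stimuli

        # Difference wave: deviant average minus standard average.
        vmmn = deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

        # Mean amplitude per channel in a post-stimulus window of interest
        # (sample indices here are placeholders, not the study's values).
        window_amplitude = vmmn[:, 100:200].mean(axis=1)
        print(window_amplitude.shape)  # (64,)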

  14. The facial and subjective emotional reaction in response to a video game designed to train emotional regulation (Playmancer).

    Science.gov (United States)

    Claes, Laurence; Jiménez-Murcia, Susana; Santamaría, Juan J; Moussa, Maher B; Sánchez, Isabel; Forcano, Laura; Magnenat-Thalmann, Nadia; Konstantas, Dimitri; Overby, Mikkel L; Nielsen, Jeppe; Bults, Richard G A; Granero, Roser; Lam, Tony; Kalapanidas, Elias; Treasure, Janet; Fernández-Aranda, Fernando

    2012-11-01

    Several aspects of social and emotional functioning are abnormal in people with eating disorders (EDs). The aim of the present study was to measure facial emotional expression in patients with EDs and healthy controls (HCs) whilst they played a therapeutic video game (Playmancer) designed to train individuals in emotion regulation. Participants were 23 ED patients (11 with anorexia nervosa (AN), 12 with bulimia nervosa (BN)) and 11 HCs. ED patients self-reported more anger at baseline but showed less facial expression of anger during the Playmancer game. The discrepancy between self-report and non-verbal expression may lead to problems in social communication. Copyright © 2012 John Wiley & Sons, Ltd and Eating Disorders Association.

  15. Individual differences in emotion lateralisation and the processing of emotional information arising from social interactions.

    Science.gov (United States)

    Bourne, Victoria J; Watling, Dawn

    2015-01-01

    Previous research examining the possible association between emotion lateralisation and social anxiety has found conflicting results. In this paper two studies are presented to assess two aspects related to different features of social anxiety: fear of negative evaluation (FNE) and emotion regulation. Lateralisation for the processing of facial emotion was measured using the chimeric faces test. Individuals with greater FNE were more strongly lateralised to the right hemisphere for the processing of anger, happiness and sadness; and, for the processing of fearful faces the relationship was found for females only. Emotion regulation strategies were reduced to two factors: positive strategies and negative strategies. For males, but not females, greater reported use of negative emotion strategies is associated with stronger right hemisphere lateralisation for processing negative emotions. The implications for further understanding the neuropsychological processing of emotion in individuals with social anxiety are discussed.

  16. Differences in holistic processing do not explain cultural differences in the recognition of facial expression.

    Science.gov (United States)

    Yan, Xiaoqian; Young, Andrew W; Andrews, Timothy J

    2017-12-01

    The aim of this study was to investigate the causes of the own-race advantage in facial expression perception. In Experiment 1, we investigated Western Caucasian and Chinese participants' perception and categorization of facial expressions of six basic emotions that included two pairs of confusable expressions (fear and surprise; anger and disgust). People were slightly better at identifying facial expressions posed by own-race members (mainly in anger and disgust). In Experiment 2, we asked whether the own-race advantage was due to differences in the holistic processing of facial expressions. Participants viewed composite faces in which the upper part of one expression was combined with the lower part of a different expression. The upper and lower parts of the composite faces were either aligned or misaligned. Both Chinese and Caucasian participants were better at identifying the facial expressions from the misaligned images, showing interference on recognizing the parts of the expressions created by holistic perception of the aligned composite images. However, this interference from holistic processing was equivalent across expressions of own-race and other-race faces in both groups of participants. Whilst the own-race advantage in recognizing facial expressions does seem to reflect the confusability of certain emotions, it cannot be explained by differences in holistic processing.

  17. Emotional recognition from dynamic facial, vocal and musical expressions following traumatic brain injury.

    Science.gov (United States)

    Drapeau, Joanie; Gosselin, Nathalie; Peretz, Isabelle; McKerral, Michelle

    2017-01-01

    To assess emotion recognition from dynamic facial, vocal and musical expressions in sub-groups of adults with traumatic brain injuries (TBI) of different severities, and to identify possible common underlying mechanisms across domains. Forty-one adults participated in this study: 10 with moderate-severe TBI, nine with complicated mild TBI, 11 with uncomplicated mild TBI and 11 healthy controls, who were administered experimental (emotion recognition, valence-arousal) and control tasks (emotional and structural discrimination) for each domain. Recognition of fearful faces was significantly impaired in the moderate-severe and complicated mild TBI sub-groups, as compared to those with uncomplicated mild TBI and controls. Effect sizes were medium to large. Participants with lower GCS scores performed more poorly when recognizing fearful dynamic facial expressions. Emotion recognition from the auditory domains was preserved following TBI, irrespective of severity. All groups performed equally well on control tasks, indicating no perceptual disorders. Although emotion recognition from vocal and musical expressions was preserved, no correlation was found across auditory domains. This preliminary study may contribute to improving comprehension of emotion recognition following TBI. Future studies of larger samples could usefully include measures of the functional impact of recognition deficits for fearful facial expressions; these could help refine interventions for emotion recognition following a brain injury.

  18. Relations between emotions, display rules, social motives, and facial behaviour.

    Science.gov (United States)

    Zaalberg, Ruud; Manstead, Antony; Fischer, Agneta

    2004-02-01

    We report research on the relations between emotions, display rules, social motives, and facial behaviour. In Study 1 we used a questionnaire methodology to examine how respondents would react to a funny or a not funny joke told to them by a close friend or a stranger. We assessed display rules and motivations for smiling and/or laughing. Display rules and social motives (partly) mediated the relationship between the experimental manipulations and self-reported facial behaviour. Study 2 was a laboratory experiment in which funny or not funny jokes were told to participants by a male or female stranger. Consistent with hypotheses, hearing a funny joke evoked a stronger motivation to share positive affect by showing longer Duchenne smiling. Contrary to hypotheses, a not funny joke did not elicit greater prosocial motivation by showing longer "polite" smiling, although such a smiling pattern did occur. Rated funniness of the joke and the motivation to share positive affect mediated the relationship between the joke manipulation and facial behaviour. Path analysis was used to explore this mediating process in greater detail.

  19. Evidence for Anger Saliency during the Recognition of Chimeric Facial Expressions of Emotions in Underage Ebola Survivors

    Directory of Open Access Journals (Sweden)

    Martina Ardizzi

    2017-06-01

    Full Text Available One of the crucial features defining basic emotions and their prototypical facial expressions is their value for survival. Childhood traumatic experiences affect the effective recognition of facial expressions of negative emotions, which normally allows the recruitment of adequate behavioral responses to environmental threats. Specifically, anger becomes an extraordinarily salient stimulus, unbalancing victims' recognition of negative emotions. Despite the plethora of studies on this topic, to date it is not clear whether this phenomenon reflects an overall response tendency toward anger recognition or a selective proneness to the salience of specific facial expressive cues of anger after trauma exposure. To address this issue, a group of underage Sierra Leonean Ebola virus disease survivors (mean age 15.40 years, SE 0.35; years of schooling 8.8 years, SE 0.46; 14 males) and a control group (mean age 14.55, SE 0.30; years of schooling 8.07 years, SE 0.30; 15 males) performed a forced-choice chimeric facial expression recognition task. The chimeric facial expressions were obtained by pairing upper and lower half-faces of two different negative emotions (selected from anger, fear and sadness), for a total of six different combinations. Overall, results showed that upper facial expressive cues were more salient than lower facial expressive cues. This priority was lost among Ebola virus disease survivors for the chimeric facial expressions of anger: in this case, differently from controls, Ebola virus disease survivors recognized anger regardless of the upper or lower position of the facial expressive cues of this emotion. The present results demonstrate that victims' performance in the recognition of the facial expression of anger does not reflect an overall response tendency toward anger recognition, but rather the specific greater salience of facial expressive cues of anger. Furthermore, the present results show that traumatic experiences deeply modify…

  20. Facial Emotion Recognition Impairments are Associated with Brain Volume Abnormalities in Individuals with HIV

    Science.gov (United States)

    Clark, Uraina S.; Walker, Keenan A.; Cohen, Ronald A.; Devlin, Kathryn N.; Folkers, Anna M.; Pina, Mathew M.; Tashima, Karen T.

    2015-01-01

    Impaired facial emotion recognition abilities in HIV+ patients are well documented, but little is known about the neural etiology of these difficulties. We examined the relation of facial emotion recognition abilities to regional brain volumes in 44 HIV-positive (HIV+) and 44 HIV-negative control (HC) adults. Volumes of structures implicated in HIV-associated neuropathology and emotion recognition were measured on MRI using an automated segmentation tool. Relative to HC, HIV+ patients demonstrated emotion recognition impairments for fearful expressions, reduced anterior cingulate cortex (ACC) volumes, and increased amygdala volumes. In the HIV+ group, fear recognition impairments correlated significantly with ACC, but not amygdala, volumes. ACC reductions were also associated with lower nadir CD4 levels (i.e., greater HIV-disease severity). These findings extend our understanding of the neurobiological substrates underlying an essential social function, facial emotion recognition, in HIV+ individuals and implicate HIV-related ACC atrophy in the impairment of these abilities. PMID:25744868

  1. The familial basis of facial emotion recognition deficits in adolescents with conduct disorder and their unaffected relatives.

    Science.gov (United States)

    Sully, K; Sonuga-Barke, E J S; Fairchild, G

    2015-07-01

    There is accumulating evidence of impairments in facial emotion recognition in adolescents with conduct disorder (CD). However, the majority of studies in this area have only been able to demonstrate an association, rather than a causal link, between emotion recognition deficits and CD. To move closer towards understanding the causal pathways linking emotion recognition problems with CD, we studied emotion recognition in the unaffected first-degree relatives of CD probands, as well as those with a diagnosis of CD. Using a family-based design, we investigated facial emotion recognition in probands with CD (n = 43), their unaffected relatives (n = 21), and healthy controls (n = 38). We used the Emotion Hexagon task, an alternative forced-choice task using morphed facial expressions depicting the six primary emotions, to assess facial emotion recognition accuracy. Relative to controls, the CD group showed impaired recognition of anger, fear, happiness, sadness and surprise (all p…). … These findings indicate that emotion recognition deficits are present in adolescents who are at increased familial risk for developing antisocial behaviour, as well as those who have already developed CD. Consequently, impaired emotion recognition appears to be a viable familial risk marker or candidate endophenotype for CD.

  2. Towards emotion detection in educational scenarios from facial expressions and body movements through multimodal approaches.

    Science.gov (United States)

    Saneiro, Mar; Santos, Olga C; Salmeron-Majadas, Sergio; Boticario, Jesus G

    2014-01-01

    We report current findings from an enriched multimodal emotion detection approach that considers video recordings of facial expressions and body movements to provide affective personalized support in an educational context. In particular, we describe an annotation methodology to tag facial expressions and body movements that correspond to changes in the affective states of learners while dealing with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources, such as qualitative, self-reported, physiological, and behavioral information. Together, these data are used to train data mining algorithms that serve to automatically identify changes in the learners' affective states when dealing with cognitive tasks, which helps to provide emotional personalized support.

  4. Recognition of facial emotion and perceived parental bonding styles in healthy volunteers and personality disorder patients.

    Science.gov (United States)

    Zheng, Leilei; Chai, Hao; Chen, Wanzhen; Yu, Rongrong; He, Wei; Jiang, Zhengyan; Yu, Shaohua; Li, Huichun; Wang, Wei

    2011-12-01

    Early parental bonding experiences play a role in emotion recognition and expression in later adulthood, and patients with personality disorder frequently experience inappropriate parental bonding styles; the aim of the present study was therefore to explore whether parental bonding style is correlated with recognition of facial emotion in personality disorder patients. The Parental Bonding Instrument (PBI) and the Matsumoto and Ekman Japanese and Caucasian Facial Expressions of Emotion (JACFEE) photo set tests were administered to 289 participants. Patients scored lower on parental Care but higher on the parental Freedom Control and Autonomy Denial subscales, and they were less accurate when recognizing contempt, disgust and happiness than the healthy volunteers. In healthy volunteers, maternal Autonomy Denial significantly predicted accuracy in recognizing fear, and maternal Care predicted accuracy in recognizing sadness. In patients, paternal Care negatively predicted accuracy in recognizing anger, paternal Freedom Control predicted the perceived intensity of contempt, and maternal Care predicted both accuracy in recognizing sadness and the perceived intensity of disgust. Parental bonding styles have an impact on the decoding process and sensitivity when recognizing facial emotions, especially in personality disorder patients. © 2011 The Authors. Psychiatry and Clinical Neurosciences © 2011 Japanese Society of Psychiatry and Neurology.

  5. Facing the Problem: Impaired Emotion Recognition During Multimodal Social Information Processing in Borderline Personality Disorder.

    Science.gov (United States)

    Niedtfeld, Inga; Defiebre, Nadine; Regenbogen, Christina; Mier, Daniela; Fenske, Sabrina; Kirsch, Peter; Lis, Stefanie; Schmahl, Christian

    2017-04-01

    Previous research has revealed alterations and deficits in facial emotion recognition in patients with borderline personality disorder (BPD). During interpersonal communication in daily life, social signals such as speech content, variation in prosody, and facial expression need to be considered simultaneously. We hypothesized that deficits in higher level integration of social stimuli contribute to difficulties in emotion recognition in BPD, and heightened arousal might explain this effect. Thirty-one patients with BPD and thirty-one healthy controls were asked to identify emotions in short video clips, which were designed to represent different combinations of the three communication channels: facial expression, speech content, and prosody. Skin conductance was recorded as a measure of sympathetic arousal, while controlling for state dissociation. Patients with BPD showed lower mean accuracy scores than healthy control subjects in all conditions comprising emotional facial expressions. This was true for the condition with facial expression only, and for the combination of all three communication channels. Electrodermal responses were enhanced in BPD only in response to auditory stimuli. In line with the major body of facial emotion recognition studies, we conclude that deficits in the interpretation of facial expressions lead to the difficulties observed in multimodal emotion processing in BPD.

  6. Dynamic Changes in Amygdala Psychophysiological Connectivity Reveal Distinct Neural Networks for Facial Expressions of Basic Emotions.

    Science.gov (United States)

    Diano, Matteo; Tamietto, Marco; Celeghin, Alessia; Weiskrantz, Lawrence; Tatu, Mona-Karina; Bagnis, Arianna; Duca, Sergio; Geminiani, Giuliano; Cauda, Franco; Costa, Tommaso

    2017-03-27

    The quest to characterize the neural signature distinctive of different basic emotions has recently come under renewed scrutiny. Here we investigated whether facial expressions of different basic emotions modulate the functional connectivity of the amygdala with the rest of the brain. To this end, we presented seventeen healthy participants (8 females) with facial expressions of anger, disgust, fear, happiness, sadness and emotional neutrality and analyzed the amygdala's psychophysiological interaction (PPI). PPI can reveal how inter-regional amygdala communications change dynamically with the perception of various emotional expressions, recruiting different brain networks compared to the functional interactions entertained during the perception of neutral expressions. We found that for each emotion the amygdala recruited a distinctive and spatially distributed set of structures to interact with. These changes in amygdala connectivity patterns characterize the dynamic signature prototypical of individual emotion processing, and seemingly represent a neural mechanism that serves to implement the distinctive influence that each emotion exerts on perceptual, cognitive, and motor responses. Beyond these differences, all emotions enhanced amygdala functional integration with premotor cortices compared to neutral faces. The present findings thus converge to reconceptualize the structure-function relation between brain and emotion, moving from the traditional one-to-one mapping toward a network-based and dynamic perspective.
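
    In essence, a PPI analysis tests whether the coupling between a seed region (here the amygdala) and a target voxel changes with the psychological context. In its simplest form, the interaction regressor is the product of the seed time course and a condition regressor, entered into a GLM alongside both main effects. A simplified NumPy sketch with hypothetical data (and ignoring the HRF deconvolution step that standard packages perform) is:

        import numpy as np

        rng = np.random.default_rng(0)
        n_scans = 200
        seed = rng.standard_normal(n_scans)            # amygdala time course (hypothetical)
        task = np.tile([1.0] * 10 + [-1.0] * 10, 10)   # emotion vs. neutral blocks
        ppi = seed * task                              # psychophysiological interaction term

        # GLM at one target voxel: y = b0 + b1*seed + b2*task + b3*ppi + noise.
        y = 0.8 * seed + 0.4 * ppi + rng.standard_normal(n_scans)
        X = np.column_stack([np.ones(n_scans), seed, task, ppi])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(beta)  # beta[3] estimates the context-dependent change in coupling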

  7. Effects of Facial Expressions on Recognizing Emotions in Dance Movements

    Directory of Open Access Journals (Sweden)

    Nao Shikanai

    2011-10-01

    Full Text Available Effects of facial expressions on recognizing emotions expressed in dance movements were investigated. Dancers expressed three emotions: joy, sadness, and anger through dance movements. We used digital video cameras and a 3D motion capture system to record and capture the movements, and then created full-video displays with an expressive face, full-video displays with an unexpressive face, stick-figure displays (no face), and point-light displays (no face) from these data using 3D animation software. To make the point-light displays, 13 markers were attached to the body of each dancer. We examined how accurately observers were able to identify the expression that the dancers intended to create through their dance movements. Observers with and without dance experience participated in the experiment. They watched the movements and rated the compatibility of each emotion with each movement on a 5-point Likert scale. The results indicated that both experienced and inexperienced observers could identify all the emotions that dancers intended to express. Identification scores for dance movements with an expressive face were higher than for the other displays. This finding indicates that facial expressions affect the identification of emotions in dance movements, whereas bodily expressions alone provide sufficient information to recognize emotions.

  8. Does Facial Expression Recognition Provide a Toehold for the Development of Emotion Understanding?

    Science.gov (United States)

    Strand, Paul S.; Downs, Andrew; Barbosa-Leiker, Celestina

    2016-01-01

    The authors explored predictions from basic emotion theory (BET) that facial emotion expression recognition skills are insular with respect to their own development, and yet foundational to the development of emotional perspective-taking skills. Participants included 417 preschool children for whom estimates of these 2 emotion understanding…

  9. Development and validation of an Argentine set of facial expressions of emotion

    NARCIS (Netherlands)

    Vaiman, M.; Wagner, M.A.; Caicedo, E.; Pereno, G.L.

    2017-01-01

    Pictures of facial expressions of emotion are used in a wide range of experiments. The last decade has seen an increase in the number of studies presenting local sets of emotion stimuli. However, only a few existing sets contain pictures of Latin Americans, despite the growing attention emotion…

  10. Recognition of Emotional and Nonemotional Facial Expressions: A Comparison between Williams Syndrome and Autism

    Science.gov (United States)

    Lacroix, Agnes; Guidetti, Michele; Roge, Bernadette; Reilly, Judy

    2009-01-01

    The aim of our study was to compare two neurodevelopmental disorders (Williams syndrome and autism) in terms of the ability to recognize emotional and nonemotional facial expressions. The comparison of these two disorders is particularly relevant to the investigation of face processing and should contribute to a better understanding of social…

  11. Computerized measurement of facial expression of emotions in schizophrenia.

    Science.gov (United States)

    Alvino, Christopher; Kohler, Christian; Barrett, Frederick; Gur, Raquel E; Gur, Ruben C; Verma, Ragini

    2007-07-30

    Deficits in the ability to express emotions characterize several neuropsychiatric disorders and are a hallmark of schizophrenia, and there is a need for a method of quantifying expression, which is currently done by clinical ratings. This paper presents the development and validation of a computational framework for quantifying emotional expression differences between patients with schizophrenia and healthy controls. Each face is modeled as a combination of elastic regions, and expression changes are modeled as a deformation between a neutral face and an expressive face. Functions of these deformations, known as regional volumetric difference (RVD) functions, form distinctive quantitative profiles of expressions. Employing pattern classification techniques, we designed expression classifiers for the four universal emotions of happiness, sadness, anger and fear by training on RVD functions of expression changes. The classifiers were cross-validated and then applied to facial expression images of patients with schizophrenia and healthy controls. The classification score for each image reflects the extent to which the expressed emotion matches the intended emotion. Group-wise statistical analysis revealed this score to be significantly different between healthy controls and patients, especially in the case of anger, and the score correlated with clinical severity of flat affect. These results encourage the use of such deformation-based expression quantification measures for research in clinical applications that require the automated measurement of facial affect.
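
    As a rough illustration of the idea of summarizing a neutral-to-expressive deformation region by region (not the authors' RVD implementation, whose inputs and region definitions are not given here), one could reduce hypothetical landmark displacements to a per-region profile and feed that to a classifier:

        import numpy as np

        rng = np.random.default_rng(0)
        n_landmarks = 68
        neutral = rng.standard_normal((n_landmarks, 2))   # hypothetical 2D landmarks
        expressive = neutral + 0.1 * rng.standard_normal((n_landmarks, 2))

        # Hypothetical assignment of each landmark to one of four facial
        # regions (e.g., brows, eyes, nose, mouth).
        region = rng.integers(0, 4, size=n_landmarks)

        # Deformation field and a per-region scalar summary, loosely
        # analogous to a regional difference profile.
        displacement = np.linalg.norm(expressive - neutral, axis=1)
        profile = np.array([displacement[region == r].mean() for r in range(4)])
        print(profile)  # one deformation score per region; input to a classifier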

  12. Unconscious processing of facial attractiveness: invisible attractive faces orient visual attention.

    Science.gov (United States)

    Hung, Shao-Min; Nieh, Chih-Hsuan; Hsieh, Po-Jang

    2016-11-16

    Past research has demonstrated humans' extraordinary ability to extract information from a face in the blink of an eye, including its emotion, gaze direction, and attractiveness. However, it remains elusive whether facial attractiveness can be processed, and can influence our behavior, in the complete absence of conscious awareness. Here we demonstrate unconscious processing of facial attractiveness with three distinct approaches. In Experiment 1, the time taken for faces to break interocular suppression was measured. The results showed that attractive faces enjoyed the privilege of breaking suppression and reaching consciousness earlier. In Experiment 2, we further showed that attractive faces had lower visibility thresholds, again suggesting that facial attractiveness could be processed more easily to reach consciousness. Crucially, in Experiment 3, a significant decrease in accuracy on an orientation discrimination task following an invisible attractive face showed that attractive faces, albeit suppressed and invisible, still exerted an effect by orienting attention. Taken together, we show for the first time that facial attractiveness can be processed in the complete absence of consciousness, and that an unconscious attractive face is still capable of directing our attention.

  13. Specific biases for identifying facial expression of emotion in children and adolescents with conversion disorders.

    Science.gov (United States)

    Kozlowska, Kasia; Brown, Kerri J; Palmer, Donna M; Williams, Lea M

    2013-04-01

    This study aimed to assess how children and adolescents with conversion disorders identify universal facial expressions of emotion, and to determine whether identification of emotion in faces relates to subjective emotional distress. Fifty-seven participants (41 girls and 16 boys) aged 8.5 to 18 years with conversion disorders and 57 age- and sex-matched healthy controls completed a computerized task in which their accuracy and reaction times for identifying facial expressions were recorded. To isolate the effect of individual emotional expressions, participants' reaction times for each emotion (fear, anger, sadness, disgust, and happiness) were subtracted from their reaction times for the neutral control face. Participants also completed self-report measures of subjective emotional distress. Children/adolescents with conversion disorders showed faster reaction times for identifying expressions of sadness (t(112) = -2.2, p = .03; 444 [609] versus 713 [695], p = .03) and slower reaction times for happy expressions (t(99.3) = 2.28, p ≤ .024; -33 [35] versus 174 [51], p = .024), compared with controls (F(33.75, 419.81) = 3.76, p = .018). There were also no differences in identification accuracy for any emotion (p > .82). The observation of faster reaction times to sad faces in children and adolescents with conversion disorders suggests increased vigilance and motor readiness to emotional signals that are potential threats to self or to close others. These effects may occur before conscious processing.

  14. Perceived differences between chimpanzee (Pan troglodytes) and human (Homo sapiens) facial expressions are related to emotional interpretation.

    Science.gov (United States)

    Waller, Bridget M; Bard, Kim A; Vick, Sarah-Jane; Smith Pasqualini, Marcia C

    2007-11-01

    Human face perception is a finely tuned, specialized process. When comparing faces between species, therefore, it is essential to consider how people make these observational judgments. Comparing facial expressions may be particularly problematic, given that people tend to consider them categorically as emotional signals, which may affect how accurately specific details are processed. The bared-teeth display (BT), observed in most primates, has been proposed as a homologue of the human smile (J. A. R. A. M. van Hooff, 1972). In this study, judgments of similarity between BT displays of chimpanzees (Pan troglodytes) and human smiles varied in relation to perceived emotional valence. When a chimpanzee BT was interpreted as fearful, observers tended to underestimate the magnitude of the relationship between certain features (the extent of lip corner raise) and human smiles. These judgments may reflect the combined effects of categorical emotional perception, configural face processing, and perceptual organization in mental imagery and may demonstrate the advantages of using standardized observational methods in comparative facial expression research. Copyright 2007 APA.

  15. Greater perceptual sensitivity to happy facial expression.

    Science.gov (United States)

    Maher, Stephen; Ekstrom, Tor; Chen, Yue

    2014-01-01

    Perception of subtle facial expressions is essential for social functioning; yet it is unclear if human perceptual sensitivities differ in detecting varying types of facial emotions. Evidence diverges as to whether salient negative versus positive emotions (such as sadness versus happiness) are preferentially processed. Here, we measured perceptual thresholds for the detection of four types of emotion in faces--happiness, fear, anger, and sadness--using psychophysical methods. We also evaluated the association of the perceptual performances with facial morphological changes between neutral and respective emotion types. Human observers were highly sensitive to happiness compared with the other emotional expressions. Further, this heightened perceptual sensitivity to happy expressions can be attributed largely to the emotion-induced morphological change of a particular facial feature (end-lip raise).
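
    A common way to obtain perceptual thresholds of this kind is to fit a psychometric function to detection performance across expression intensities. The sketch below fits a cumulative Gaussian to hypothetical data and may differ from the study's exact psychophysical procedure:

      # Threshold estimation by fitting a cumulative-Gaussian psychometric function.
      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.stats import norm

      def psychometric(intensity, mu, sigma):
          """Probability of detecting the expression at a given intensity."""
          return norm.cdf(intensity, loc=mu, scale=sigma)

      # Hypothetical proportions of correct detections per morph intensity.
      intensity = np.array([0.05, 0.10, 0.20, 0.40, 0.60, 0.80])
      p_detect = np.array([0.08, 0.15, 0.45, 0.80, 0.95, 0.99])

      (mu, sigma), _ = curve_fit(psychometric, intensity, p_detect, p0=[0.3, 0.1])
      print(f"50% detection threshold (mu): {mu:.3f}")  # lower mu = higher sensitivity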

  16. "Now I see it, now I don't": Determining Threshold Levels of Facial Emotion Recognition for Use in Patient Populations.

    Science.gov (United States)

    Chiu, Isabelle; Gfrörer, Regina I; Piguet, Olivier; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    2015-08-01

    The importance of including measures of emotion processing, such as tests of facial emotion recognition (FER), as part of a comprehensive neuropsychological assessment is being increasingly recognized. In clinical settings, FER tests need to be sensitive, short, and easy to administer, given the limited time available and patient limitations. Current tests, however, commonly use stimuli that either display prototypical emotions, bearing the risk of ceiling effects and unequal task difficulty, or are cognitively too demanding and time-consuming. To overcome these limitations in FER testing in patient populations, we aimed to define FER threshold levels for the six basic emotions in healthy individuals. Forty-nine healthy individuals between 52 and 79 years of age were asked to identify the six basic emotions at different intensity levels (25%, 50%, 75%, 100%, and 125% of the prototypical emotion). Analyses uncovered differing threshold levels across emotions and sex of facial stimuli, ranging from 50% up to 100% intensities. Using these findings as "healthy population benchmarks", we propose to apply these threshold levels to clinical populations either as facial emotion recognition or intensity rating tasks. As part of any comprehensive social cognition test battery, this approach should allow for a rapid and sensitive assessment of potential FER deficits.

  17. Asians' Facial Responsiveness to Basic Tastes by Automated Facial Expression Analysis System.

    Science.gov (United States)

    Zhi, Ruicong; Cao, Lianyu; Cao, Gang

    2017-03-01

    Growing evidence shows that consumer choices in real life are driven mostly by unconscious rather than conscious mechanisms, and that these unconscious processes can be captured by behavioral measurements. This study aimed to apply automatic facial expression analysis to represent consumers' emotions and to explore the relationships between sensory perception and facial responses. Basic taste solutions (sourness, sweetness, bitterness, umami, and saltiness) at six concentration levels, plus water, were used, covering most of the tastes found in food and drink. A further contribution of this study is the analysis of the characteristics of facial expressions and of the correlation between facial expressions and perceived hedonic liking in Asian consumers: facial expression research in food science has so far been reported mainly for Western consumers, and few studies have investigated facial responses during food consumption in Asian consumers. Experimental results indicated that facial expressions could discriminate stimuli of different concentrations and different hedonic levels. Perceived liking increased at lower concentrations and decreased at higher concentrations, and samples with medium concentrations were perceived as the most pleasant, except for sweetness and bitterness. High correlations were found between perceived intensities of bitterness, umami, and saltiness and facial reactions of disgust and fear. The facial expressions disgust and anger characterized the emotion "dislike", happiness characterized "like", and neutral represented "neither like nor dislike". The identified facial expressions agree with the perceived sensory emotions elicited by basic taste solutions, and the correlations between hedonic levels and facial expression intensities obtained in this study are in accordance with those reported for Western consumers. © 2017 Institute of Food Technologists®.
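
    The correlation analysis described above pairs automatically scored expression intensities with hedonic ratings; a toy version with hypothetical per-sample means looks like this:

      # Correlating facial-expression intensity with hedonic liking (toy data).
      import numpy as np
      from scipy.stats import pearsonr

      liking = np.array([6.8, 6.1, 5.0, 3.9, 2.7, 1.8])         # 7-point hedonic scale
      disgust = np.array([0.02, 0.05, 0.11, 0.22, 0.35, 0.51])  # mean expression intensity

      r, p = pearsonr(liking, disgust)
      # A strong negative r would mirror the reported link between taste intensity
      # (bitterness, umami, saltiness) and disgust/fear responses.
      print(f"liking vs. disgust intensity: r = {r:.2f}, p = {p:.4f}")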

  18. Individual Differences in the Speed of Facial Emotion Recognition Show Little Specificity but Are Strongly Related with General Mental Speed: Psychometric, Neural and Genetic Evidence

    Directory of Open Access Journals (Sweden)

    Xinyang Liu

    2017-08-01

    Full Text Available Facial identity and facial expression processing are crucial socio-emotional abilities but seem to show only limited psychometric uniqueness when the processing speed is considered in easy tasks. We applied a comprehensive measurement of processing speed and contrasted performance specificity in socio-emotional, social and non-social stimuli from an individual differences perspective. Performance in a multivariate task battery could be best modeled by a general speed factor and a first-order factor capturing some specific variance due to processing emotional facial expressions. We further tested equivalence of the relationships between speed factors and polymorphisms of dopamine and serotonin transporter genes. Results show that the speed factors are not only psychometrically equivalent but invariant in their relation with the Catechol-O-Methyl-Transferase (COMT) Val158Met polymorphism. However, the 5-HTTLPR/rs25531 serotonin polymorphism was related with the first-order factor of emotion perception speed, suggesting a specific genetic correlate of processing emotions. We further investigated the relationship between several components of event-related brain potentials with psychometric abilities, and tested emotion specific individual differences at the neurophysiological level. Results revealed swifter emotion perception abilities to go along with larger amplitudes of the P100 and the Early Posterior Negativity (EPN), when emotion processing was modeled on its own. However, after partialling out the shared variance of emotion perception speed with general processing speed-related abilities, brain-behavior relationships did not remain specific for emotion. Together, the present results suggest that speed abilities are strongly interrelated but show some specificity for emotion processing speed at the psychometric level. At both genetic and neurophysiological levels, emotion specificity depended on whether general cognition is taken into account or not.

  19. Individual Differences in the Speed of Facial Emotion Recognition Show Little Specificity but Are Strongly Related with General Mental Speed: Psychometric, Neural and Genetic Evidence

    Science.gov (United States)

    Liu, Xinyang; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Cai, Xinxia; Wilhelm, Oliver

    2017-01-01

    Facial identity and facial expression processing are crucial socio-emotional abilities but seem to show only limited psychometric uniqueness when the processing speed is considered in easy tasks. We applied a comprehensive measurement of processing speed and contrasted performance specificity in socio-emotional, social and non-social stimuli from an individual differences perspective. Performance in a multivariate task battery could be best modeled by a general speed factor and a first-order factor capturing some specific variance due to processing emotional facial expressions. We further tested equivalence of the relationships between speed factors and polymorphisms of dopamine and serotonin transporter genes. Results show that the speed factors are not only psychometrically equivalent but invariant in their relation with the Catechol-O-Methyl-Transferase (COMT) Val158Met polymorphism. However, the 5-HTTLPR/rs25531 serotonin polymorphism was related with the first-order factor of emotion perception speed, suggesting a specific genetic correlate of processing emotions. We further investigated the relationship between several components of event-related brain potentials with psychometric abilities, and tested emotion specific individual differences at the neurophysiological level. Results revealed swifter emotion perception abilities to go along with larger amplitudes of the P100 and the Early Posterior Negativity (EPN), when emotion processing was modeled on its own. However, after partialling out the shared variance of emotion perception speed with general processing speed-related abilities, brain-behavior relationships did not remain specific for emotion. Together, the present results suggest that speed abilities are strongly interrelated but show some specificity for emotion processing speed at the psychometric level. At both genetic and neurophysiological levels, emotion specificity depended on whether general cognition is taken into account or not.

  20. Behavioural responses to facial and postural expressions of emotion: An interpersonal circumplex approach.

    Science.gov (United States)

    Aan Het Rot, Marije; Enea, Violeta; Dafinoiu, Ion; Iancu, Sorina; Taftă, Steluţa A; Bărbuşelu, Mariana

    2017-11-01

    While the recognition of emotional expressions has been extensively studied, the behavioural response to these expressions has not. In the interpersonal circumplex, behaviour is defined in terms of communion and agency. In this study, we examined behavioural responses to both facial and postural expressions of emotion. We presented 101 Romanian students with facial and postural stimuli involving individuals ('targets') expressing happiness, sadness, anger, or fear. Using an interpersonal grid, participants simultaneously indicated how communal (i.e., quarrelsome or agreeable) and agentic (i.e., dominant or submissive) they would be towards people displaying these expressions. Participants were agreeable-dominant towards targets showing happy facial expressions and primarily quarrelsome towards targets with angry or fearful facial expressions. Responses to targets showing sad facial expressions were neutral on both dimensions of interpersonal behaviour. Postural versus facial expressions of happiness and anger elicited similar behavioural responses. Participants responded in a quarrelsome-submissive way to fearful postural expressions and in an agreeable way to sad postural expressions. Behavioural responses to the various facial expressions were largely comparable to those previously observed in Dutch students. Observed differences may be explained by participants' cultural background. Responses to the postural expressions largely matched responses to the facial expressions. © 2017 The British Psychological Society.

  1. Facial Emotion Recognition Impairment in Patients with Parkinson's Disease and Isolated Apathy

    Directory of Open Access Journals (Sweden)

    Mercè Martínez-Corral

    2010-01-01

    Full Text Available Apathy is a frequent feature of Parkinson's disease (PD), usually related to executive dysfunction. However, in a subgroup of PD patients apathy may represent the only or predominant neuropsychiatric feature. To understand the mechanisms underlying apathy in PD, we investigated emotional processing in PD patients with and without apathy and in healthy controls (HC), assessed by a facial emotion recognition task (FERT). We excluded PD patients with cognitive impairment, depression, other affective disturbances, and previous surgery for PD. PD patients with apathy scored significantly worse on the FERT, performing worse in fear, anger, and sadness recognition. No differences, however, were found between nonapathetic PD patients and HC. These findings suggest the existence of a disruption of emotional-affective processing in cognitively preserved PD patients with apathy. Identifying specific dysfunction of limbic structures in PD patients with isolated apathy may have therapeutic and prognostic implications.

  2. Space-by-time manifold representation of dynamic facial expressions for emotion categorization

    Science.gov (United States)

    Delis, Ioannis; Chen, Chaona; Jack, Rachael E.; Garrod, Oliver G. B.; Panzeri, Stefano; Schyns, Philippe G.

    2016-01-01

    Visual categorization is the brain computation that reduces high-dimensional information in the visual environment into a smaller set of meaningful categories. An important problem in visual neuroscience is to identify the visual information that the brain must represent and then use to categorize visual inputs. Here we introduce a new mathematical formalism—termed space-by-time manifold decomposition—that describes this information as a low-dimensional manifold separable in space and time. We use this decomposition to characterize the representations used by observers to categorize the six classic facial expressions of emotion (happy, surprise, fear, disgust, anger, and sad). By means of a Generative Face Grammar, we presented random dynamic facial movements on each experimental trial and used subjective human perception to identify the facial movements that correlate with each emotion category. When the random movements projected onto the categorization manifold region corresponding to one of the emotion categories, observers categorized the stimulus accordingly; otherwise they selected “other.” Using this information, we determined both the Action Unit and temporal components whose linear combinations lead to reliable categorization of each emotion. In a validation experiment, we confirmed the psychological validity of the resulting space-by-time manifold representation. Finally, we demonstrated the importance of temporal sequencing for accurate emotion categorization and identified the temporal dynamics of Action Unit components that cause typical confusions between specific emotions (e.g., fear and surprise) as well as those resolving these confusions. PMID:27305521
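
    The core idea of a space-by-time decomposition, factoring a time-by-Action-Unit movement matrix into a few temporal and spatial components, can be conveyed with a plain truncated SVD on simulated data; this toy sketch is not the authors' decomposition algorithm:

      # Toy separable space-by-time factorization of a (time x Action Unit) matrix.
      import numpy as np

      rng = np.random.default_rng(1)
      n_time, n_au, k = 50, 42, 3  # time samples, Action Units, components

      # Simulated trial built from k ground-truth temporal and spatial components.
      temporal = np.abs(rng.normal(size=(n_time, k)))
      spatial = np.abs(rng.normal(size=(k, n_au)))
      trial = temporal @ spatial + 0.01 * rng.normal(size=(n_time, n_au))

      # Truncated SVD gives the best rank-k separable approximation.
      U, s, Vt = np.linalg.svd(trial, full_matrices=False)
      approx = (U[:, :k] * s[:k]) @ Vt[:k]  # low-dimensional space-by-time representation
      err = np.linalg.norm(trial - approx) / np.linalg.norm(trial)
      print(f"rank-{k} relative reconstruction error: {err:.3%}")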

  3. The Infant Facial Expressions of Emotions from Looking at Pictures. Peruvian version

    Directory of Open Access Journals (Sweden)

    Pierina Traverso

    2012-12-01

    Full Text Available The Peruvian version of the Infant Facial Expressions of Emotions from Looking at Pictures (IFEEL), an instrument that assesses the interpretation of emotions from pictures of children's faces, is presented. The original version by Emde, Osofsky & Butterfield (1993) was developed in the United States and involves 30 stimuli. The Peruvian version involves 25 pictures of children with facial features prototypical of the majority of the Peruvian population. A sample of 363 men and women of middle and low socio-economic status, between 19 and 45 years old, was recruited to develop the Peruvian version. From the results, a lexicon was created from the words that participants used to designate the 14 groups of emotion that were obtained. The majority of these groups showed adequate reliability in terms of temporal stability. Finally, socio-economic status (SES) was found to generate significant differences in how people interpret the emotions; referential values for differentiated interpretation were therefore created based on this variable.

  4. Impaired recognition of happy facial expressions in bipolar disorder.

    Science.gov (United States)

    Lawlor-Savage, Linette; Sponheim, Scott R; Goghari, Vina M

    2014-08-01

    The ability to accurately judge facial expressions is important in social interactions. Individuals with bipolar disorder have been found to be impaired in emotion recognition; however, the specifics of the impairment are unclear. This study investigated whether facial emotion recognition difficulties in bipolar disorder reflect general cognitive, or emotion-specific, impairments. Impairment in the recognition of particular emotions and the role of processing speed in facial emotion recognition were also investigated. Clinically stable bipolar patients (n = 17) and healthy controls (n = 50) judged five facial expressions in two presentation types, time-limited and self-paced. An age recognition condition was used as an experimental control. Bipolar patients' overall facial recognition ability was unimpaired. However, patients' specific ability to judge happy expressions under time constraints was impaired. Findings suggest a deficit in happy emotion recognition impacted by processing speed. Given the limited sample size, further investigation with a larger patient sample is warranted.

  5. Facial skin blood flow responses during exposures to emotionally charged movies.

    Science.gov (United States)

    Matsukawa, Kanji; Endo, Kana; Ishii, Kei; Ito, Momoka; Liang, Nan

    2018-03-01

    The changes in regional facial skin blood flow and vascular conductance have been assessed for the first time with noninvasive two-dimensional laser speckle flowmetry during audiovisually elicited emotional challenges for 2 min (comedy, landscape, and horror movie) in 12 subjects. Limb skin blood flow and vascular conductance and systemic cardiovascular variables were simultaneously measured. The extents of pleasantness and consciousness for each emotional stimulus were estimated by the subjective rating from -5 (the most unpleasant; the most unconscious) to +5 (the most pleasant; the most conscious). Facial skin blood flow and vascular conductance, especially in the lips, decreased during viewing of comedy and horror movies, whereas they did not change during viewing of a landscape movie. The decreases in facial skin blood flow and vascular conductance were the greatest with the comedy movie. The changes in lip, cheek, and chin skin blood flow negatively correlated (P < 0.05) with the subjective ratings of pleasantness and consciousness. The changes in lip skin vascular conductance negatively correlated (P < 0.05) with the subjective rating of pleasantness, while the changes in infraorbital, subnasal, and chin skin vascular conductance negatively correlated (P < 0.05) with the subjective rating of consciousness. However, none of the changes in limb skin blood flow and vascular conductance and systemic hemodynamics correlated with the subjective ratings. The mental arithmetic task did not alter facial and limb skin blood flows, although the task influenced systemic cardiovascular variables. These findings suggest that the more emotional status becomes pleasant or conscious, the more neurally mediated vasoconstriction may occur in facial skin blood vessels.

  6. Effects of task demands on the early neural processing of fearful and happy facial expressions.

    Science.gov (United States)

    Itier, Roxane J; Neath-Tavares, Karly N

    2017-05-15

    Task demands shape how we process environmental stimuli, but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy and neutral facial expressions presented during gender discrimination, explicit emotion discrimination and oddball detection tasks, the most studied tasks in the field. Using an eye tracker, fixation on the nose of the face was enforced using a gaze-contingent presentation. Task demands modulated amplitudes from 200 to 350 ms at occipito-temporal sites spanning the EPN component. Amplitudes were more negative for fearful than neutral expressions starting with the N170, from 150 to 350 ms, with a temporo-occipital distribution, whereas no clear effect of happy expressions was seen. Task and emotion effects never interacted in any time window or for any of the ERP components analyzed (P1, N170, EPN). Thus, whether emotion is explicitly discriminated or irrelevant to the task at hand, neural correlates of fearful and happy facial expressions seem immune to these task demands during the first 350 ms of visual processing. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Lower Sensitivity to Happy and Angry Facial Emotions in Young Adults with Psychiatric Problems

    Science.gov (United States)

    Vrijen, Charlotte; Hartman, Catharina A.; Lodder, Gerine M. A.; Verhagen, Maaike; de Jonge, Peter; Oldehinkel, Albertine J.

    2016-01-01

    Many psychiatric problem domains have been associated with emotion-specific biases or general deficiencies in facial emotion identification. However, both within and between psychiatric problem domains, large variability exists in the types of emotion identification problems that were reported. Moreover, since the domain-specificity of the findings was often not addressed, it remains unclear whether patterns found for specific problem domains can be better explained by co-occurrence of other psychiatric problems or by more generic characteristics of psychopathology, for example, problem severity. In this study, we aimed to investigate associations between emotion identification biases and five psychiatric problem domains, and to determine the domain-specificity of these biases. Data were collected as part of the ‘No Fun No Glory’ study and involved 2,577 young adults. The study participants completed a dynamic facial emotion identification task involving happy, sad, angry, and fearful faces, and filled in the Adult Self-Report Questionnaire, of which we used the scales depressive problems, anxiety problems, avoidance problems, Attention-Deficit Hyperactivity Disorder (ADHD) problems and antisocial problems. Our results suggest that participants with antisocial problems were significantly less sensitive to happy facial emotions, participants with ADHD problems were less sensitive to angry emotions, and participants with avoidance problems were less sensitive to both angry and happy emotions. These effects could not be fully explained by co-occurring psychiatric problems. Whereas this seems to indicate domain-specificity, inspection of the overall pattern of effect sizes regardless of statistical significance reveals generic patterns as well, in that for all psychiatric problem domains the effect sizes for happy and angry emotions were larger than the effect sizes for sad and fearful emotions. As happy and angry emotions are strongly associated with approach and avoidance tendencies, these generic patterns may reflect mechanisms shared across psychiatric problem domains.

  8. Lower sensitivity to happy and angry facial emotions in young adults with psychiatric problems

    Directory of Open Access Journals (Sweden)

    Charlotte Vrijen

    2016-11-01

    Full Text Available Many psychiatric problem domains have been associated with emotion-specific biases or general deficiencies in facial emotion identification. However, both within and between psychiatric problem domains, large variability exists in the types of emotion identification problems that were reported. Moreover, since the domain-specificity of the findings was often not addressed, it remains unclear whether patterns found for specific problem domains can be better explained by co-occurrence of other psychiatric problems or by more generic characteristics of psychopathology, for example, problem severity. In this study, we aimed to investigate associations between emotion identification biases and five psychiatric problem domains, and to determine the domain-specificity of these biases. Data were collected as part of the ‘No Fun No Glory’ study and involved 2,577 young adults. The study participants completed a dynamic facial emotion identification task involving happy, sad, angry, and fearful faces, and filled in the Adult Self-Report Questionnaire, of which we used the scales depressive problems, anxiety problems, avoidance problems, Attention-Deficit Hyperactivity Disorder (ADHD) problems and antisocial problems. Our results suggest that participants with antisocial problems were significantly less sensitive to happy facial emotions, participants with ADHD problems were less sensitive to angry emotions, and participants with avoidance problems were less sensitive to both angry and happy emotions. These effects could not be fully explained by co-occurring psychiatric problems. Whereas this seems to indicate domain-specificity, inspection of the overall pattern of effect sizes regardless of statistical significance reveals generic patterns as well, in that for all psychiatric problem domains the effect sizes for happy and angry emotions were larger than the effect sizes for sad and fearful emotions. As happy and angry emotions are strongly associated with approach and avoidance tendencies, these generic patterns may reflect mechanisms shared across psychiatric problem domains.

  9. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions

    Science.gov (United States)

    Kujala, Miiamaaria V.; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans but not dogs higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expressions in a similar manner, and that the perception of both species is influenced by psychological factors of the evaluators. Empathy especially affects both the speed and intensity of rating dogs' emotional facial expressions. PMID:28114335

  10. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions.

    Directory of Open Access Journals (Sweden)

    Miiamaaria V Kujala

    Full Text Available Facial expressions are important for humans in communicating emotions to conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans but not dogs higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expressions in a similar manner, and that the perception of both species is influenced by psychological factors of the evaluators. Empathy especially affects both the speed and intensity of rating dogs' emotional facial expressions.

  11. Multilevel analysis of facial expressions of emotion and script: self-report (arousal and valence) and psychophysiological correlates.

    Science.gov (United States)

    Balconi, Michela; Vanutelli, Maria Elide; Finocchiaro, Roberta

    2014-09-26

    The paper explored emotion comprehension in children with regard to facial expressions of emotion. The effects of valence and arousal evaluation, of context, and of psychophysiological measures were monitored: subjective evaluations of valence (positive vs. negative) and arousal (high vs. low) and contextual variables (facial expression vs. facial expression and script) were expected to modulate the psychophysiological responses. Self-report measures (correct recognition, arousal and valence attribution) and psychophysiological correlates (facial electromyography, EMG; skin conductance response, SCR; and heart rate, HR) were observed while children (N = 26; mean age = 8.75 y; range 6-11 y) looked at six facial expressions of emotion (happiness, anger, fear, sadness, surprise, and disgust) and six emotional scripts (contextualized facial expressions). Competence in recognition and in valence and arousal evaluation was tested concurrently with the psychophysiological variations, and we tested the congruence of these multiple measures. Log-linear analysis and repeated-measures ANOVAs showed different representations across subjects as a function of emotion. Specifically, children's recognition and attribution were well developed for some emotions (such as anger, fear, surprise and happiness), whereas other emotions (mainly disgust and sadness) were less clearly represented. SCR, HR and EMG measures were modulated by the evaluation based on valence and arousal, with increased psychophysiological values mainly in response to anger, fear and happiness. As shown by multiple regression analysis, significant consonance was found between self-report measures and psychophysiological behavior, mainly for emotions rated as more arousing and negative in valence. The multilevel measures are discussed in light of a dimensional attribution model.

  12. Single trial classification for the categories of perceived emotional facial expressions: an event-related fMRI study

    Science.gov (United States)

    Song, Sutao; Huang, Yuxia; Long, Zhiying; Zhang, Jiacai; Chen, Gongxiang; Wang, Shuqing

    2016-03-01

    Recently, several studies have successfully applied multivariate pattern analysis methods to predict categories of emotions. These studies have mainly focused on self-experienced emotions, such as emotional states elicited by music or film. In fact, most of our social interactions involve perceiving emotional information in the expressions of other people, and recognizing the emotional facial expressions of others within a short time is an important basic human skill. In this study, we aimed to determine the discriminability of perceived emotional facial expressions. In a rapid event-related fMRI design, subjects were instructed to classify four categories of facial expressions (happy, disgust, angry and neutral) by pressing different buttons, and each facial expression stimulus lasted for 2 s. All participants performed 5 fMRI runs. One multivariate pattern analysis method, the support vector machine, was trained to predict the categories of facial expressions. For feature selection, ninety masks defined from the anatomical automatic labeling (AAL) atlas were first generated and each was treated as input to the classifier; the most stable AAL areas were then selected according to prediction accuracies and comprised the final feature set. Results showed that for the 6 pair-wise classification conditions, accuracy, sensitivity and specificity were all above chance, with happy vs. neutral and angry vs. disgust achieving the lowest results. These results suggest that specific neural signatures of perceived emotional facial expressions may exist, and that happy vs. neutral and angry vs. disgust may be more similar in their information representation in the brain.
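
    The decoding pipeline described above (ROI-based features, a linear SVM, cross-validation across runs) can be sketched schematically; the shapes, labels, and ROI mask below are simulated stand-ins, not the authors' data or code:

      # Schematic ROI-based SVM decoding with leave-one-run-out cross-validation.
      import numpy as np
      from sklearn.svm import LinearSVC
      from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

      rng = np.random.default_rng(2)
      n_trials, n_voxels = 200, 500
      X = rng.normal(size=(n_trials, n_voxels))      # trial-wise voxel patterns
      y = rng.integers(0, 2, n_trials)               # e.g., happy (0) vs. neutral (1)
      runs = np.repeat(np.arange(5), n_trials // 5)  # 5 fMRI runs, as in the study

      roi_mask = rng.random(n_voxels) < 0.2          # voxels inside one AAL region
      X_roi = X[:, roi_mask]

      # With random data, accuracy hovers at the 0.5 chance level.
      acc = cross_val_score(LinearSVC(), X_roi, y, cv=LeaveOneGroupOut(), groups=runs)
      print(f"mean cross-validated accuracy: {acc.mean():.2f}")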

  13. The role of visual experience in the production of emotional facial expressions by blind people: a review.

    Science.gov (United States)

    Valente, Dannyelle; Theurel, Anne; Gentaz, Edouard

    2018-04-01

    Facial expressions of emotion are nonverbal behaviors that allow us to interact efficiently in social life and respond to events affecting our welfare. This article reviews 21 studies, published between 1932 and 2015, examining the production of facial expressions of emotion by blind people. It particularly discusses the impact of visual experience on the development of this behavior from birth to adulthood. After a discussion of three methodological considerations, the review of studies reveals that blind subjects demonstrate differing capacities for producing spontaneous expressions and voluntarily posed expressions. Seventeen studies provided evidence that blind and sighted individuals spontaneously produce the same pattern of facial expressions, even if some variations can be found, reflecting facial and body movements specific to blindness or differences in intensity and control of emotions in some specific contexts. This suggests that lack of visual experience does not seem to have a major impact when this behavior is generated spontaneously in real emotional contexts. In contrast, eight studies examining voluntary expressions indicate that blind individuals have difficulty posing emotional expressions. The opportunity for prior visual observation seems to affect performance in this case. Finally, we discuss three new directions for research to provide additional and strong evidence for the debate regarding the innate or the culture-constant learning character of the production of emotional facial expressions by blind individuals: the link between perception and production of facial expressions, the impact of display rules in the absence of vision, and the role of other channels in expression of emotions in the context of blindness.

  14. Gender and facial dominance in gaze cuing: Emotional context matters in the eyes that we follow

    NARCIS (Netherlands)

    Ohlsen, G.; van Zoest, W.; van Vugt, M.

    2013-01-01

    Gaze following is a socio-cognitive process that provides adaptive information about potential threats and opportunities in the individual's environment. The aim of the present study was to investigate the potential interaction between emotional context and facial dominance in gaze following.

  15. Impaired mixed emotion processing in the right ventrolateral prefrontal cortex in schizophrenia: an fMRI study.

    Science.gov (United States)

    Szabó, Ádám György; Farkas, Kinga; Marosi, Csilla; Kozák, Lajos R; Rudas, Gábor; Réthelyi, János; Csukly, Gábor

    2017-12-08

    Schizophrenia has a negative effect on the activity of the temporal and prefrontal cortices during the processing of emotional facial expressions. However, no previous research has focused on the evaluation of mixed emotions in schizophrenia, although they are frequently expressed in everyday situations and negative emotions are often conveyed by mixed facial expressions. Altogether 37 subjects, 19 patients with schizophrenia and 18 healthy control subjects, were enrolled in the study. The two study groups did not differ in age and education. The stimulus set consisted of 10 fearful (100%), 10 happy (100%), 10 mixed fear (70% fear and 30% happy) and 10 mixed happy facial expressions. During the fMRI acquisition, pictures were presented in randomized order and subjects had to categorize expressions by button press. Decreased activation was found in the patient group during fear, mixed fear and mixed happy processing in the right ventrolateral prefrontal cortex (VLPFC) and the right anterior insula (RAI) at voxel and cluster level after familywise error correction. No difference was found between study groups in activations to the happy facial condition. Unlike controls, patients with schizophrenia did not show differential activation between mixed happy and happy facial expressions in the right dorsolateral prefrontal cortex (DLPFC). Patients with schizophrenia showed decreased functioning in right prefrontal regions responsible for salience signaling and valence evaluation during emotion recognition. Our results indicate that fear and mixed happy/fear processing are impaired in schizophrenia, while processing of happy facial expressions is relatively intact.

  16. Recognition of facial and musical emotions in Parkinson's disease.

    Science.gov (United States)

    Saenz, A; Doé de Maindreville, A; Henry, A; de Labbey, S; Bakchine, S; Ehrlé, N

    2013-03-01

    Patients with amygdala lesions were found to be impaired in recognizing the fear emotion both from faces and from music. In patients with Parkinson's disease (PD), impairment in recognition of emotions from facial expressions has been reported for disgust, fear, sadness and anger, but no studies had yet investigated this population for the recognition of emotions from both face and music. The ability to recognize basic universal emotions (fear, happiness and sadness) from both face and music was investigated in 24 medicated patients with PD and 24 healthy controls. The patient group was tested for language (verbal fluency tasks), memory (digit and spatial span), executive functions (Similarities and Picture Completion subtests of the WAIS-III, Brixton and Stroop tests) and visual attention (Bells test), and completed self-assessment scales for anxiety and depression. Results showed that the PD group was significantly impaired in recognition of both fear and sadness from facial expressions, whereas their performance in recognition of emotions from musical excerpts did not differ from that of the control group. The scores for fear and sadness recognition from faces correlated neither with scores on tests of executive and cognitive functions nor with scores on the self-assessment scales. We attribute the observed dissociation to the modality (visual vs. auditory) of presentation and to the ecological value of the musical stimuli that we used. We discuss the relevance of our findings for the care of patients with PD. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.

  17. Test battery for measuring the perception and recognition of facial expressions of emotion

    Science.gov (United States)

    Wilhelm, Oliver; Hildebrandt, Andrea; Manske, Karsten; Schacht, Annekathrin; Sommer, Werner

    2014-01-01

    Despite the importance of perceiving and recognizing facial expressions in everyday life, there is no comprehensive test battery for the multivariate assessment of these abilities. As a first step toward such a compilation, we present 16 tasks that measure the perception and recognition of facial emotion expressions, and data illustrating each task's difficulty and reliability. The scoring of these tasks focuses on either the speed or accuracy of performance. A sample of 269 healthy young adults completed all tasks. In general, accuracy and reaction time measures for emotion-general scores showed acceptable and high estimates of internal consistency and factor reliability. Emotion-specific scores yielded lower reliabilities, yet high enough to encourage further studies with such measures. Analyses of task difficulty revealed that all tasks are suitable for measuring emotion perception and emotion recognition related abilities in normal populations. PMID:24860528

  18. The contemptuous separation: Facial expressions of emotion and breakups in young adulthood.

    Science.gov (United States)

    Heshmati, Saeideh; Sbarra, David A; Mason, Ashley E

    2017-06-01

    The importance of studying specific and expressed emotions after a stressful life event is well known, yet few studies have moved beyond assessing self-reported emotional responses to a romantic breakup. This study examined associations between computer-recognized facial expressions and self-reported breakup-related distress among recently separated college-aged young adults (N = 135; 37 men) on four visits across 9 weeks. Participants' facial expressions were coded using the Computer Expression Recognition Toolbox while participants spoke about their breakups. Of the seven expressed emotions studied, only Contempt showed a unique association with breakup-related distress over time. At baseline, greater Contempt was associated with less breakup-related distress; however, over time, greater Contempt was associated with greater breakup-related distress.

  19. Psilocybin biases facial recognition, goal-directed behavior, and mood state toward positive relative to negative emotions through different serotonergic subreceptors.

    Science.gov (United States)

    Kometer, Michael; Schmidt, André; Bachmann, Rosilla; Studerus, Erich; Seifritz, Erich; Vollenweider, Franz X

    2012-12-01

    Serotonin (5-HT) 1A and 2A receptors have been associated with dysfunctional emotional processing biases in mood disorders. These receptors further predominantly mediate the subjective and behavioral effects of psilocybin and might be important for its recently suggested antidepressive effects. However, the effect of psilocybin on emotional processing biases and the specific contribution of 5-HT2A receptors across different emotional domains is unknown. In a randomized, double-blind study, 17 healthy human subjects received on 4 separate days placebo, psilocybin (215 μg/kg), the preferential 5-HT2A antagonist ketanserin (50 mg), or psilocybin plus ketanserin. Mood states were assessed by self-report ratings, and behavioral and event-related potential measurements were used to quantify facial emotional recognition and goal-directed behavior toward emotional cues. Psilocybin enhanced positive mood and attenuated recognition of negative facial expressions. Furthermore, psilocybin increased goal-directed behavior toward positive compared with negative cues, facilitated positive but inhibited negative sequential emotional effects, and valence-dependently attenuated the P300 component. Ketanserin alone had no effects but blocked the psilocybin-induced mood enhancement and decreased recognition of negative facial expressions. This study shows that psilocybin shifts the emotional bias across various psychological domains and that activation of 5-HT2A receptors is central in mood regulation and emotional face recognition in healthy subjects. These findings may not only have implications for the pathophysiology of dysfunctional emotional biases but may also provide a framework to delineate the mechanisms underlying psilocybin's putative antidepressant effects. Copyright © 2012 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  20. Alexithymia and the processing of emotional facial expressions (EFEs): systematic review, unanswered questions and further perspectives.

    Directory of Open Access Journals (Sweden)

    Delphine Grynberg

    Full Text Available Alexithymia is characterized by difficulties in identifying, differentiating and describing feelings. A high prevalence of alexithymia has often been observed in clinical disorders characterized by low social functioning. This review aims to assess the association between alexithymia and the ability to decode emotional facial expressions (EFEs) within clinical and healthy populations. More precisely, this review has four main objectives: (1) to assess whether alexithymia is a better predictor of the ability to decode EFEs than the diagnosis of clinical disorder; (2) to assess the influence of comorbid factors (depression and anxiety disorder) on the ability to decode EFEs; (3) to investigate whether deficits in decoding EFEs are specific to some levels of processing or task types; and (4) to investigate whether the deficits are specific to particular EFEs. Twenty-four studies (behavioural and neuroimaging) were identified through a computerized literature search of the PsycINFO, PubMed, and Web of Science databases from 1990 to 2010. Data on methodology, clinical characteristics, and possible confounds were analyzed. The review revealed that: (1) alexithymia is associated with deficits in labelling EFEs among clinical disorders; (2) the level of depression and anxiety partially accounts for the decoding deficits; (3) alexithymia is associated with reduced perceptual abilities, and is likely to be associated with impaired semantic representations of emotional concepts; and (4) alexithymia is associated with neither specific EFEs nor a specific valence. These studies are discussed with respect to the processes involved in the recognition of EFEs. Future directions for research on emotion perception are also discussed.

  1. How does context affect assessments of facial emotion? The role of culture and age.

    Science.gov (United States)

    Ko, Seon-Gyu; Lee, Tae-Ho; Yoon, Hyea-Young; Kwon, Jung-Hye; Mather, Mara

    2011-03-01

    People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing affect how people interpret facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially those younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face emotion ratings. These findings suggest that cultural differences in reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative pictures than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across culture. (c) 2011 APA, all rights reserved.

  2. Psychopathy and facial emotion recognition ability in patients with bipolar affective disorder with or without delinquent behaviors.

    Science.gov (United States)

    Demirel, Husrev; Yesilbas, Dilek; Ozver, Ismail; Yuksek, Erhan; Sahin, Feyzi; Aliustaoglu, Suheyla; Emul, Murat

    2014-04-01

    It is well known that patients with bipolar disorder are more prone to violence and show more criminal behaviors than the general population. A strong relationship between criminal behavior and the inability to empathize with and perceive other people's feelings and facial expressions increases the risk of delinquent behaviors. In this study, we aimed to investigate deficits in facial emotion recognition ability in euthymic bipolar patients who had committed an offense and to compare them with non-delinquent euthymic patients with bipolar disorder. Fifty-five euthymic patients with delinquent behaviors and 54 non-delinquent euthymic bipolar patients as a control group were included in the study. Ekman's Facial Emotion Recognition Test, sociodemographic data, the Hare Psychopathy Checklist, the Hamilton Depression Rating Scale and the Young Mania Rating Scale were applied to both groups. There were no significant differences between the case and control groups in terms of mean age, gender, level of education, mean age at disease onset and suicide attempts (p>0.05). The three most commonly committed delinquent behaviors in patients with euthymic bipolar disorder were injury (30.8%), threat or insult (20%) and homicide (12.7%). The most accurately identified facial emotion was "happy" (>99% in both groups), while the most frequently misidentified facial emotion was "fear" in both groups, and response times to facial emotions were longer in patients with delinquent behaviors than in non-delinquent ones. We have shown that patients with bipolar disorder who had delinquent behaviors may have some social interaction problems, i.e., misrecognizing fearful and, to a lesser degree, angry facial emotions, and needing more time to respond to facial emotions, even in remission. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Processing of masked and unmasked emotional faces under different attentional conditions: an electrophysiological investigation.

    Directory of Open Access Journals (Sweden)

    Marzia Del Zotto

    2015-10-01

    Full Text Available In order to investigate the interactions between non-spatial selective attention, awareness and emotion processing, we carried out an ERP study using a backward masking paradigm, in which angry, fearful, happy and neutral facial expressions were presented, while participants attempted to detect the presence of one or the other category of facial expressions in the different experimental blocks. ERP results showed that negative emotions enhanced an early N170 response over temporal-occipital leads in both masked and unmasked conditions, independently of selective attention. A later effect arising at the P2 was linked to awareness. Finally, selective attention was found to affect the N2 and N3 components over occipito-parietal leads. Our findings reveal that (i) the initial processing of facial expressions arises prior to attention and awareness; (ii) attention and awareness give rise to temporally distinct periods of activation independently of the type of emotion, with only a partial degree of overlap; and (iii) selective attention appears to be influenced by the emotional nature of the stimuli, which in turn impinges on unconscious processing at a very early stage. This study confirms previous reports that negative facial expressions can be processed rapidly, in the absence of visual awareness and independently of selective attention. On the other hand, attention and awareness may operate in a synergistic way, depending on task demand.

  4. A neural network underlying intentional emotional facial expression in neurodegenerative disease

    Directory of Open Access Journals (Sweden)

    Kelly A. Gola

    2017-01-01

    Full Text Available Intentional facial expression of emotion is critical to healthy social interactions. Patients with neurodegenerative disease, particularly those with right temporal or prefrontal atrophy, show dramatic socioemotional impairment. This was an exploratory study examining the neural and behavioral correlates of intentional facial expression of emotion in neurodegenerative disease patients and healthy controls. One hundred and thirty-three participants (45 Alzheimer's disease, 16 behavioral variant frontotemporal dementia, 8 non-fluent primary progressive aphasia, 10 progressive supranuclear palsy, 11 right-temporal frontotemporal dementia, and 9 semantic variant primary progressive aphasia patients, and 34 healthy controls) were video recorded while imitating static images of emotional faces and producing emotional expressions on verbal command; the accuracy of their expressions was rated by blinded raters. Participants also underwent face-to-face socioemotional testing, and informants described participants' typical socioemotional behavior. Patients' performance on the emotion expression tasks was correlated with gray matter volume using voxel-based morphometry (VBM) across the entire sample. We found that intentional emotional imitation scores were related to fundamental socioemotional deficits; patients with known socioemotional deficits performed worse than controls on intentional emotion imitation; and intentional emotional expression predicted caregiver ratings of empathy and interpersonal warmth. Whole-brain VBM revealed that a rightward cortical atrophy pattern, homologous to the left-lateralized speech production network, was associated with intentional emotional imitation deficits. These results point to a possible neural mechanism underlying complex socioemotional communication deficits in neurodegenerative disease patients.

  5. Brain response to masked and unmasked facial emotions as a function of implicit and explicit personality self-concept of extraversion.

    Science.gov (United States)

    Suslow, Thomas; Kugel, Harald; Lindner, Christian; Dannlowski, Udo; Egloff, Boris

    2017-01-06

    Extraversion-introversion is a personality dimension referring to individual differences in social behavior. In the past, neurobiological research on extraversion was almost entirely based upon questionnaires, which inform about the explicit self-concept. Today, indirect measures are available that tap into the implicit self-concept of extraversion, which is assumed to result from automatic processing functions. In our study, brain activation while viewing facial expressions of affiliation-relevant (i.e., happiness and disgust) and irrelevant (i.e., fear) emotions was examined as a function of the implicit and explicit self-concept of extraversion and processing mode (automatic vs. controlled). 40 healthy volunteers watched blocks of masked and unmasked emotional faces while undergoing functional magnetic resonance imaging. The Implicit Association Test and the NEO Five-Factor Inventory were applied as implicit and explicit measures of extraversion, which were uncorrelated in our sample. Implicit extraversion was found to be positively associated with neural response to masked happy faces in the thalamus and temporo-parietal regions and to masked disgust faces in cerebellar areas. Moreover, it was positively correlated with brain response to unmasked disgust faces in the amygdala and cortical areas. Explicit extraversion was not related to brain response to facial emotions when controlling for trait anxiety. The implicit compared to the explicit self-concept of extraversion seems to be more strongly associated with brain activation not only during automatic but also during controlled processing of affiliation-relevant facial emotions. Enhanced neural response to facial disgust could reflect high sensitivity to signals of interpersonal rejection in extraverts (i.e., individuals with affiliative tendencies). Copyright © 2016 IBRO. Published by Elsevier Ltd. All rights reserved.

  6. Emotion Index of Cover Song Music Video Clips based on Facial Expression Recognition

    DEFF Research Database (Denmark)

    Kavallakis, George; Vidakis, Nikolaos; Triantafyllidis, Georgios

    2017-01-01

    This paper presents a scheme for creating an emotion index of cover song music video clips by recognizing and classifying the facial expressions of the artist in the video. More specifically, it fuses effective and robust algorithms employed for expression recognition, along with a neural network system using features extracted by the SIFT algorithm. We also argue for this fusion of different expression recognition algorithms, because of the way emotions are linked to facial expressions in music video clips.
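
    A hedged sketch of the kind of pipeline the paper describes, SIFT features aggregated into fixed-length bag-of-visual-words vectors and fed to a small neural-network classifier, follows; the frames and labels are synthetic stand-ins, not the authors' implementation:

      # SIFT bag-of-visual-words + neural network classifier (illustrative only).
      import cv2
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.neural_network import MLPClassifier

      sift = cv2.SIFT_create()

      def sift_descriptors(img):
          """128-D SIFT descriptors for a grayscale frame (possibly empty)."""
          _, desc = sift.detectAndCompute(img, None)
          return desc if desc is not None else np.empty((0, 128), np.float32)

      def bow_vector(desc, kmeans, k):
          """Fixed-length visual-word histogram from a frame's descriptors."""
          if len(desc) == 0:
              return np.zeros(k)
          hist = np.bincount(kmeans.predict(desc), minlength=k)
          return hist / hist.sum()

      # Stand-in data: random textures in place of real face frames.
      rng = np.random.default_rng(3)
      frames = [rng.integers(0, 255, (128, 128), dtype=np.uint8) for _ in range(20)]
      labels = ["happy"] * 10 + ["sad"] * 10  # hypothetical per-frame emotion labels

      k = 32
      all_desc = np.vstack([sift_descriptors(f) for f in frames])
      kmeans = KMeans(n_clusters=k, n_init=10).fit(all_desc)
      X = np.array([bow_vector(sift_descriptors(f), kmeans, k) for f in frames])
      clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X, labels)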

  7. Multilevel analysis of facial expressions of emotion and script: self-report (arousal and valence) and psychophysiological correlates

    OpenAIRE

    Balconi, Michela; Vanutelli, Maria Elide; Finocchiaro, Roberta

    2014-01-01

    Background: The paper explored emotion comprehension in children with regard to facial expression of emotion. The effect of valence and arousal evaluation, of context and of psychophysiological measures was monitored. Indeed subjective evaluation of valence (positive vs. negative) and arousal (high vs. low), and contextual (facial expression vs. facial expression and script) variables were supposed to modulate the psychophysiological responses. Methods: Self-report measures (in terms of correct...

  8. Visual Scanning Patterns and Executive Function in Relation to Facial Emotion Recognition in Aging

    Science.gov (United States)

    Circelli, Karishma S.; Clark, Uraina S.; Cronin-Golomb, Alice

    2012-01-01

    Objective The ability to perceive facial emotion varies with age. Relative to younger adults (YA), older adults (OA) are less accurate at identifying fear, anger, and sadness, and more accurate at identifying disgust. Because different emotions are conveyed by different parts of the face, changes in visual scanning patterns may account for age-related variability. We investigated the relation between scanning patterns and recognition of facial emotions. Additionally, as frontal-lobe changes with age may affect scanning patterns and emotion recognition, we examined correlations between scanning parameters and performance on executive function tests. Methods We recorded eye movements from 16 OA (mean age 68.9) and 16 YA (mean age 19.2) while they categorized facial expressions and non-face control images (landscapes), and administered standard tests of executive function. Results OA were less accurate than YA at identifying fear. Executive function performance was correlated with recognition of sad expressions and with scanning patterns for fearful, sad, and surprised expressions. Conclusion We report significant age-related differences in visual scanning that are specific to faces. The observed relation between scanning patterns and executive function supports the hypothesis that frontal-lobe changes with age may underlie some changes in emotion recognition. PMID:22616800
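
    The core quantities in such studies are straightforward to illustrate. The sketch below (made-up numbers, not the authors' analysis code) turns one participant's fixation records into dwell-time proportions per facial region, then correlates a scanning parameter with an executive function score across participants.

    import numpy as np
    from scipy.stats import pearsonr

    # One participant's fixations as (region, duration_ms); regions are assumed ROIs.
    fixations = [("eyes", 220), ("mouth", 180), ("nose", 90), ("eyes", 310)]

    total = sum(d for _, d in fixations)
    dwell = {}
    for region, dur in fixations:
        dwell[region] = dwell.get(region, 0.0) + dur / total
    print(dwell)                      # proportion of viewing time per region

    # Across participants: correlate eye-region dwell with an executive test score.
    eye_dwell = np.array([0.62, 0.55, 0.48, 0.71, 0.40])    # hypothetical values
    exec_score = np.array([31, 28, 22, 35, 19])             # hypothetical values
    r, p = pearsonr(eye_dwell, exec_score)
    print(f"r = {r:.2f}, p = {p:.3f}")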

  9. Measuring Consumer Emotional Response to Tastes and Foods through Facial Expression Analysis

    OpenAIRE

    Arnade, Elizabeth Amalia

    2014-01-01

    Emotions are thought to play a crucial role in food behavior. Non-rational emotional decision making may be credited as the reason why consumers select what, how, and when they choose to interact with a food product. In this research, three experiments were completed for the overall goal of understanding the usefulness and validity of selected emotional measurement tools, specifically emotion questionnaire ballots and facial expression analysis, as compared to conventional sensory methods in ...

  10. Generation of facial expressions from emotion using a fuzzy rule based system

    NARCIS (Netherlands)

    Bui, T.D.; Heylen, Dirk K.J.; Poel, Mannes; Nijholt, Antinus; Stumptner, Markus; Corbett, Dan; Brooks, Mike

    2001-01-01

    We propose a fuzzy rule-based system to map representations of the emotional state of an animated agent onto muscle contraction values for the appropriate facial expressions. Our implementation pays special attention to the way in which continuous changes in the intensity of emotions can be
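
    A minimal sketch of such a mapping, assuming triangular membership functions and a three-rule base (the paper's actual rule base and muscle model are not reproduced here): a continuous happiness intensity is converted into a smoothly varying contraction value for a single muscle.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def zygomatic_contraction(happiness):
        """Defuzzify with a weighted average of rule outputs (0..1 contraction)."""
        # Rules: IF happiness is low/medium/high THEN contraction is weak/mid/strong.
        rules = [
            (tri(happiness, -0.5, 0.0, 0.5), 0.1),   # low    -> weak   (0.1)
            (tri(happiness,  0.0, 0.5, 1.0), 0.5),   # medium -> mid    (0.5)
            (tri(happiness,  0.5, 1.0, 1.5), 0.9),   # high   -> strong (0.9)
        ]
        num = sum(w * out for w, out in rules)
        den = sum(w for w, _ in rules)
        return num / den if den else 0.0

    for h in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(h, round(zygomatic_contraction(h), 2))   # contraction rises smoothly

    The weighted-average defuzzification is what lets continuous changes in emotion intensity produce continuous, rather than stepwise, changes in the contraction value.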

  11. Moral processing deficit in behavioral variant frontotemporal dementia is associated with facial emotion recognition and brain changes in default mode and salience network areas.

    Science.gov (United States)

    Van den Stock, Jan; Stam, Daphne; De Winter, François-Laurent; Mantini, Dante; Szmrecsanyi, Benedikt; Van Laere, Koen; Vandenberghe, Rik; Vandenbulcke, Mathieu

    2017-12-01

    Behavioral variant frontotemporal dementia (bvFTD) is associated with abnormal emotion recognition and moral processing. We assessed emotion detection, discrimination, matching, selection, and categorization, as well as judgments of nonmoral, moral impersonal, and moral personal low- and high-conflict scenarios. bvFTD patients gave more utilitarian responses on low-conflict personal moral dilemmas. There was a significant correlation between a facial emotion processing measure derived through principal component analysis and utilitarian responses on low-conflict personal scenarios in the bvFTD group (controlling for MMSE score and syntactic abilities). Voxel-based morphometric multiple regression analysis in the bvFTD group revealed a significant association between the proportion of utilitarian responses on personal low-conflict dilemmas and gray matter volume in ventromedial prefrontal areas. These findings underline the role of emotions in moral cognition and suggest a common basis for deficits in both abilities, possibly related to reduced experience of emotional sensations. At the neural level, abnormal moral cognition in bvFTD is related to the structural integrity of the medial prefrontal cortex and functional characteristics of the anterior insula. The present findings provide a common basis for emotion recognition and moral reasoning and link them with areas in the default mode and salience networks.

  12. Facial emotion recognition deficits following moderate-severe Traumatic Brain Injury (TBI): re-examining the valence effect and the role of emotion intensity.

    Science.gov (United States)

    Rosenberg, Hannah; McDonald, Skye; Dethier, Marie; Kessels, Roy P C; Westbrook, R Frederick

    2014-11-01

    Many individuals who sustain moderate-severe traumatic brain injuries (TBI) are poor at recognizing emotional expressions, with a greater impairment in recognizing negative (e.g., fear, disgust, sadness, and anger) than positive emotions (e.g., happiness and surprise). It has been questioned whether this "valence effect" might be an artifact of the wide use of static facial emotion stimuli (usually full-blown expressions) which differ in difficulty rather than a real consequence of brain impairment. This study aimed to investigate the valence effect in TBI, while examining emotion recognition across different intensities (low, medium, and high). Twenty-seven individuals with TBI and 28 matched control participants were tested on the Emotion Recognition Task (ERT). The TBI group was more impaired in overall emotion recognition, and less accurate recognizing negative emotions. However, examining the performance across the different intensities indicated that this difference was driven by some emotions (e.g., happiness) being much easier to recognize than others (e.g., fear and surprise). Our findings indicate that individuals with TBI have an overall deficit in facial emotion recognition, and that both people with TBI and control participants found some emotions more difficult than others. These results suggest that conventional measures of facial affect recognition that do not examine variance in the difficulty of emotions may produce erroneous conclusions about differential impairment. They also cast doubt on the notion that dissociable neural pathways underlie the recognition of positive and negative emotions, which are differentially affected by TBI and potentially other neurological or psychiatric disorders.

  13. Facial Emotion Recognition in Child Psychiatry: A Systematic Review

    Science.gov (United States)

    Collin, Lisa; Bindra, Jasmeet; Raju, Monika; Gillberg, Christopher; Minnis, Helen

    2013-01-01

    This review focuses on facial affect (emotion) recognition in children and adolescents with psychiatric disorders other than autism. A systematic search, using PRISMA guidelines, was conducted to identify original articles published prior to October 2011 pertaining to face recognition tasks in case-control studies. Used in the qualitative…

  14. Reduced Recognition of Dynamic Facial Emotional Expressions and Emotion-Specific Response Bias in Children with an Autism Spectrum Disorder

    Science.gov (United States)

    Evers, Kris; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2015-01-01

    Emotion labelling was evaluated in two matched samples of 6-14-year-old children with and without an autism spectrum disorder (ASD; N = 45 and N = 50, resp.), using six dynamic facial expressions. The Emotion Recognition Task proved to be valuable in demonstrating subtle emotion recognition difficulties in ASD, as we showed a generally poorer emotion…

  15. MEG evidence for dynamic amygdala modulations by gaze and facial emotions.

    Directory of Open Access Journals (Sweden)

    Thibaud Dumas

    Full Text Available The amygdala is a key brain region for face perception. While the role of the amygdala in the perception of facial emotion and gaze has been extensively highlighted with fMRI, the unfolding in time of amygdala responses to emotional versus neutral faces with different gaze directions is scarcely known. Here we addressed this question in healthy subjects using MEG combined with an original source imaging method based on individual amygdala volume segmentation and the localization of sources in the amygdala volume. We found an early peak of amygdala activity that was enhanced for fearful relative to neutral faces between 130 and 170 ms. The effect of emotion was again significant in a later time range (310-350 ms). Moreover, the amygdala response was greater for direct relative to averted gaze between 190 and 350 ms, and this effect was selective for fearful faces in the right amygdala. Altogether, our results show that the amygdala is involved in the processing and integration of emotion and gaze cues from faces in different time ranges, thus underlining its role in multiple stages of face perception.

  16. Visuo-spatial interference affects the identification of emotional facial expressions in unmedicated Parkinson's patients.

    Science.gov (United States)

    García-Rodríguez, Beatriz; Guillén, Carmen Casares; Barba, Rosa Jurado; Rubio Valladolid, Gabriel; Arjona, José Antonio Molina; Ellgring, Heiner

    2012-02-15

    There is evidence that visuo-spatial capacity can become overloaded when processing a secondary visual task (Dual Task, DT), as occurs in daily life. Hence, we investigated the influence of visuo-spatial interference on the identification of emotional facial expressions (EFEs) in the early stages of Parkinson's disease (PD). We compared the identification of 24 emotional faces illustrating six basic emotions in unmedicated, recently diagnosed PD patients (n = 16) and healthy adults (n = 20) under two different conditions: a) simple EFE identification, and b) identification with a concurrent visuo-spatial task (Corsi Blocks). EFE identification by PD patients was significantly worse than that of healthy adults when combined with another visual stimulus. Published by Elsevier B.V.

  17. Colour Perception on Facial Expression towards Emotion

    Directory of Open Access Journals (Sweden)

    Rubita Sudirman

    2012-12-01

    Full Text Available This study investigates human perceptions of pairings between facial expressions of emotion and colours. A group of 27 subjects, mainly young Malaysians, participated in the study. For each of seven faces expressing the basic emotions neutral, happiness, surprise, anger, disgust, fear, and sadness, subjects chose the single colour, from eight basic colours, that best matched the face visually. The different emotions were well characterized by a single colour. The analysis draws on both psychology and colour engineering. The seven emotions were matched by the subjects according to their perceptions and feelings. Data from 12 male and 12 female subjects were then randomly selected from the full set to compare colour perception between genders. The success of the test depends on subjects being able to propose a single colour for each expression. The results are translated into counts and percentages as a guide for colour designers and for the field of psychology.

  18. Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality.

    Science.gov (United States)

    Mehta, Dhwani; Siddiqui, Mohammad Faridul Haque; Javaid, Ahmad Y

    2018-02-01

    Extensive possibilities of applications have made emotion recognition ineluctable and challenging in the field of computer science. The use of non-verbal cues such as gestures, body movement, and facial expressions conveys the feeling and the feedback to the user. This discipline of Human-Computer Interaction places reliance on the algorithmic robustness and the sensitivity of the sensor to ameliorate the recognition. Sensors play a significant role in accurate detection by providing a very high-quality input, hence increasing the efficiency and the reliability of the system. Automatic recognition of human emotions would help in teaching social intelligence to machines. This paper presents a brief study of the various approaches and techniques of emotion recognition. The survey covers a succinct review of the databases that are considered as data sets for algorithms detecting emotions by facial expressions. Later, the mixed reality device Microsoft HoloLens (MHL) is introduced for observing emotion recognition in Augmented Reality (AR). A brief introduction of its sensors, their application in emotion recognition, and some preliminary results of emotion recognition using MHL are presented. The paper then concludes by comparing results of emotion recognition by the MHL and a regular webcam.

  19. Influence of Emotional Facial Expressions on 3-5-Year-Olds' Face Recognition

    Science.gov (United States)

    Freitag, Claudia; Schwarzer, Gudrun

    2011-01-01

    Three experiments examined 3- and 5-year-olds' recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression…

  20. Emotional facial expressions in European-American, Japanese, and Chinese infants.

    Science.gov (United States)

    Camras, Linda A; Oster, Harriet; Campos, Joseph J; Bakeman, Roger

    2003-12-01

    Charles Darwin was among the first to recognize the important contribution that infant studies could make to our understanding of human emotional expression. Noting that infants come to exhibit many emotions, he also observed that at first their repertoire of expression is highly restricted. Today, considerable controversy exists regarding the question of whether infants experience and express discrete emotions. According to one position, discrete emotions emerge during infancy along with their prototypic facial expressions. These expressions closely resemble adult emotional expressions and are invariantly concordant with their corresponding emotions. In contrast, we propose that the relation between expression and emotion during infancy is more complex. Some infant emotions and emotional expressions may not be invariantly concordant. Furthermore, infant emotional expressions may be less differentiated than previously proposed. Together with past developmental studies, recent cross-cultural research supports this view and suggests that negative emotional expression in particular is only partly differentiated towards the end of the first year.

  1. Identification and intensity of disgust: Distinguishing visual, linguistic and facial expressions processing in Parkinson disease.

    Science.gov (United States)

    Sedda, Anna; Petito, Sara; Guarino, Maria; Stracciari, Andrea

    2017-07-14

    Most studies to date show an impairment in recognizing facial displays of disgust in Parkinson disease. A general impairment in disgust processing in patients with Parkinson disease might adversely affect their social interactions, given the relevance of this emotion for human relations. However, despite the importance of faces, disgust is also expressed through other formats of visual stimuli, such as sentences and visual images. The aim of our study was to explore disgust processing in a sample of patients affected by Parkinson disease by means of various tests tackling not only facial recognition but also other formats of visual stimuli through which disgust can be recognized. Our results confirm that patients are impaired in recognizing facial displays of disgust. Further analyses show that patients are also impaired and slower for other facial expressions, with the only exception of happiness. Notably, however, patients with Parkinson disease processed visual images and sentences as controls did. Our findings show a dissociation between different formats of visual stimuli of disgust, suggesting that Parkinson disease is not characterized by a general compromising of disgust processing, as often suggested. The involvement of the basal ganglia-frontal cortex system might spare some cognitive components of emotional processing, related to memory and culture, at least for disgust. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Linking children's neuropsychological processing of emotion with their knowledge of emotion expression regulation.

    Science.gov (United States)

    Watling, Dawn; Bourne, Victoria J

    2007-09-01

    Understanding of emotions has been shown to develop between the ages of 4 and 10 years; however, individual differences exist in this development. While previous research has typically examined these differences in terms of developmental and/or social factors, little research has considered the possible impact of neuropsychological development on the behavioural understanding of emotions. Emotion processing tends to be lateralised to the right hemisphere of the brain in adults, yet this pattern is not as evident in children until around the age of 10 years. In this study 136 children between 5 and 10 years were given both behavioural and neuropsychological tests of emotion processing. The behavioural task examined expression regulation knowledge (ERK) for prosocial and self-presentational hypothetical interactions. The chimeric faces test was given as a measure of lateralisation for processing positive facial emotion. An interaction between age and lateralisation for emotion processing was predictive of children's ERK for only the self-presentational interactions. The relationship between children's ERK and lateralisation for emotion processing changed across the three age groups, emerging as a positive relationship in the 10-year-olds. The 10-year-olds who were more lateralised to the right hemisphere for emotion processing tended to show greater understanding of the need for regulating negative emotions during interactions that would have a self-presentational motivation. This finding suggests an association between the behavioural and neuropsychological development of emotion processing.

  3. Impact of Social Cognition on Alcohol Dependence Treatment Outcome: Poorer Facial Emotion Recognition Predicts Relapse/Dropout.

    Science.gov (United States)

    Rupp, Claudia I; Derntl, Birgit; Osthaus, Friederike; Kemmler, Georg; Fleischhacker, W Wolfgang

    2017-12-01

    Despite growing evidence for neurobehavioral deficits in social cognition in alcohol use disorder (AUD), the clinical relevance remains unclear, and little is known about its impact on treatment outcome. This study prospectively investigated the impact of neurocognitive social abilities at treatment onset on treatment completion. Fifty-nine alcohol-dependent patients were assessed with measures of social cognition, including 3 core components of empathy via paradigms measuring: (i) emotion recognition (the ability to recognize emotions via facial expression), (ii) emotional perspective taking, and (iii) affective responsiveness at the beginning of inpatient treatment for alcohol dependence. Subjective measures were also obtained, including estimates of task performance and a self-report measure of empathic abilities (Interpersonal Reactivity Index). According to treatment outcomes, patients were divided into a patient group with a regular treatment course (e.g., with planned discharge and without relapse during treatment) or an irregular treatment course (e.g., relapse and/or premature and unplanned termination of treatment, "dropout"). Compared with patients completing treatment in a regular fashion, patients with relapse and/or dropout of treatment had significantly poorer facial emotion recognition ability at treatment onset. Additional logistic regression analyses confirmed these results and identified poor emotion recognition performance as a significant predictor of relapse/dropout. Self-report (subjective) measures did not correspond with neurobehavioral social cognition measures or objective task performance. Analyses of individual subtypes of facial emotions revealed poorer recognition particularly of disgust, anger, and neutral (no emotion) faces in patients with relapse/dropout. Social cognition in AUD is clinically relevant. Less successful treatment outcome was associated with poorer facial emotion recognition ability at the beginning of treatment.

  4. Emotion and sex of facial stimuli modulate conditional automaticity in behavioral and neuronal interference in healthy men.

    Science.gov (United States)

    Kohn, Nils; Fernández, Guillén

    2017-12-06

    Our surroundings provide a host of sensory input, which we cannot fully process without streamlining and automatic processing. Levels of automaticity differ for different cognitive and affective processes. Situational and contextual interactions between cognitive and affective processes in turn influence the level of automaticity. Automaticity can be measured by interference in Stroop tasks. We applied an emotional version of the Stroop task to investigate how stress, as a contextual factor, influences the affective valence-dependent level of automaticity. 120 young, healthy men were investigated for behavioral and brain interference following a stress induction or control procedure in a counterbalanced cross-over design. Although Stroop interference was always observed, the sex and emotion of the face strongly modulated interference, which was larger for fearful and male faces. These effects suggest higher automaticity when processing happy and also female faces. Supporting the behavioral patterns, brain data show lower interference-related brain activity in executive control regions in response to happy and female faces. In the absence of behavioral stress effects, congruent compared to incongruent trials (reverse interference) showed little to no deactivation under stress in response to happy female and fearful male trials. These congruency effects are potentially based on altered context- and stress-related facial processing that interacts with sex-emotion stereotypes. The results indicate that sex and facial emotion modulate Stroop interference in brain and behavior. These effects can be explained by altered response difficulty as a consequence of the contextual and stereotype-related modulation of automaticity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Priming the Secure Attachment Schema Affects the Emotional Face Processing Bias in Attachment Anxiety: An fMRI Research

    Directory of Open Access Journals (Sweden)

    Xu Chen

    2017-04-01

    Full Text Available Our study explored how priming with a secure base schema affects the processing of emotional facial stimuli in individuals with attachment anxiety. We enrolled 42 undergraduate students between 18 and 27 years of age and divided them into two groups: attachment anxiety and attachment secure. All participants were primed under two conditions: secure priming using references to the partner, and neutral priming using neutral references. We performed repeated attachment security priming combined with a dual-task paradigm and functional magnetic resonance imaging. Participants' reaction times in responding to the facial stimuli were also measured. Attachment security priming can facilitate an individual's processing of positive emotional faces; for instance, the presentation of the partner's name was associated with stronger activity in a wide range of brain regions and faster reaction times for positive facial expressions. The current finding of higher activity in left-hemisphere regions for secure priming rather than neutral priming is consistent with the prediction that attachment security priming triggers the spread of the activation of a positive emotional state. However, the difference in brain activity during processing of both positive and negative emotional facial stimuli between the two priming conditions appeared in the attachment anxiety group alone. This study indicates that the effect of secure attachment priming on the processing of emotional facial stimuli could be mediated by chronic attachment anxiety. In addition, it highlights the association between higher-order processes of the attachment system (secure attachment schema priming) and the early-stage information processing system (attention), given the increased attention toward the effects of the secure base schema on the processing of emotion- and attachment-related information among the insecure population. Thus, the following study has

  6. Facial emotion recognition, face scan paths, and face perception in children with neurofibromatosis type 1.

    Science.gov (United States)

    Lewis, Amelia K; Porter, Melanie A; Williams, Tracey A; Bzishvili, Samantha; North, Kathryn N; Payne, Jonathan M

    2017-05-01

    This study aimed to investigate face scan paths and face perception abilities in children with Neurofibromatosis Type 1 (NF1) and how these might relate to emotion recognition abilities in this population. The authors investigated facial emotion recognition, face scan paths, and face perception in 29 children with NF1 compared to 29 chronological age-matched typically developing controls. Correlations between facial emotion recognition, face scan paths, and face perception in children with NF1 were examined. Children with NF1 displayed significantly poorer recognition of fearful expressions compared to controls, as well as a nonsignificant trend toward poorer recognition of anger. Although there was no significant difference between groups in time spent viewing individual core facial features (eyes, nose, mouth, and nonfeature regions), children with NF1 spent significantly less time than controls viewing the face as a whole. Children with NF1 also displayed significantly poorer face perception abilities than typically developing controls. Facial emotion recognition deficits were not significantly associated with aberrant face scan paths or face perception abilities in the NF1 group. These results suggest that impairments in the perception, identification, and interpretation of information from faces are important aspects of the social-cognitive phenotype of NF1. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. What a smile means: contextual beliefs and facial emotion expressions in a nonverbal zero-sum game

    Directory of Open Access Journals (Sweden)

    Fábio Pimenta De Pádua Júnior

    2016-04-01

    Full Text Available Research into the authenticity of facial emotion expressions often focuses on the physical properties of the face while paying little attention to the role of beliefs in emotion perception. Further, the literature most often investigates how people express a pre-determined emotion rather than what facial emotion expressions people strategically choose to express. To fill these gaps, this paper proposes a nonverbal zero-sum game – the Face X Game – to assess the role of contextual beliefs and strategic displays of facial emotion expression in interpersonal interactions. This new research paradigm was used in a series of three studies, where two participants are asked to play the role of the sender (the individual expressing emotional information on his/her face) or the observer (the individual interpreting the meaning of that expression). Study 1 examines the outcome of the game with reference to the sex of the pair, where senders won more frequently when the pair comprised at least one female. Study 2 examines the strategic display of facial emotion expressions. The outcome of the game was again contingent upon the sex of the pair. Among female pairs, senders won the game more frequently, replicating the pattern of results from study 1. We also demonstrate that senders who strategically express an emotion incongruent with the valence of the event (e.g., smile after seeing a negative event) are able to mislead observers, who tend to hold a congruent belief about the meaning of the emotion expression. If sending an incongruent signal helps to explain why female senders win more frequently, it logically follows that female observers were more prone to hold a congruent, and therefore inaccurate, belief. This prospect implies that while female senders are willing and/or capable of displaying fake smiles, paired female observers are not taking this into account. Study 3 investigates the role of contextual factors by manipulating female observers

  8. Behavioral and Neuroimaging Evidence for Facial Emotion Recognition in Elderly Korean Adults with Mild Cognitive Impairment, Alzheimer's Disease, and Frontotemporal Dementia.

    Science.gov (United States)

    Park, Soowon; Kim, Taehoon; Shin, Seong A; Kim, Yu Kyeong; Sohn, Bo Kyung; Park, Hyeon-Ju; Youn, Jung-Hae; Lee, Jun-Young

    2017-01-01

    Background: Facial emotion recognition (FER) is impaired in individuals with frontotemporal dementia (FTD) and Alzheimer's disease (AD) when compared to healthy older adults. Since deficits in emotion recognition are closely related to caregiver burden and social interactions, researchers have a fundamental interest in FER performance in patients with dementia. Purpose: The purpose of this study was to identify the performance profiles for six facial emotions (i.e., fear, anger, disgust, sadness, surprise, and happiness) and neutral faces among Korean healthy controls (HC) and individuals with mild cognitive impairment (MCI), AD, and FTD. Additionally, the neuroanatomical correlates of facial emotions were investigated. Methods: A total of 110 older adult participants (33 HC, 32 MCI, 32 AD, 13 FTD) were recruited from two medical centers in metropolitan areas of South Korea. These individuals underwent an FER test that assessed the recognition of emotions or the absence of emotion (neutral) in 35 facial stimuli. Repeated measures two-way analyses of variance were used to examine the distinct profiles of emotion recognition among the four groups. We also performed brain imaging and voxel-based morphometry (VBM) to examine the associations between FER scores and gray matter volume. Results: The mean score for negative emotion recognition (i.e., fear, anger, disgust, and sadness) clearly discriminated FTD participants from individuals with MCI and AD and HC [F(3,106) = 10.829, p < 0.001, η² = 0.235], whereas the mean score for positive emotion recognition (i.e., surprise and happiness) did not. A VBM analysis showed that negative emotions were correlated with gray matter volume in anterior temporal regions, whereas positive emotions were related to gray matter volume in fronto-parietal regions. Conclusion: Impairment of negative FER in patients with FTD is cross-cultural. The discrete neural correlates of FER indicate that emotional

  9. Positive and negative symptom scores are correlated with activation in different brain regions during facial emotion perception in schizophrenia patients: a voxel-based sLORETA source activity study.

    Science.gov (United States)

    Kim, Do-Won; Kim, Han-Sung; Lee, Seung-Hwan; Im, Chang-Hwan

    2013-12-01

    Schizophrenia is one of the most devastating of all mental illnesses and has dimensional characteristics that include both positive and negative symptoms. One problem reported in schizophrenia patients is that they tend to show deficits in facial emotion processing, on which negative symptoms are thought to have the stronger influence. In this study, four event-related potential (ERP) components (P100, N170, N250, and P300) and their source activities were analyzed using EEG data acquired from 23 schizophrenia patients while they were presented with facial emotion picture stimuli. Correlations between positive and negative syndrome scale (PANSS) scores and source activations during facial emotion processing were calculated to identify the brain areas affected by symptom scores. Our analysis demonstrates that PANSS positive scores are negatively correlated with major areas of the left temporal lobule for early ERP components (P100, N170) and with the right middle frontal lobule for a later component (N250), which indicates that positive symptoms affect both early face processing and facial emotion processing. On the other hand, PANSS negative scores are negatively correlated with several clustered regions, including the left fusiform gyrus (at P100), most of which do not overlap with regions showing correlations with PANSS positive scores. Our results suggest that positive and negative symptoms affect independent brain regions during facial emotion processing, which may help to explain the heterogeneous characteristics of schizophrenia. © 2013 Elsevier B.V. All rights reserved.
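
    Before such correlations can be computed, a single amplitude per ERP component is typically extracted by averaging the evoked waveform within each component's time window. The sketch below illustrates this step with a placeholder waveform; the sampling rate and window boundaries are illustrative assumptions, not the study's exact values.

    import numpy as np

    fs = 500                                  # sampling rate in Hz (assumed)
    t = np.arange(-0.1, 0.6, 1 / fs)          # epoch: -100 to 600 ms
    erp = np.random.default_rng(1).normal(size=t.size)   # placeholder waveform (µV)

    windows = {"P100": (0.08, 0.12), "N170": (0.15, 0.19),
               "N250": (0.22, 0.28), "P300": (0.28, 0.40)}

    amplitudes = {name: erp[(t >= lo) & (t < hi)].mean()
                  for name, (lo, hi) in windows.items()}
    print(amplitudes)   # one value per component, later correlated with PANSS scores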

  10. Emotion in Stories: Facial EMG Evidence for Both Mental Simulation and Moral Evaluation

    Directory of Open Access Journals (Sweden)

    Björn 't Hart

    2018-04-01

    Full Text Available Facial electromyography research shows that corrugator supercilii (“frowning muscle”) activity tracks the emotional valence of linguistic stimuli. Grounded or embodied accounts of language processing take such activity to reflect the simulation or “reenactment” of emotion, as part of the retrieval of word meaning (e.g., of “furious”) and/or of building a situation model (e.g., for “Mark is furious”). However, the same muscle also expresses our primary emotional evaluation of things we encounter. Language-driven affective simulation can easily be at odds with the reader's affective evaluation of what language describes (e.g., when we like Mark being furious). To examine what happens in such cases, we independently manipulated simulation valence and moral evaluative valence in short narratives. Participants first read about characters behaving in a morally laudable or objectionable fashion: this immediately led to corrugator activity reflecting positive or negative affect. Next, and critically, a positive or negative event befell these same characters. Here, the corrugator response did not track the valence of the event, but reflected both simulation and moral evaluation. This highlights the importance of unpacking coarse notions of affective meaning in language processing research into components that reflect simulation and evaluation. Our results also call for a re-evaluation of the interpretation of corrugator EMG, as well as other affect-related facial muscles and other peripheral physiological measures, as unequivocal indicators of simulation. Research should explore how such measures behave in richer and more ecologically valid language processing, such as narrative, refining our understanding of simulation within a framework of grounded language comprehension.

  11. [Hemodynamic activities in children with autism while imitating emotional facial expressions: a near-infrared spectroscopy study].

    Science.gov (United States)

    Mori, Kenji; Mori, Tatsuo; Goji, Aya; Ito, Hiromichi; Toda, Yoshihiro; Fujii, Emiko; Miyazaki, Masahito; Harada, Masafumi; Kagami, Shoji

    2014-07-01

    To examine the hemodynamic activities in the frontal lobe, children with autistic disorder and matched controls underwent near-infrared spectroscopy (NIRS) while imitating emotional facial expressions. The subjects consisted of 10 boys with autistic disorder without mental retardation (9 - 14 years) and 10 normally developing boys (9 - 14 years). The concentrations of oxyhemoglobin (oxy-Hb) were measured with frontal probes using a 34-channel NIRS machine while the subjects imitated emotional facial expressions. The increments in the concentration of oxy-Hb in the pars opercularis of the inferior frontal gyrus in autistic subjects were significantly lower than those in the controls. However, the concentrations of oxy-Hb in this area were significantly elevated in autistic subjects after they were trained to imitate emotional facial expressions. The increments in the concentration of oxy-Hb in this area in autistic subjects were positively correlated with the scores on a test of labeling emotional facial expressions. The pars opercularis of the inferior frontal gyrus is an important component of the mirror neuron system. The present results suggest that mirror neurons could be activated by repeated imitation in children with autistic disorder.

  12. Effects of induced sad mood on facial emotion perception in young and older adults.

    Science.gov (United States)

    Lawrie, Louisa; Jackson, Margaret C; Phillips, Louise H

    2018-02-15

    Older adults perceive less intense negative emotion in facial expressions compared to younger counterparts. Prior research has also demonstrated that mood alters facial emotion perception. Nevertheless, there is little evidence evaluating the interactive effects of age and mood on emotion perception. This study investigated the effects of sad mood on younger and older adults' perception of emotional and neutral faces. Participants rated the intensity of stimuli while listening to sad music and in silence. Measures of mood were administered. Younger and older participants rated sad faces as displaying stronger sadness when they experienced sad mood. While younger participants showed no influence of sad mood on happiness ratings of happy faces, older adults rated happy faces as conveying less happiness when they experienced sad mood. This study demonstrates how emotion perception can change when a controlled mood induction procedure is applied to alter mood in young and older participants.

  13. Are there differential deficits in facial emotion recognition between paranoid and non-paranoid schizophrenia? A signal detection analysis.

    Science.gov (United States)

    Huang, Charles Lung-Cheng; Hsiao, Sigmund; Hwu, Hai-Gwo; Howng, Shen-Long

    2013-10-30

    This study assessed facial emotion recognition abilities in subjects with paranoid and non-paranoid schizophrenia using signal detection theory. We explored the differential deficits in facial emotion recognition in 44 paranoid patients with schizophrenia (PS) and 30 non-paranoid patients with schizophrenia (NPS), compared to 80 healthy controls. We used morphed faces with different intensities of emotion and computed the sensitivity index (d') for each emotion. The results showed that performance differed between the schizophrenia and healthy control groups in the recognition of both negative and positive affects. The PS group performed worse than the healthy control group but better than the NPS group in overall performance. Performance differed between the NPS and healthy control groups in the recognition of all basic emotions and neutral faces; between the PS and healthy control groups in the recognition of angry faces; and between the PS and NPS groups in the recognition of happiness, anger, sadness, disgust, and neutral affects. The facial emotion recognition impairment in schizophrenia may reflect a generalized deficit rather than a negative-emotion-specific deficit. The PS group performed worse than the control group, but better than the NPS group, in facial expression recognition, with differential deficits between PS and NPS patients. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  14. Effects of the potential lithium-mimetic, ebselen, on impulsivity and emotional processing.

    Science.gov (United States)

    Masaki, Charles; Sharpley, Ann L; Cooper, Charlotte M; Godlewska, Beata R; Singh, Nisha; Vasudevan, Sridhar R; Harmer, Catherine J; Churchill, Grant C; Sharp, Trevor; Rogers, Robert D; Cowen, Philip J

    2016-07-01

    Lithium remains the most effective treatment for bipolar disorder and also has important effects in lowering suicidal behaviour, a property that may be linked to its ability to diminish impulsive, aggressive behaviour. The antioxidant drug ebselen has been proposed as a possible lithium-mimetic based on its ability in animals to inhibit inositol monophosphatase (IMPase), an action it shares with lithium. The aim of the study was to determine whether treatment with ebselen altered emotional processing and diminished measures of risk-taking behaviour. We studied 20 healthy participants who were tested on two occasions, receiving either ebselen (3600 mg over 24 h) or identical placebo in a double-blind, randomized, cross-over design. Three hours after the final dose of ebselen/placebo, participants completed the Cambridge Gambling Task (CGT) and a task that required the detection of emotional facial expressions (the facial emotion recognition task, FERT). On the CGT, relative to placebo, ebselen reduced delay aversion, while on the FERT it increased the recognition of positive relative to negative facial expressions. The study suggests that, at the dosage used, ebselen can decrease impulsivity and produce a positive bias in emotional processing. These findings have implications for the possible use of ebselen in disorders characterized by impulsive behaviour and dysphoric mood.

  15. Updating schematic emotional facial expressions in working memory: Response bias and sensitivity.

    Science.gov (United States)

    Tamm, Gerly; Kreegipuu, Kairi; Harro, Jaanus; Cowan, Nelson

    2017-01-01

    It is unclear whether positive, negative, or neutral emotional expressions have an advantage in short-term recognition. Moreover, it is unclear from previous studies of working memory for emotional faces whether the effects of emotion reflect response bias or sensitivity. The aim of this study was to compare how schematic emotional expressions (sad, angry, scheming, happy, and neutral) are discriminated and recognized in an updating task (2-back recognition) in a representative birth-cohort sample of young adults. Schematic facial expressions allow control of identity processing, which is separate from expression processing, and have been used extensively in attention research but not much, until now, in working memory research. We found that expressions with a U-curved (i.e., upwardly curved) mouth, namely happy and scheming expressions, favoured a bias towards recognition (i.e., towards indicating that the probe and the stimulus in working memory are the same). Other effects of emotional expression were considerably smaller (1-2% of the variance explained) compared to the large proportion of variance explained by the physical similarity of the items being compared. We suggest that the nature of the stimuli plays a role in this. The present application of signal detection methodology with emotional, schematic faces in a working memory procedure requiring fast comparisons helps to resolve important contradictions that have emerged in the emotional perception literature. Copyright © 2016 Elsevier B.V. All rights reserved.
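
    The two quantities the study separates are standard signal detection measures: sensitivity (d') and response criterion (c), computed from hit and false-alarm rates. A minimal sketch with hypothetical counts (not the study's data or its exact correction choice):

    from scipy.stats import norm

    def dprime_and_c(hits, misses, fas, crs):
        """Return (d', criterion c) with a log-linear correction for 0/1 rates."""
        h = (hits + 0.5) / (hits + misses + 1)
        f = (fas + 0.5) / (fas + crs + 1)
        zh, zf = norm.ppf(h), norm.ppf(f)
        return zh - zf, -(zh + zf) / 2

    # Hypothetical counts of "same as 2-back" responses to happy-face probes.
    d, c = dprime_and_c(hits=40, misses=10, fas=15, crs=35)
    print(f"d' = {d:.2f}, c = {c:.2f}")   # negative c indicates a liberal bias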

  16. A Model of the Perception of Facial Expressions of Emotion by Humans: Research Overview and Perspectives.

    Science.gov (United States)

    Martinez, Aleix; Du, Shichuan

    2012-05-01

    In cognitive science and neuroscience, there have been two leading models describing how humans perceive and classify facial expressions of emotion: the continuous and the categorical model. The continuous model defines each facial expression of emotion as a feature vector in a face space. This model explains, for example, how expressions of emotion can be seen at different intensities. In contrast, the categorical model consists of C classifiers, each tuned to a specific emotion category. This model explains, among other findings, why the images in a morphing sequence between a happy and a surprised face are perceived as either happy or surprised but not something in between. While the continuous model has a more difficult time justifying this latter finding, the categorical model is not as good at explaining how expressions are recognized at different intensities or modes. Most importantly, both models have problems explaining how one can recognize combinations of emotion categories such as happily surprised versus angrily surprised versus surprised. To resolve these issues, in the past several years, we have worked on a revised model that justifies the results reported in the cognitive science and neuroscience literature. This model consists of C distinct continuous spaces. Multiple (compound) emotion categories can be recognized by linearly combining these C face spaces. The dimensions of these spaces are shown to be mostly configural. According to this model, the major task for the classification of facial expressions of emotion is precise, detailed detection of facial landmarks rather than recognition. We provide an overview of the literature justifying the model, show how the resulting model can be employed to build algorithms for the recognition of facial expressions of emotion, and propose research directions for machine learning and computer vision researchers to keep pushing the state of the art in these areas. We also discuss how the model can
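
    A toy sketch of the model's central idea, under heavy simplification (the actual model operates on configural landmark dimensions learned from data, not the hand-picked vectors below): each category has its own continuous space, and a compound expression such as "happily surprised" scores highly in two of them at once.

    import numpy as np

    # Hand-picked 4-D "configural" features (e.g., brow raise, mouth curvature, ...).
    prototypes = {
        "happy":    np.array([0.1, 0.9, 0.0, 0.2]),
        "surprise": np.array([0.8, 0.3, 0.9, 0.1]),
        "angry":    np.array([0.9, 0.0, 0.1, 0.8]),
    }

    def category_scores(face):
        """Cosine similarity of a face vector to each category space."""
        return {k: float(face @ v / (np.linalg.norm(face) * np.linalg.norm(v)))
                for k, v in prototypes.items()}

    # A compound expression as a linear combination of two category spaces.
    face = 0.5 * prototypes["happy"] + 0.5 * prototypes["surprise"]
    scores = category_scores(face)
    print(sorted(scores.items(), key=lambda kv: -kv[1])[:2])  # the two combined categories rank highest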

  17. The effect of comorbid depression on facial and prosody emotion recognition in first-episode schizophrenia spectrum.

    Science.gov (United States)

    Herniman, Sarah E; Allott, Kelly A; Killackey, Eóin; Hester, Robert; Cotton, Sue M

    2017-01-15

    Comorbid depression is common in first-episode schizophrenia spectrum (FES) disorders. Both depression and FES are associated with significant deficits in facial and prosody emotion recognition performance. However, it remains unclear whether people with FES and comorbid depression, compared to those without comorbid depression, have overall poorer emotion recognition, or instead, a different pattern of emotion recognition deficits. The aim of this study was to compare facial and prosody emotion recognition performance between those with and without comorbid depression in FES. This study involved secondary analysis of baseline data from a randomized controlled trial of vocational intervention for young people with first-episode psychosis (N=82; age range: 15-25 years). Those with comorbid depression (n=24) had more accurate recognition of sadness in faces compared to those without comorbid depression. Severity of depressive symptoms was also associated with more accurate recognition of sadness in faces. Such results did not recur for prosody emotion recognition. In addition to the cross-sectional design, limitations of this study include the absence of facial and prosodic recognition of neutral emotions. Findings indicate a mood congruent negative bias in facial emotion recognition in those with comorbid depression and FES, and provide support for cognitive theories of depression that emphasise the role of such biases in the development and maintenance of depression. Longitudinal research is needed to determine whether mood-congruent negative biases are implicated in the development and maintenance of depression in FES, or whether such biases are simply markers of depressed state. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Do Dynamic Compared to Static Facial Expressions of Happiness and Anger Reveal Enhanced Facial Mimicry?

    Directory of Open Access Journals (Sweden)

    Krystyna Rymarczyk

    Full Text Available Facial mimicry is the spontaneous response to others' facial expressions, by which we mirror or match the expression of an interaction partner. Recent evidence suggests that mimicry may not be only an automatic reaction but could depend on many factors, including social context, the type of task in which the participant is engaged, or stimulus properties (dynamic vs. static presentation). In the present study, we investigated the impact of dynamic facial expression and sex differences on facial mimicry and judgment of emotional intensity. Electromyographic recordings were taken from the corrugator supercilii, zygomaticus major, and orbicularis oculi muscles during passive observation of static and dynamic images of happiness and anger. Ratings of the emotional intensity of the facial expressions were also analysed. As predicted, dynamic expressions were rated as more intense than static ones. Compared to static images, dynamic displays of happiness also evoked stronger activity in the zygomaticus major and orbicularis oculi, suggesting that subjects experienced positive emotion. No muscles showed mimicry activity in response to angry faces. Moreover, we found that women exhibited greater zygomaticus major muscle activity in response to dynamic happiness stimuli than static stimuli. Our data support the hypothesis that people mimic positive emotions and confirm the importance of dynamic stimuli in some emotional processing.
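
    Facial EMG mimicry effects of this kind are typically quantified by rectifying and smoothing the raw signal and expressing activity as change from a pre-stimulus baseline. A minimal sketch on a synthetic signal (the sampling rate and window lengths are assumptions, not the study's parameters):

    import numpy as np

    fs = 1000                                   # sampling rate in Hz (assumed)
    rng = np.random.default_rng(2)
    raw = rng.normal(0, 1, 3 * fs)              # 3 s of synthetic zygomaticus EMG
    raw[fs:] *= 3                               # pretend activity rises at 1 s

    rectified = np.abs(raw)
    win = int(0.1 * fs)                         # 100-ms moving average (assumed)
    smoothed = np.convolve(rectified, np.ones(win) / win, mode="same")

    baseline = smoothed[:fs].mean()             # 1-s pre-stimulus baseline
    response = smoothed[fs:].mean()             # post-stimulus period
    print(f"EMG change from baseline: {response - baseline:.2f} (a.u.)")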

  19. Is the emotion recognition deficit associated with frontotemporal dementia caused by selective inattention to diagnostic facial features?

    Science.gov (United States)

    Oliver, Lindsay D; Virani, Karim; Finger, Elizabeth C; Mitchell, Derek G V

    2014-07-01

    Frontotemporal dementia (FTD) is a debilitating neurodegenerative disorder characterized by severely impaired social and emotional behaviour, including emotion recognition deficits. Though fear recognition impairments seen in particular neurological and developmental disorders can be ameliorated by reallocating attention to critical facial features, the possibility that similar benefits can be conferred to patients with FTD has yet to be explored. In the current study, we examined the impact of presenting distinct regions of the face (whole face, eyes-only, and eyes-removed) on the ability to recognize expressions of anger, fear, disgust, and happiness in 24 patients with FTD and 24 healthy controls. A recognition deficit was demonstrated across emotions by patients with FTD relative to controls. Crucially, removal of diagnostic facial features resulted in an appropriate decline in performance for both groups; furthermore, patients with FTD demonstrated a lack of disproportionate improvement in emotion recognition accuracy as a result of isolating critical facial features relative to controls. Thus, unlike some neurological and developmental disorders featuring amygdala dysfunction, the emotion recognition deficit observed in FTD is not likely driven by selective inattention to critical facial features. Patients with FTD also mislabelled negative facial expressions as happy more often than controls, providing further evidence for abnormalities in the representation of positive affect in FTD. This work suggests that the emotional expression recognition deficit associated with FTD is unlikely to be rectified by adjusting selective attention to diagnostic features, as has proven useful in other select disorders. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Mother's Happiness with Cognitive - Executive Functions and Facial Emotional Recognition in School Children with Down Syndrome.

    Science.gov (United States)

    Malmir, Maryam; Seifenaraghi, Maryam; Farhud, Dariush D; Afrooz, G Ali; Khanahmadi, Mohammad

    2015-05-01

    According to the mother's key role in developing the emotional and cognitive abilities of mentally retarded children, and with respect to positive psychology in recent decades, this research was administered to assess the relation between mothers' happiness levels and both cognitive-executive functions (i.e., attention, working memory, inhibition, and planning) and facial emotion recognition ability, two factors in learning and adjustment skills in children with Down syndrome. This study was applied research, and data were analyzed by the Pearson correlation procedure. The population included all school children with Down syndrome (9-12 yr) in Tehran, Iran. Overall, 30 children were selected as a convenience sample. After selection and parental agreement, the Wechsler Intelligence Scale for Children-Revised (WISC-R) was administered to determine each student's IQ, and then mothers were invited to fill out the Oxford Happiness Inventory (OHI). Cognitive-executive functions were evaluated by the following tests: the Continuous Performance Test (CPT), N-Back, the Stroop test (day and night version), and the Tower of London. The Ekman emotion facial expression test was also administered individually to assess facial emotion recognition in the children with Down syndrome. Mothers' happiness levels had a significant positive relation with cognitive-executive functions (attention, working memory, inhibition, and planning) and facial emotion recognition in their children with Down syndrome. Parents' happiness (especially mothers') is a powerful predictor of the cognitive and emotional abilities of their children.

  1. Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality

    Directory of Open Access Journals (Sweden)

    Dhwani Mehta

    2018-02-01

    Full Text Available Extensive possibilities of applications have made emotion recognition ineluctable and challenging in the field of computer science. The use of non-verbal cues such as gestures, body movement, and facial expressions conveys the feeling and the feedback to the user. This discipline of Human–Computer Interaction places reliance on the algorithmic robustness and the sensitivity of the sensor to ameliorate the recognition. Sensors play a significant role in accurate detection by providing a very high-quality input, hence increasing the efficiency and the reliability of the system. Automatic recognition of human emotions would help in teaching social intelligence to machines. This paper presents a brief study of the various approaches and techniques of emotion recognition. The survey covers a succinct review of the databases that are considered as data sets for algorithms detecting emotions by facial expressions. Later, the mixed reality device Microsoft HoloLens (MHL) is introduced for observing emotion recognition in Augmented Reality (AR). A brief introduction of its sensors, their application in emotion recognition, and some preliminary results of emotion recognition using MHL are presented. The paper then concludes by comparing results of emotion recognition by the MHL and a regular webcam.

  2. Nasal Oxytocin Treatment Biases Dogs’ Visual Attention and Emotional Response toward Positive Human Facial Expressions

    Directory of Open Access Journals (Sweden)

    Sanni Somppi

    2017-10-01

    Full Text Available The neuropeptide oxytocin plays a critical role in social behavior and emotion regulation in mammals. The aim of this study was to explore how nasal oxytocin administration affects gazing behavior during emotional perception in domestic dogs. Looking patterns of dogs, as a measure of voluntary attention, were recorded during the viewing of human facial expression photographs. The pupil diameters of dogs were also measured as a physiological index of emotional arousal. In a placebo-controlled within-subjects experimental design, 43 dogs, after having received either oxytocin or placebo (saline) nasal spray treatment, were presented with pictures of unfamiliar male human faces displaying either a happy or an angry expression. We found that, depending on the facial expression, the dogs’ gaze patterns were affected selectively by oxytocin treatment. After receiving oxytocin, dogs fixated less often on the eye regions of angry faces and revisited (glanced back at) more often the eye regions of smiling (happy) faces than after the placebo treatment. Furthermore, following the oxytocin treatment dogs fixated on and revisited the eyes of happy faces significantly more often than the eyes of angry faces. The analysis of dogs’ pupil diameters during viewing of human facial expressions indicated that oxytocin may also have a modulatory effect on dogs’ emotional arousal. While subjects’ pupil sizes were significantly larger when viewing angry faces than happy faces in the control (placebo) treatment condition, oxytocin treatment not only eliminated this effect but caused an opposite pupil response. Overall, these findings suggest that nasal oxytocin administration selectively changes the allocation of attention and emotional arousal in domestic dogs. Oxytocin has the potential to decrease vigilance toward threatening social stimuli and increase the salience of positive social stimuli, thus making the eye gaze of friendly human faces more salient for dogs. Our

  3. Cradling Side Preference Is Associated with Lateralized Processing of Baby Facial Expressions in Females

    Science.gov (United States)

    Huggenberger, Harriet J.; Suter, Susanne E.; Reijnen, Ester; Schachinger, Hartmut

    2009-01-01

    Women's cradling side preference has been related to contralateral hemispheric specialization for processing emotional signals, but not to the processing of a baby's facial expressions. Therefore, 46 nulliparous female volunteers were characterized as left or non-left holders (HG) during a doll holding task. During a signal detection task they were then…

  4. Neural Correlates of Facial Mimicry: Simultaneous Measurements of EMG and BOLD Responses during Perception of Dynamic Compared to Static Facial Expressions

    Science.gov (United States)

    Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona

    2018-01-01

    Facial mimicry (FM) is an automatic response to imitate the facial expressions of others. However, neural correlates of the phenomenon are as yet not well established. We investigated this issue using simultaneously recorded EMG and BOLD signals during perception of dynamic and static emotional facial expressions of happiness and anger. During display presentations, BOLD signals and zygomaticus major (ZM), corrugator supercilii (CS) and orbicularis oculi (OO) EMG responses were recorded simultaneously from 46 healthy individuals. Subjects reacted spontaneously to happy facial expressions with increased EMG activity in ZM and OO muscles and decreased CS activity, which was interpreted as FM. Facial muscle responses correlated with BOLD activity in regions associated with motor simulation of facial expressions [i.e., inferior frontal gyrus, a classical Mirror Neuron System (MNS)]. Further, we also found correlations for regions associated with emotional processing (i.e., insula, part of the extended MNS). It is concluded that FM involves both motor and emotional brain structures, especially during perception of natural emotional expressions. PMID:29467691

  5. Perceptions of Emotion from Facial Expressions are Not Culturally Universal: Evidence from a Remote Culture

    Science.gov (United States)

    Gendron, Maria; Roberson, Debi; van der Vyver, Jacoba Marietta; Barrett, Lisa Feldman

    2014-01-01

    It is widely believed that certain emotions are universally recognized in facial expressions. Recent evidence indicates that Western perceptions (e.g., scowls as anger) depend on cues to US emotion concepts embedded in experiments. Since such cues are a standard feature of the methods used in cross-cultural experiments, we hypothesized that evidence of universality depends on this conceptual context. In our study, participants from the US and the Himba ethnic group sorted images of posed facial expressions into piles by emotion type. Without cues to emotion concepts, Himba participants did not show the presumed “universal” pattern, whereas US participants produced a pattern with presumed universal features. With cues to emotion concepts, participants in both cultures produced sorts that were closer to the presumed “universal” pattern, although substantial cultural variation persisted. Our findings indicate that perceptions of emotion are not universal, but depend on cultural and conceptual contexts. PMID:24708506

  6. Biases in emotional processing are associated with vulnerability to eating disorders over time.

    Science.gov (United States)

    Pringle, A; Harmer, C J; Cooper, M J

    2011-01-01

    Biases in emotional processing are thought to play a role in the maintenance of eating disorders (EDs). In a previous study (Pringle et al., 2010), we were able to demonstrate that biases in the processing of negative self beliefs (a self-schema processing task), facial expressions of emotion (a facial expression recognition task) and information relating to eating, shape and weight (an emotional Stroop) were all predictive of the level of subclinical ED symptoms (used here as a measure of risk) cross-sectionally in a vulnerable sample of dieters. The present study was a 12-month follow up of the participants from Pringle et al. (2010). Longitudinally, greater endorsement of ED relevant and depression relevant negative self beliefs in the self-schema processing task at time 1 was related to subclinical ED symptoms (level of risk) 12 months later at time 2. Compared to the cross-sectional study, there was no clear relationship between performance on the facial expression recognition task, the emotional Stroop task and level of risk 12 months later. Although these findings are preliminary, one tentative interpretation may be that whilst biases in the processing of ED-specific stimuli are predictive of level of risk at a given moment, over time less specific stimuli relating to beliefs about the self, including mood related variables, are more closely related to level of risk. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. Joint recognition-expression impairment of facial emotions in Huntington's disease despite intact understanding of feelings.

    Science.gov (United States)

    Trinkler, Iris; Cleret de Langavant, Laurent; Bachoud-Lévi, Anne-Catherine

    2013-02-01

    Patients with Huntington's disease (HD), a neurodegenerative disorder that causes major motor impairments, also show cognitive and emotional deficits. While their deficit in recognising emotions has been explored in depth, little is known about their ability to express emotions and understand their feelings. If these faculties were impaired, patients might not only mis-read emotion expressions in others but their own emotions might be mis-interpreted by others as well, or thirdly, they might have difficulties understanding and describing their feelings. We compared the performance of recognition and expression of facial emotions in 13 HD patients with mild motor impairments but without significant bucco-facial abnormalities, and 13 controls matched for age and education. Emotion recognition was investigated in a forced-choice recognition test (FCR), and emotion expression by filming participants while they mimed the six basic emotional facial expressions (anger, disgust, fear, surprise, sadness and joy) to the experimenter. The films were then segmented into 60 stimuli per participant and four external raters performed a FCR on this material. Further, we tested understanding of feelings in self (alexithymia) and others (empathy) using questionnaires. Both recognition and expression were impaired across different emotions in HD compared to controls and recognition and expression scores were correlated. By contrast, alexithymia and empathy scores were very similar in HD and controls. This might suggest that emotion deficits in HD might be tied to the expression itself. Because similar emotion recognition-expression deficits are also found in Parkinson's Disease and vascular lesions of the striatum, our results further confirm the importance of the striatum for emotion recognition and expression, while access to the meaning of feelings relies on a different brain network, and is spared in HD. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. Spatiotemporal neural network dynamics for the processing of dynamic facial expressions

    Science.gov (United States)

    Sato, Wataru; Kochiyama, Takanori; Uono, Shota

    2015-01-01

    The dynamic facial expressions of emotion automatically elicit multifaceted psychological activities; however, the temporal profiles and dynamic interaction patterns of brain activities remain unknown. We investigated these issues using magnetoencephalography. Participants passively observed dynamic facial expressions of fear and happiness, or dynamic mosaics. Source-reconstruction analyses utilizing functional magnetic-resonance imaging data revealed higher activation in broad regions of the bilateral occipital and temporal cortices in response to dynamic facial expressions than in response to dynamic mosaics at 150–200 ms and some later time points. The right inferior frontal gyrus exhibited higher activity for dynamic faces versus mosaics at 300–350 ms. Dynamic causal-modeling analyses revealed that dynamic faces activated the dual visual routes and visual–motor route. Superior influences of feedforward and feedback connections were identified before and after 200 ms, respectively. These results indicate that hierarchical, bidirectional neural network dynamics within a few hundred milliseconds implement the processing of dynamic facial expressions. PMID:26206708

  9. Spatiotemporal neural network dynamics for the processing of dynamic facial expressions.

    Science.gov (United States)

    Sato, Wataru; Kochiyama, Takanori; Uono, Shota

    2015-07-24

    The dynamic facial expressions of emotion automatically elicit multifaceted psychological activities; however, the temporal profiles and dynamic interaction patterns of brain activities remain unknown. We investigated these issues using magnetoencephalography. Participants passively observed dynamic facial expressions of fear and happiness, or dynamic mosaics. Source-reconstruction analyses utilizing functional magnetic-resonance imaging data revealed higher activation in broad regions of the bilateral occipital and temporal cortices in response to dynamic facial expressions than in response to dynamic mosaics at 150-200 ms and some later time points. The right inferior frontal gyrus exhibited higher activity for dynamic faces versus mosaics at 300-350 ms. Dynamic causal-modeling analyses revealed that dynamic faces activated the dual visual routes and visual-motor route. Superior influences of feedforward and feedback connections were identified before and after 200 ms, respectively. These results indicate that hierarchical, bidirectional neural network dynamics within a few hundred milliseconds implement the processing of dynamic facial expressions.

  10. MEG Evidence for Dynamic Amygdala Modulations by Gaze and Facial Emotions

    Science.gov (United States)

    Dumas, Thibaud; Dubal, Stéphanie; Attal, Yohan; Chupin, Marie; Jouvent, Roland; Morel, Shasha; George, Nathalie

    2013-01-01

    Background Amygdala is a key brain region for face perception. While the role of amygdala in the perception of facial emotion and gaze has been extensively highlighted with fMRI, the unfolding in time of amygdala responses to emotional versus neutral faces with different gaze directions is scarcely known. Methodology/Principal Findings Here we addressed this question in healthy subjects using MEG combined with an original source imaging method based on individual amygdala volume segmentation and the localization of sources in the amygdala volume. We found an early peak of amygdala activity that was enhanced for fearful relative to neutral faces between 130 and 170 ms. The effect of emotion was again significant in a later time range (310–350 ms). Moreover, the amygdala response was greater for direct relative to averted gaze between 190 and 350 ms, and this effect was selective for fearful faces in the right amygdala. Conclusion Altogether, our results show that the amygdala is involved in the processing and integration of emotion and gaze cues from faces in different time ranges, thus underlining its role in multiple stages of face perception. PMID:24040190

  11. Behavioral and Neuroimaging Evidence for Facial Emotion Recognition in Elderly Korean Adults with Mild Cognitive Impairment, Alzheimer’s Disease, and Frontotemporal Dementia

    Directory of Open Access Journals (Sweden)

    Soowon Park

    2017-11-01

    Full Text Available Background: Facial emotion recognition (FER) is impaired in individuals with frontotemporal dementia (FTD) and Alzheimer’s disease (AD) when compared to healthy older adults. Since deficits in emotion recognition are closely related to caregiver burden or social interactions, researchers have fundamental interest in FER performance in patients with dementia. Purpose: The purpose of this study was to identify the performance profiles of six facial emotions (i.e., fear, anger, disgust, sadness, surprise, and happiness) and neutral faces measured among Korean healthy controls (HCs), and those with mild cognitive impairment (MCI), AD, and FTD. Additionally, the neuroanatomical correlates of facial emotions were investigated. Methods: A total of 110 (33 HC, 32 MCI, 32 AD, 13 FTD) older adult participants were recruited from two different medical centers in metropolitan areas of South Korea. These individuals underwent an FER test that was used to assess the recognition of emotions or absence of emotion (neutral) in 35 facial stimuli. Repeated measures two-way analyses of variance were used to examine the distinct profiles of emotional recognition among the four groups. We also performed brain imaging and voxel-based morphometry (VBM) on the participants to examine the associations between FER scores and gray matter volume. Results: The mean score of negative emotion recognition (i.e., fear, anger, disgust, and sadness) clearly discriminated FTD participants from individuals with MCI and AD and HC [F(3,106) = 10.829, p < 0.001, η2 = 0.235], whereas the mean score of positive emotion recognition (i.e., surprise and happiness) did not. A VBM analysis showed negative emotions were correlated with gray matter volume of anterior temporal regions, whereas positive emotions were related to gray matter volume of fronto-parietal regions. Conclusion: Impairment of negative FER in patients with FTD is cross-cultural. The discrete neural correlates of FER indicate that

  12. Behavioral and Neuroimaging Evidence for Facial Emotion Recognition in Elderly Korean Adults with Mild Cognitive Impairment, Alzheimer’s Disease, and Frontotemporal Dementia

    Science.gov (United States)

    Park, Soowon; Kim, Taehoon; Shin, Seong A; Kim, Yu Kyeong; Sohn, Bo Kyung; Park, Hyeon-Ju; Youn, Jung-Hae; Lee, Jun-Young

    2017-01-01

    Background: Facial emotion recognition (FER) is impaired in individuals with frontotemporal dementia (FTD) and Alzheimer’s disease (AD) when compared to healthy older adults. Since deficits in emotion recognition are closely related to caregiver burden or social interactions, researchers have fundamental interest in FER performance in patients with dementia. Purpose: The purpose of this study was to identify the performance profiles of six facial emotions (i.e., fear, anger, disgust, sadness, surprise, and happiness) and neutral faces measured among Korean healthy controls (HCs), and those with mild cognitive impairment (MCI), AD, and FTD. Additionally, the neuroanatomical correlates of facial emotions were investigated. Methods: A total of 110 (33 HC, 32 MCI, 32 AD, 13 FTD) older adult participants were recruited from two different medical centers in metropolitan areas of South Korea. These individuals underwent an FER test that was used to assess the recognition of emotions or absence of emotion (neutral) in 35 facial stimuli. Repeated measures two-way analyses of variance were used to examine the distinct profiles of emotional recognition among the four groups. We also performed brain imaging and voxel-based morphometry (VBM) on the participants to examine the associations between FER scores and gray matter volume. Results: The mean score of negative emotion recognition (i.e., fear, anger, disgust, and sadness) clearly discriminated FTD participants from individuals with MCI and AD and HC [F(3,106) = 10.829, p < 0.001, η2 = 0.235], whereas the mean score of positive emotion recognition (i.e., surprise and happiness) did not. A VBM analysis showed negative emotions were correlated with gray matter volume of anterior temporal regions, whereas positive emotions were related to gray matter volume of fronto-parietal regions. Conclusion: Impairment of negative FER in patients with FTD is cross-cultural. The discrete neural correlates of FER indicate that emotional recognition processing is a multi-modal system in the brain
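
    The repeated-measures analysis reported in this record (diagnostic group between subjects, emotion valence within subjects) could be reproduced along the following lines. This is a minimal sketch on simulated long-format data using pingouin's mixed_anova; the column names, group sizes, and the simulated FTD deficit are illustrative, not the study's data.

```python
# Sketch of the group x emotion analysis: a mixed-design ANOVA with
# diagnostic group as the between factor and emotion valence as the
# within factor. Data below are simulated; column names are arbitrary.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
groups = ["HC"] * 33 + ["MCI"] * 32 + ["AD"] * 32 + ["FTD"] * 13
rows = []
for sid, g in enumerate(groups):
    for emo in ["negative", "positive"]:
        base = 0.8 if emo == "positive" else 0.7
        if g == "FTD" and emo == "negative":
            base -= 0.15  # simulated FTD deficit for negative emotions
        rows.append({"subject": sid, "group": g, "emotion": emo,
                     "score": base + rng.normal(0, 0.05)})
df = pd.DataFrame(rows)

aov = pg.mixed_anova(data=df, dv="score", within="emotion",
                     subject="subject", between="group")
print(aov[["Source", "DF1", "DF2", "F", "p-unc"]])
```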

  13. DAT by perceived MC interaction on human prefrontal activity and connectivity during emotion processing.

    Science.gov (United States)

    Taurisano, Paolo; Blasi, Giuseppe; Romano, Raffaella; Sambataro, Fabio; Fazio, Leonardo; Gelao, Barbara; Ursini, Gianluca; Lo Bianco, Luciana; Di Giorgio, Annabella; Ferrante, Francesca; Papazacharias, Apostolos; Porcelli, Annamaria; Sinibaldi, Lorenzo; Popolizio, Teresa; Bertolino, Alessandro

    2013-12-01

    Maternal care (MC) and dopamine modulate brain activity during emotion processing in inferior frontal gyrus (IFG), striatum and amygdala. Reuptake of dopamine from the synapse is performed by the dopamine transporter (DAT), whose abundance is predicted by variation in its gene (DAT 3'VNTR; 10 > 9-repeat alleles). Here, we investigated the interaction between perceived MC and DAT 3'VNTR genotype on brain activity during processing of aversive facial emotional stimuli. Sixty-one healthy subjects were genotyped for DAT 3'VNTR and categorized as low or high MC individuals. They underwent functional magnetic resonance imaging while performing a task requiring gender discrimination of facial stimuli with angry, fearful or neutral expressions. An interaction between facial expression, DAT genotype and MC was found in left IFG, such that low MC and homozygosity for the 10-repeat allele are associated with greater activity during processing of fearful faces. This greater activity was also inversely correlated with a measure of emotion control as scored with the Big Five Questionnaire. Moreover, MC and DAT genotype described a double dissociation on functional connectivity between IFG and amygdala. These findings suggest that perceived early parental bonding may interact with DAT 3'VNTR genotype in modulating brain activity during emotionally relevant inputs.

  14. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study.

    Science.gov (United States)

    Zhishuai, Jin; Hong, Liu; Daxing, Wu; Pin, Zhang; Xuejing, Lu

    2017-01-01

    Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  15. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study

    Directory of Open Access Journals (Sweden)

    Jin Zhishuai

    2017-01-01

    Full Text Available Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.
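
    For readers unfamiliar with how ERP component amplitudes such as the N2 are quantified, the sketch below shows the common mean-amplitude approach: average over trials, then over channels and a time window. The window bounds (200–350 ms), sampling rate, and array layout are assumptions, not details taken from this study.

```python
# Sketch: quantifying an ERP component (e.g., the N2) as the mean
# amplitude in a time window. Epochs are assumed to be a NumPy array
# (trials x channels x samples); the window and rate are illustrative.
import numpy as np

def mean_amplitude(epochs, times, tmin=0.200, tmax=0.350, channels=None):
    """Average voltage in [tmin, tmax] seconds over selected channels."""
    mask = (times >= tmin) & (times <= tmax)
    data = epochs if channels is None else epochs[:, channels, :]
    erp = data.mean(axis=0)          # average over trials -> ERP
    return erp[:, mask].mean()       # then over channels and window

# toy example: 40 trials, 32 channels, 500 samples at 500 Hz from -0.2 s
times = np.arange(500) / 500.0 - 0.2
epochs = np.random.randn(40, 32, 500) * 2e-6
print(mean_amplitude(epochs, times))
```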

  16. When familiarity breeds accuracy: cultural exposure and facial emotion recognition.

    Science.gov (United States)

    Elfenbein, Hillary Anger; Ambady, Nalini

    2003-08-01

    Two studies provide evidence for the role of cultural familiarity in recognizing facial expressions of emotion. For Chinese located in China and the United States, Chinese Americans, and non-Asian Americans, accuracy and speed in judging Chinese and American emotions were greater with greater participant exposure to the group posing the expressions. Likewise, Tibetans residing in China and Africans residing in the United States were faster and more accurate when judging emotions expressed by host versus nonhost society members. These effects extended across generations of Chinese Americans, seemingly independent of ethnic or biological ties. Results suggest that the universal affect system governing emotional expression may be characterized by subtle differences in style across cultures, which become more familiar with greater cultural contact.

  17. Emotional sounds modulate early neural processing of emotional pictures

    Directory of Open Access Journals (Sweden)

    Antje B M Gerdes

    2013-10-01

    Full Text Available In our natural environment, emotional information is conveyed by converging visual and auditory information; multimodal integration is of utmost importance. In the laboratory, however, emotion researchers have mostly focused on the examination of unimodal stimuli. Few existing studies on multimodal emotion processing have focused on human communication such as the integration of facial and vocal expressions. Extending the concept of multimodality, the current study examines how the neural processing of emotional pictures is influenced by simultaneously presented sounds. Twenty pleasant, unpleasant, and neutral pictures of complex scenes were presented to 22 healthy participants. On the critical trials these pictures were paired with pleasant, unpleasant and neutral sounds. Sound presentation started 500 ms before picture onset and each stimulus presentation lasted for 2 s. EEG was recorded from 64 channels and ERP analyses focused on the picture onset. In addition, valence and arousal ratings were obtained. Previous findings for the neural processing of emotional pictures were replicated. Specifically, unpleasant compared to neutral pictures were associated with an increased parietal P200 and a more pronounced centroparietal late positive potential (LPP), independent of the accompanying sound valence. For audiovisual stimulation, increased parietal P100 and P200 were found in response to all pictures which were accompanied by unpleasant or pleasant sounds compared to pictures with neutral sounds. Most importantly, incongruent audiovisual pairs of unpleasant pictures and pleasant sounds enhanced parietal P100 and P200 compared to pairings with congruent sounds. Taken together, the present findings indicate that emotional sounds modulate early stages of visual processing and, therefore, provide an avenue by which multimodal experience may enhance perception.

  18. Mistakes, Too Few to Mention? Impaired Self-conscious Emotional Processing of Errors in the Behavioral Variant of Frontotemporal Dementia

    Directory of Open Access Journals (Sweden)

    Carole S. Scherling

    2017-10-01

    Full Text Available Anosognosia, or lack of awareness of one's deficits, is a core feature of the behavioral variant of frontotemporal dementia (bvFTD). We hypothesized that this deficit has its origins in failed emotional processing of errors. We studied autonomic and facial emotional reactivity to errors in patients with bvFTD (n = 17), Alzheimer's disease (AD, n = 20), and healthy controls (HC, n = 35) during performance of a timed two-alternative-choice button press task. Performance-related behavioral responses to errors were quantified using rates of error correction and post-error slowing of reaction times. Facial emotional responses were measured by monitoring facial reactivity via video and subsequently coding the type, duration and intensity of all emotional reactions. Skin conductance response (SCR) was measured via noninvasive sensors. SCR and total score for each facial emotion expression were quantified for each trial. Facial emotions were grouped into self-conscious (amusement, embarrassment) and negative (fear, sadness, anger, disgust, contempt) emotions. HCs corrected 99.4% of their errors. BvFTD patients corrected 94% (not statistically different compared with HC) and AD patients corrected 74.8% of their errors (p < 0.05 compared with HC and bvFTD). All groups showed similar post-error slowing. Errors in HCs were associated with greater facial reactivity and SCRs compared with non-error trials, including both negative and self-conscious emotions. BvFTD patients failed to produce self-conscious emotions or an increase in SCR for errors, although they did produce negative emotional responses to a similar degree as HCs. AD patients showed no deficit in facial reactivity to errors. Although SCR was generally reduced in AD during error trials, they showed a preserved increase in SCR for errors relative to correct trials. These results demonstrate a specific deficit in emotional responses to errors in bvFTD, encompassing both physiological response and a specific deficit in self

  19. What a Smile Means: Contextual Beliefs and Facial Emotion Expressions in a Non-verbal Zero-Sum Game.

    Science.gov (United States)

    Pádua Júnior, Fábio P; Prado, Paulo H M; Roeder, Scott S; Andrade, Eduardo B

    2016-01-01

    Research into the authenticity of facial emotion expressions often focuses on the physical properties of the face while paying little attention to the role of beliefs in emotion perception. Further, the literature most often investigates how people express a pre-determined emotion rather than what facial emotion expressions people strategically choose to express. To fill these gaps, this paper proposes a non-verbal zero-sum game - the Face X Game - to assess the role of contextual beliefs and strategic displays of facial emotion expression in interpersonal interactions. This new research paradigm was used in a series of three studies, where two participants are asked to play the role of the sender (individual expressing emotional information on his/her face) or the observer (individual interpreting the meaning of that expression). Study 1 examines the outcome of the game with reference to the sex of the pair, where senders won more frequently when the pair comprised at least one female. Study 2 examines the strategic display of facial emotion expressions. The outcome of the game was again contingent upon the sex of the pair. Among female pairs, senders won the game more frequently, replicating the pattern of results from study 1. We also demonstrate that senders who strategically express an emotion incongruent with the valence of the event (e.g., smile after seeing a negative event) are able to mislead observers, who tend to hold a congruent belief about the meaning of the emotion expression. If sending an incongruent signal helps to explain why female senders win more frequently, it logically follows that female observers were more prone to hold a congruent, and therefore inaccurate, belief. This prospect implies that while female senders are willing and/or capable of displaying fake smiles, paired-female observers are not taking this into account. Study 3 investigates the role of contextual factors by manipulating female observers' beliefs. When prompted

  20. Social Adjustment, Academic Adjustment, and the Ability to Identify Emotion in Facial Expressions of 7-Year-Old Children

    Science.gov (United States)

    Goodfellow, Stephanie; Nowicki, Stephen, Jr.

    2009-01-01

    The authors aimed to examine the possible association between (a) accurately reading emotion in facial expressions and (b) social and academic competence among elementary school-aged children. Participants were 840 7-year-old children who completed a test of the ability to read emotion in facial expressions. Teachers rated children's social and…

  1. Recognition of Schematic Facial Displays of Emotion in Parents of Children with Autism

    Science.gov (United States)

    Palermo, Mark T.; Pasqualetti, Patrizio; Barbati, Giulia; Intelligente, Fabio; Rossini, Paolo Maria

    2006-01-01

    Performance on an emotional labeling task in response to schematic facial patterns representing five basic emotions without the concurrent presentation of a verbal category was investigated in 40 parents of children with autism and 40 matched controls. "Autism fathers" performed worse than "autism mothers," who performed worse than controls in…

  2. Classification of facial-emotion expression in the application of psychotherapy using Viola-Jones and Edge-Histogram of Oriented Gradient.

    Science.gov (United States)

    Candra, Henry; Yuwono, Mitchell; Rifai Chai; Nguyen, Hung T; Su, Steven

    2016-08-01

    Psychotherapy requires appropriate recognition of patients' facial-emotion expressions to provide proper treatment during psychotherapy sessions. To address this need, this paper proposes a facial emotion recognition system using a combination of the Viola-Jones detector with a feature descriptor termed Edge-Histogram of Oriented Gradients (E-HOG). The performance of the proposed method is compared across various feature sources, including the face, the eyes, the mouth, as well as both the eyes and the mouth. Seven classes of basic emotions have been successfully identified with 96.4% accuracy using a multi-class Support Vector Machine (SVM). The proposed descriptor E-HOG is much leaner to compute than traditional HOG, as shown by a significant improvement in processing time as high as 1833.33% (p-value = 2.43E-17) with a slight reduction in accuracy of only 1.17% (p-value = 0.0016).
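
    A sketch of this detect-describe-classify pipeline follows. The E-HOG descriptor itself is not specified in the record, so scikit-image's standard HOG stands in for it, and the training demo runs on random stand-in crops purely so that the code executes end to end.

```python
# Sketch of the pipeline: Viola-Jones detection, a HOG descriptor over
# the face crop, and a multi-class SVM. Standard HOG stands in for the
# paper's E-HOG variant; training data below are random stand-ins.
import cv2
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_descriptor(gray_img):
    """Detect the first face and return its HOG descriptor (or None)."""
    faces = detector.detectMultiScale(gray_img, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    crop = cv2.resize(gray_img[y:y + h, x:x + w], (64, 64))
    return hog(crop, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

# Toy training demo on random 64x64 "crops" for 7 emotion classes.
rng = np.random.default_rng(0)
X = np.array([hog(rng.random((64, 64)), orientations=9,
                  pixels_per_cell=(8, 8), cells_per_block=(2, 2))
              for _ in range(70)])
y = np.repeat(np.arange(7), 10)
clf = SVC(kernel="linear", decision_function_shape="ovr").fit(X, y)
print("training accuracy on toy data:", clf.score(X, y))
```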

  3. Impaired Recognition of Facially Expressed Emotions in Different Groups of Patients with Sleep Disorders.

    Science.gov (United States)

    Crönlein, Tatjana; Langguth, Berthold; Eichhammer, Peter; Busch, Volker

    2016-01-01

    Recently it has been shown that acute sleep loss has a direct impact on emotional processing in healthy individuals. Here we studied the effect of chronically disturbed sleep on emotional processing by investigating two samples of patients with sleep disorders. 25 patients with psychophysiologic insomnia (23 women and 2 men, mean age 51.6, SD 10.9 years), 19 patients with sleep apnea syndrome (4 women and 15 men, mean age 51.9, SD 11.1) and a control sample of 24 subjects with normal sleep (15 women and 9 men, mean age 45.3, SD 8.8) completed a Facial Expressed Emotion Labelling (FEEL) task, requiring participants to categorize and rate the intensity of six emotional expression categories: anger, anxiety, fear, happiness, disgust and sadness. Differences in FEEL score and its subscales among the three samples were analysed using ANOVA with gender as a covariate. Both patients with psychophysiologic insomnia and patients with sleep apnea showed significantly lower performance in the FEEL test as compared to the control group. Differences were seen in the scales happiness and sadness. Patient groups did not differ from each other. By demonstrating that previously known effects of acute sleep deprivation on emotional processing can be extended to persons experiencing chronically disturbed sleep, our data contribute to a deeper understanding of the relationship between sleep loss and emotions.

  4. Impaired Recognition of Facially Expressed Emotions in Different Groups of Patients with Sleep Disorders.

    Directory of Open Access Journals (Sweden)

    Tatjana Crönlein

    Full Text Available Recently it has been shown that acute sleep loss has a direct impact on emotional processing in healthy individuals. Here we studied the effect of chronically disturbed sleep on emotional processing by investigating two samples of patients with sleep disorders. 25 patients with psychophysiologic insomnia (23 women and 2 men, mean age 51.6, SD 10.9 years), 19 patients with sleep apnea syndrome (4 women and 15 men, mean age 51.9, SD 11.1) and a control sample of 24 subjects with normal sleep (15 women and 9 men, mean age 45.3, SD 8.8) completed a Facial Expressed Emotion Labelling (FEEL) task, requiring participants to categorize and rate the intensity of six emotional expression categories: anger, anxiety, fear, happiness, disgust and sadness. Differences in FEEL score and its subscales among the three samples were analysed using ANOVA with gender as a covariate. Both patients with psychophysiologic insomnia and patients with sleep apnea showed significantly lower performance in the FEEL test as compared to the control group. Differences were seen in the scales happiness and sadness. Patient groups did not differ from each other. By demonstrating that previously known effects of acute sleep deprivation on emotional processing can be extended to persons experiencing chronically disturbed sleep, our data contribute to a deeper understanding of the relationship between sleep loss and emotions.

  5. Not just fear and sadness: meta-analytic evidence of pervasive emotion recognition deficits for facial and vocal expressions in psychopathy.

    Science.gov (United States)

    Dawel, Amy; O'Kearney, Richard; McKone, Elinor; Palermo, Romina

    2012-11-01

    The present meta-analysis aimed to clarify whether deficits in emotion recognition in psychopathy are restricted to certain emotions and modalities or whether they are more pervasive. We also attempted to assess the influence of other important variables: age, and the affective factor of psychopathy. A systematic search of electronic databases and a subsequent manual search identified 26 studies that included 29 experiments (N = 1376) involving six emotion categories (anger, disgust, fear, happiness, sadness, surprise) across three modalities (facial, vocal, postural). Meta-analyses found evidence of pervasive impairments across modalities (facial and vocal) with significant deficits evident for several emotions (i.e., not only fear and sadness) in both adults and children/adolescents. These results are consistent with recent theorizing that the amygdala, which is believed to be dysfunctional in psychopathy, has a broad role in emotion processing. We discuss limitations of the available data that restrict the ability of meta-analysis to consider the influence of age and separate the sub-factors of psychopathy, highlighting important directions for future research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Facial and prosodic emotion recognition in social anxiety disorder.

    Science.gov (United States)

    Tseng, Huai-Hsuan; Huang, Yu-Lien; Chen, Jian-Ting; Liang, Kuei-Yu; Lin, Chao-Cheng; Chen, Sue-Huei

    2017-07-01

    Patients with social anxiety disorder (SAD) have a cognitive preference to negatively evaluate emotional information. In particular, the preferential biases in prosodic emotion recognition in SAD have been much less explored. The present study aims to investigate whether SAD patients retain negative evaluation biases across visual and auditory modalities when given sufficient response time to recognise emotions. Thirty-one SAD patients and 31 age- and gender-matched healthy participants completed a culturally suitable non-verbal emotion recognition task and received clinical assessments for social anxiety and depressive symptoms. A repeated measures analysis of variance was conducted to examine group differences in emotion recognition. Compared to healthy participants, SAD patients were significantly less accurate at recognising facial and prosodic emotions, and spent more time on emotion recognition. The differences were mainly driven by the lower accuracy and longer reaction times for recognising fearful emotions in SAD patients. Within the SAD patients, lower accuracy of sad face recognition was associated with higher severity of depressive and social anxiety symptoms, particularly with avoidance symptoms. These findings may represent a cross-modality pattern of avoidance in the later stage of identifying negative emotions in SAD. This pattern may be linked to clinical symptom severity.

  7. A neuroendocrine account of facial mimicry and its dynamic modulation

    NARCIS (Netherlands)

    Kraaijenvanger, Eline J.; Hofman, Dennis; Bos, Peter A.

    2017-01-01

    Facial expressions are considered central in conveying information about one's emotional state. During social encounters, facial expressions of another individual are often automatically imitated by the observer, a process referred to as ‘facial mimicry’. This process is assumed to facilitate

  8. Mother’s Happiness with Cognitive - Executive Functions and Facial Emotional Recognition in School Children with Down Syndrome

    Science.gov (United States)

    MALMIR, Maryam; SEIFENARAGHI, Maryam; FARHUD, Dariush D.; AFROOZ, G.Ali; KHANAHMADI, Mohammad

    2015-01-01

    Background: Given the mother’s key role in fostering the emotional and cognitive abilities of mentally retarded children, and in light of positive psychology in recent decades, this research was administered to assess the relation between the mother’s happiness level and both cognitive-executive functions (i.e. attention, working memory, inhibition and planning) and facial emotional recognition ability, two factors in learning and adjustment skills in mentally retarded children with Down syndrome. Methods: This study was an applied research project and data were analyzed by the Pearson correlation procedure. The population included all school children with Down syndrome (9–12 yr) from Tehran, Iran. Overall, 30 children were selected as an available (convenience) sample. After selection and the agreement of parents, the Wechsler Intelligence Scale for Children-Revised (WISC-R) was administered to determine each student’s IQ, and mothers were then invited to fill out the Oxford Happiness Inventory (OHI). Cognitive-executive functions were evaluated by the following tests: Continuous Performance Test (CPT), N-Back, Stroop test (day and night version) and Tower of London. The Ekman emotion facial expression test was also administered individually to assess facial emotional recognition in children with Down syndrome. Results: The mother’s happiness level had a significant positive relation with cognitive-executive functions (attention, working memory, inhibition and planning) and facial emotional recognition in her children with Down syndrome. Conclusion: Parents’ happiness (especially mothers’) is a powerful predictor of the cognitive and emotional abilities of their children. PMID:26284205
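
    A minimal sketch of the Pearson correlation procedure used here; the paired score arrays are invented stand-ins for the OHI totals and the children's test scores.

```python
# Sketch of the Pearson correlation analysis described above, with
# hypothetical scores (one value per mother-child pair).
from scipy.stats import pearsonr

happiness = [42, 55, 61, 38, 70, 49, 58, 66, 44, 52]          # OHI totals
child_emotion_recog = [18, 22, 25, 16, 27, 20, 23, 26, 17, 21]  # test scores

r, p = pearsonr(happiness, child_emotion_recog)
print(f"r = {r:.2f}, p = {p:.4f}")
```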

  9. Influences of sex, type and intensity of emotion in the recognition of static and dynamic facial expressions

    Directory of Open Access Journals (Sweden)

    Nelson Torro-Alves

    2013-01-01

    Full Text Available The ecological validity of static and intense facial expressions in emotional recognition has been questioned. Recent studies have recommended the use of facial stimuli more compatible with the natural conditions of social interaction, which involve motion and variations in emotional intensity. In this study, we compared the recognition of static and dynamic facial expressions of happiness, fear, anger and sadness, presented at four emotional intensities (25%, 50%, 75% and 100%). Twenty volunteers (9 women and 11 men, aged between 19 and 31 years) took part in the study. The experiment consisted of two sessions in which participants had to identify the emotion of static (photographs) and dynamic (videos) displays of facial expressions on the computer screen. Mean accuracy was submitted to an ANOVA for repeated measures with the model: 2 sexes x [2 conditions x 4 expressions x 4 intensities]. We observed an advantage for the recognition of dynamic expressions of happiness and fear compared to the static stimuli (p < .05). Analysis of interactions showed that expressions at 25% intensity were better recognized in the dynamic condition (p < .05). The addition of motion contributed to improved recognition, especially in male participants (p < .05). We concluded that the effect of motion varies as a function of the type of emotion, the intensity of the expression and the sex of the participant. These results support the hypothesis that dynamic stimuli have more ecological validity and are more appropriate for research on emotions.

  10. Processing of facial affect in social drinkers: a dose-response study of alcohol using dynamic emotion expressions.

    Science.gov (United States)

    Kamboj, Sunjeev K; Joye, Alyssa; Bisby, James A; Das, Ravi K; Platt, Bradley; Curran, H Valerie

    2013-05-01

    Studies of affect recognition can inform our understanding of the interpersonal effects of alcohol and help develop a more complete neuropsychological profile of this drug. The objective of the study was to examine affect recognition in social drinkers using a novel dynamic affect-recognition task, sampling performance across a range of evolutionarily significant target emotions and neutral expressions. Participants received 0, 0.4 or 0.8 g/kg alcohol in a double-blind, independent groups design. Relatively naturalistic changes in facial expression, from neutral (mouth open) to increasing intensities of target emotions, as well as neutral (mouth closed), were simulated using computer-generated dynamic morphs. Accuracy and reaction time were measured, and a two-high-threshold model was applied to hit and false-alarm data to determine sensitivity and response bias. While there was no effect on the principal emotion expressions (happiness, sadness, fear, anger and disgust), participants administered 0.4 g/kg alcohol tended to show an enhanced response bias toward neutral expressions compared to those receiving 0.8 g/kg alcohol and placebo. Exploration of this effect suggested an accompanying tendency to misattribute neutrality to sad expressions following the 0.4 g/kg dose. The 0.4 g/kg dose, but not 0.8 g/kg, produced a limited and specific modification in affect recognition, evidenced by a neutral response bias and possibly an accompanying tendency to misclassify sad expressions as neutral. In light of previous findings on involuntary negative memory following the 0.4 g/kg dose, we suggest that moderate, but not high, doses of alcohol have a special relevance to emotional processing in social drinkers.
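
    For concreteness, the two-high-threshold computation might look like the sketch below, using the common Snodgrass and Corwin (1988) parameterization (sensitivity Pr = H - FA, bias Br = FA / (1 - Pr)). Whether the study used exactly this parameterization is an assumption, and the trial counts are invented.

```python
# Sketch of a two-high-threshold analysis: discrimination Pr and
# response bias Br from hit and false-alarm counts (illustrative).
def two_high_threshold(hits, misses, false_alarms, correct_rejections):
    h = hits / (hits + misses)                              # hit rate
    fa = false_alarms / (false_alarms + correct_rejections)  # FA rate
    pr = h - fa                                             # sensitivity
    br = fa / (1.0 - pr) if pr < 1.0 else float("nan")      # bias
    return pr, br

# e.g., neutral-expression trials for one participant
pr, br = two_high_threshold(hits=38, misses=12,
                            false_alarms=20, correct_rejections=30)
print(f"Pr = {pr:.2f}, Br = {br:.2f}")
```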

  11. Schematic drawings of facial expressions for emotion recognition and interpretation by preschool-aged children.

    Science.gov (United States)

    MacDonald, P M; Kirkpatrick, S W; Sullivan, L A

    1996-11-01

    Schematic drawings of facial expressions were evaluated as a possible assessment tool for research on emotion recognition and interpretation involving young children. A subset of Ekman and Friesen's (1976) Pictures of Facial Affect was used as the standard for comparison. Preschool children (N = 138) were shown drawings and photographs in two context conditions for six emotions (anger, disgust, fear, happiness, sadness, and surprise). The overall correlation between accuracy for the photographs and drawings was .677. A significant difference was found for the stimulus condition (photographs vs. drawings) but not for the administration condition (label-based vs. context-based). Children were significantly more accurate in interpreting drawings than photographs and tended to be more accurate in identifying facial expressions in the label-based administration condition for both photographs and drawings than in the context-based administration condition.

  12. Reduced white matter integrity and facial emotion perception in never-medicated patients with first-episode schizophrenia: A diffusion tensor imaging study.

    Science.gov (United States)

    Zhao, Xiaoxin; Sui, Yuxiu; Yao, Jingjing; Lv, Yiding; Zhang, Xinyue; Jin, Zhuma; Chen, Lijun; Zhang, Xiangrong

    2017-07-03

    Facial emotion perception is impaired in schizophrenia. Although the pathology of schizophrenia is thought to involve abnormality in white matter (WM), few studies have examined the correlation between facial emotion perception and WM abnormalities in never-medicated patients with first-episode schizophrenia. The present study tested associations between facial emotion perception and WM integrity in order to investigate the neural basis of impaired facial emotion perception in schizophrenia. Sixty-three schizophrenic patients and thirty control subjects underwent facial emotion categorization (FEC). The FEC data were fitted with a logistic function model, with the shift point and slope as outcome measures, and subsequently analysed by independent-samples t tests. Severity of symptoms was measured using a five-factor model of the Positive and Negative Syndrome Scale (PANSS). Voxelwise group comparison of WM fractional anisotropy (FA) was performed using tract-based spatial statistics (TBSS). The correlation between impaired facial emotion perception and FA reduction was examined in patients using simple regression analysis within brain areas that showed a significant FA reduction in patients compared with controls. The same correlation analysis was also performed for control subjects in the whole brain. The patients with schizophrenia showed a higher shift point and a steeper slope than control subjects in FEC. The patients showed a significant FA reduction in left deep WM in the parietal, temporal and occipital lobes, a small portion of the corpus callosum (CC), and the corona radiata. In voxelwise correlation analysis, we found that facial emotion perception significantly correlated with reduced FA in various WM regions, including the left forceps major (FM), inferior longitudinal fasciculus (ILF), inferior fronto-occipital fasciculus (IFOF), left splenium of the CC, and left ILF. The correlation analyses in healthy controls revealed no significant correlation of FA with
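
    The logistic model mentioned above, which recovers a shift point (inflection) and slope from categorization proportions, can be fitted with scipy's curve_fit as sketched below; the morph levels, response proportions, and "happy" labeling are illustrative, not the study's data.

```python
# Sketch: fitting a logistic function to facial emotion categorization
# (FEC) data to recover the shift point and slope. Data are invented.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, shift, slope):
    """P(respond 'happy') as a function of morph level x."""
    return 1.0 / (1.0 + np.exp(-slope * (x - shift)))

morph = np.linspace(0, 100, 11)           # % happy in the morph
p_happy = np.array([0.02, 0.03, 0.05, 0.10, 0.25, 0.50,
                    0.75, 0.90, 0.95, 0.97, 0.99])

(shift, slope), _ = curve_fit(logistic, morph, p_happy, p0=[50.0, 0.1])
print(f"shift point = {shift:.1f}%, slope = {slope:.3f}")
```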

  13. Identifying Facial Emotions: Valence Specific Effects and an Exploration of the Effects of Viewer Gender

    Science.gov (United States)

    Jansari, Ashok; Rodway, Paul; Goncalves, Salvador

    2011-01-01

    The valence hypothesis suggests that the right hemisphere is specialised for negative emotions and the left hemisphere is specialised for positive emotions (Silberman & Weingartner, 1986). It is unclear to what extent valence-specific effects in facial emotion perception depend upon the gender of the perceiver. To explore this question 46…

  14. Fixation to features and neural processing of facial expressions in a gender discrimination task.

    Science.gov (United States)

    Neath, Karly N; Itier, Roxane J

    2015-10-01

    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. Whether this sensitivity varies with facial expressions of emotion and can also be seen on other ERP components, such as the P1 and EPN, was investigated. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1 that likely reflected general sensitivity to face position. An early effect of emotion (∼120 ms) for happy faces was seen at occipital sites and was sustained until ∼350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms followed by a later effect appearing at ∼150 ms until ∼300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye-sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Perception of Emotional Facial Expressions in Amyotrophic Lateral Sclerosis (ALS) at Behavioural and Brain Metabolic Level.

    Science.gov (United States)

    Aho-Özhan, Helena E A; Keller, Jürgen; Heimrath, Johanna; Uttner, Ingo; Kassubek, Jan; Birbaumer, Niels; Ludolph, Albert C; Lulé, Dorothée

    2016-01-01

    Amyotrophic lateral sclerosis (ALS) primarily impairs motor abilities but also affects cognition and emotional processing. We hypothesised that subjective ratings of emotional stimuli depicting social interactions and facial expressions are changed in ALS. It was found that recognition of negative emotions and the ability to mentalize others' intentions are reduced. Here, processing of emotions in faces was investigated. A behavioural test of Ekman faces expressing six basic emotions was presented to 30 ALS patients and 29 age-, gender- and education-matched healthy controls. Additionally, a subgroup of 15 ALS patients who were able to lie supine in the scanner and 14 matched healthy controls viewed the Ekman faces during functional magnetic resonance imaging (fMRI). Affective state and the number of daily social contacts were measured. ALS patients recognized disgust and fear less accurately than healthy controls. In fMRI, reduced brain activity was seen in areas involved in processing of negative emotions, replicating our previous results. During processing of sad faces, increased brain activity was seen in areas associated with social emotions in the right inferior frontal gyrus and reduced activity in the hippocampus bilaterally. No differences in brain activity were seen for any of the other emotional expressions. Inferior frontal gyrus activity for sad faces was associated with an increased amount of social contacts of ALS patients. ALS patients showed decreased brain and behavioural responses in processing of disgust and fear and an altered brain response pattern for sadness. The negative consequences of neurodegenerative processes in the course of ALS might be counteracted by positive emotional activity and positive social interactions.

  16. Evaluating Posed and Evoked Facial Expressions of Emotion from Adults with Autism Spectrum Disorder

    Science.gov (United States)

    Faso, Daniel J.; Sasson, Noah J.; Pinkham, Amy E.

    2015-01-01

    Though many studies have examined facial affect perception by individuals with autism spectrum disorder (ASD), little research has investigated how facial expressivity in ASD is perceived by others. Here, naïve female observers (n = 38) judged the intensity, naturalness and emotional category of expressions produced by adults with ASD (n = 6) and…

  17. Serotonin transporter gene-linked polymorphism affects detection of facial expressions.

    Directory of Open Access Journals (Sweden)

    Ai Koizumi

    Full Text Available Previous studies have demonstrated that the serotonin transporter gene-linked polymorphic region (5-HTTLPR) affects the recognition of facial expressions and attention to them. However, the relationship between 5-HTTLPR and the perceptual detection of others' facial expressions, the process which takes place prior to emotional labeling (i.e., recognition), is not clear. To examine whether the perceptual detection of emotional facial expressions is influenced by the allelic variation (short/long) of 5-HTTLPR, happy and sad facial expressions were presented at weak and mid intensities (25% and 50%). Ninety-eight participants, genotyped for 5-HTTLPR, judged whether emotion in images of faces was present. Participants with short alleles showed higher sensitivity (d') to happy than to sad expressions, while participants with long allele(s) showed no such positivity advantage. This effect of 5-HTTLPR was found at different facial expression intensities among males and females. The results suggest that at the perceptual stage, a short allele enhances the processing of positive facial expressions rather than that of negative facial expressions.
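
    The sensitivity index d' reported here comes from signal detection theory: the z-transformed hit rate minus the z-transformed false-alarm rate. A minimal computation follows; the log-linear correction for extreme rates and the trial counts are assumptions for illustration.

```python
# Sketch: computing sensitivity d' for detecting an emotional
# expression, with a log-linear correction to avoid infinite z-scores
# at hit or false-alarm rates of 0 or 1. Counts are illustrative.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    h = (hits + 0.5) / (hits + misses + 1)
    fa = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(h) - norm.ppf(fa)

# happy trials vs. sad trials for one hypothetical participant
print(d_prime(hits=45, misses=5, false_alarms=10, correct_rejections=40))
print(d_prime(hits=35, misses=15, false_alarms=10, correct_rejections=40))
```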

  18. Neuroanatomical correlates of impaired decision-making and facial emotion recognition in early Parkinson's disease.

    Science.gov (United States)

    Ibarretxe-Bilbao, Naroa; Junque, Carme; Tolosa, Eduardo; Marti, Maria-Jose; Valldeoriola, Francesc; Bargallo, Nuria; Zarei, Mojtaba

    2009-09-01

    Decision-making and recognition of emotions are often impaired in patients with Parkinson's disease (PD). The orbitofrontal cortex (OFC) and the amygdala are critical structures subserving these functions. This study was designed to test whether there are any structural changes in these areas that might explain the impairment of decision-making and recognition of facial emotions in early PD. We used the Iowa Gambling Task (IGT) and the Ekman 60 faces test, which are sensitive to the integrity of the OFC and to amygdala dysfunction, in 24 early PD patients and 24 controls. High-resolution structural magnetic resonance images (MRI) were also obtained. Group analysis using voxel-based morphometry (VBM) showed significant, corrected (P < 0.05) grey matter volume reductions in these regions in patients. The results suggest that (i) impairment of decision-making and recognition of facial emotions occurs at the early stages of PD, (ii) these neuropsychological deficits are accompanied by degeneration of OFC and amygdala, and (iii) bilateral OFC reductions are associated with impaired recognition of emotions, and GM volume loss in left lateral OFC is related to decision-making impairment in PD.

  19. USE OF FACIAL EMOTION RECOGNITION IN E-LEARNING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Uğur Ayvaz

    2017-09-01

    Full Text Available Since personal computer usage and internet bandwidth are increasing, e-learning systems are also spreading widely. Although e-learning has some advantages in terms of information accessibility and time and place flexibility compared to formal learning, it does not provide enough face-to-face interactivity between an educator and learners. In this study, we propose a hybrid information system combining computer vision and machine learning technologies for visual and interactive e-learning systems. The proposed information system detects the emotional states of the learners and gives feedback to an educator about their instant and weighted emotional states based on facial expressions. In this way, the educator will be aware of the general emotional state of the virtual classroom and the system will create an interactive environment resembling formal learning. Herein, several classification algorithms were applied to learn instant emotional state and the best accuracy rates were obtained using kNN and SVM algorithms.
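
    A sketch of the kNN-versus-SVM comparison described in this record, assuming feature vectors have already been extracted from learners' face images; synthetic features stand in for the real ones so that the comparison code runs.

```python
# Sketch: comparing kNN and SVM classifiers on (stand-in) facial
# expression feature vectors via cross-validation.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.random((120, 20))          # 120 face samples, 20 features
y = rng.integers(0, 6, size=120)   # 6 emotion classes

for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf", C=1.0))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```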

  20. How to make a robot smile? Perception of emotional expressions from digitally-extracted facial landmark configurations

    NARCIS (Netherlands)

    Liu, C.; Ham, J.R.C.; Postma, E.O.; Midden, C.J.H.; Joosten, B.; Goudbeek, M.; Ge, S.S.; Khatib, O.

    2012-01-01

    Abstract. To design robots or embodied conversational agents that can accurately display facial expressions indicating an emotional state, we need technology to produce those facial expressions, and research that investigates the relationship between those technologies and human social perception of

  1. Image-based Analysis of Emotional Facial Expressions in Full Face Transplants.

    Science.gov (United States)

    Bedeloglu, Merve; Topcu, Çagdas; Akgul, Arzu; Döger, Ela Naz; Sever, Refik; Ozkan, Ozlenen; Ozkan, Omer; Uysal, Hilmi; Polat, Ovunc; Çolak, Omer Halil

    2018-01-20

    In this study, the aim was to determine, from photographs, the degree of development of emotional expression in full face transplant patients, so that a rehabilitation process can later be planned according to the degrees determined. As envisaged, in full face transplant cases the determination of expressions can be confused or cannot be achieved to the level of the healthy control group. In order to perform image-based analysis, a control group consisting of 9 healthy males and 2 full-face transplant patients participated in the study. The appearance-based Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP) methods were adopted for recognizing the neutral expression and 6 emotional expressions: angry, scared, happy, hate, confused and sad. Feature extraction was carried out using both methods and their serial combination. In the performed expressions, the extracted features of the most distinctive zones of the facial area, the eye and mouth regions, were used to classify the emotions. The combination of these region features was also used to improve classifier performance. Control subjects' and transplant patients' ability to perform emotional expressions was determined with a K-nearest neighbor (KNN) classifier with region-specific and method-specific decision stages. The results were compared with the healthy group. It was observed that transplant patients do not reflect some emotional expressions. Also, there were confusions among expressions.
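
    The LBP branch of such a feature pipeline might look like the following sketch, which builds a uniform-LBP histogram over a facial region crop with scikit-image; the crop size and the (P, R) parameters are illustrative choices, not the study's settings.

```python
# Sketch: uniform-LBP histogram of a facial region (e.g., mouth crop)
# as a feature vector, using scikit-image's local_binary_pattern.
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray_crop, P=8, R=1.0):
    """Normalized uniform-LBP histogram of a grayscale region."""
    codes = local_binary_pattern(gray_crop, P, R, method="uniform")
    n_bins = P + 2                       # P+1 uniform patterns + "other"
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins),
                           density=True)
    return hist

# toy usage: random uint8 crop in place of a real eye/mouth region
rng = np.random.default_rng(2)
crop = (rng.random((32, 48)) * 255).astype(np.uint8)
print(lbp_histogram(crop).round(3))
```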

  2. Slowing down Presentation of Facial Movements and Vocal Sounds Enhances Facial Expression Recognition and Induces Facial-Vocal Imitation in Children with Autism

    Science.gov (United States)

    Tardif, Carole; Laine, France; Rodriguez, Melissa; Gepner, Bruno

    2007-01-01

    This study examined the effects of slowing down presentation of facial expressions and their corresponding vocal sounds on facial expression recognition and facial and/or vocal imitation in children with autism. Twelve autistic children and twenty-four normal control children were presented with emotional and non-emotional facial expressions on…

  3. Influence of spatial frequency and emotion expression on face processing in patients with panic disorder.

    Science.gov (United States)

    Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan

    2016-06-01

    Deficits in facial emotion processing are a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing with stimuli presented at broad, high, and low spatial frequencies (BSF, HSF, LSF) in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at the three spatial frequencies. When a fearful face was presented, patients with panic disorder showed a significantly increased P100 amplitude in response to LSF compared to HSF stimuli, whereas healthy controls demonstrated significant BSF-dependent processing in P100 amplitude. VPP amplitude in panic disorder was significantly increased for HSF and BSF compared to LSF stimuli. EPN amplitude differed significantly between HSF and BSF and between LSF and BSF processing in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information; in later processing, however, they overuse detailed information for the perception of facial expressions. These findings suggest that this unique spatial frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder. Copyright © 2016 Elsevier B.V. All rights reserved.
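
    For illustration, a common way to construct broad/low/high spatial-frequency (BSF/LSF/HSF) face stimuli like those used here: Gaussian low-pass filtering and its complement. The cutoff (sigma) is an assumed value, not taken from the paper.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def spatial_frequency_versions(gray_face, sigma=4.0):
        bsf = gray_face.astype(float)        # broad SF: the unfiltered image
        lsf = gaussian_filter(bsf, sigma)    # low SF: Gaussian low-pass
        hsf = bsf - lsf + bsf.mean()         # high SF: residual, luminance re-centered
        return bsf, lsf, hsf

    # demo on a random "face"
    bsf, lsf, hsf = spatial_frequency_versions(np.random.rand(128, 128))
    ```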

  4. Associations between facial emotion recognition and young adolescents' behaviors in bullying.

    Directory of Open Access Journals (Sweden)

    Tiziana Pozzoli

    Full Text Available This study investigated whether the different behaviors young adolescents can enact during bullying episodes were associated with their ability to recognize morphed facial expressions of the six basic emotions, expressed at high and low intensity. The sample included 117 middle-school students (45.3% girls; mean age = 12.4 years) who filled in a peer nomination questionnaire and individually performed a computerized emotion recognition task. Bayesian generalized mixed-effects models showed a complex picture, in which type and intensity of emotions, students' behavior, and gender interacted in explaining recognition accuracy. Results are discussed with a particular focus on negative emotions, suggesting a "neutral" nature of emotion recognition ability, which does not necessarily lead to moral behavior but can also be used to pursue immoral goals.
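
    As a sketch of the analysis named above, a Bayesian generalized (logistic) mixed-effects model of trial-level recognition accuracy could look like the following; the bambi library, the synthetic data, and the exact model formula are assumptions, not the authors' code.

    ```python
    import numpy as np
    import pandas as pd
    import bambi as bmb

    rng = np.random.default_rng(1)
    n = 400  # synthetic data: one row per recognition trial
    trials = pd.DataFrame({
        "correct":   rng.integers(0, 2, n),
        "emotion":   rng.choice(["anger", "fear", "happiness"], n),
        "intensity": rng.choice(["low", "high"], n),
        "behavior":  rng.choice(["bully", "defender", "outsider"], n),
        "gender":    rng.choice(["girl", "boy"], n),
        "subject":   rng.integers(0, 40, n).astype(str),
    })

    # Accuracy modeled with interacting predictors and a by-subject intercept.
    model = bmb.Model("correct ~ emotion * intensity * behavior * gender + (1|subject)",
                      data=trials, family="bernoulli")
    results = model.fit(draws=500, chains=2)
    ```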

  5. Associations between facial emotion recognition and young adolescents’ behaviors in bullying

    Science.gov (United States)

    Gini, Gianluca; Altoè, Gianmarco

    2017-01-01

    This study investigated whether the different behaviors young adolescents can enact during bullying episodes were associated with their ability to recognize morphed facial expressions of the six basic emotions, expressed at high and low intensity. The sample included 117 middle-school students (45.3% girls; mean age = 12.4 years) who filled in a peer nomination questionnaire and individually performed a computerized emotion recognition task. Bayesian generalized mixed-effects models showed a complex picture, in which type and intensity of emotions, students’ behavior, and gender interacted in explaining recognition accuracy. Results are discussed with a particular focus on negative emotions, suggesting a “neutral” nature of emotion recognition ability, which does not necessarily lead to moral behavior but can also be used to pursue immoral goals. PMID:29131871

  6. Lateralisation effect in comprehension of emotional facial expression: a comparison between EEG alpha band power and behavioural inhibition (BIS) and activation (BAS) systems.

    Science.gov (United States)

    Balconi, Michela; Mazza, Guido

    2010-05-01

    Asymmetry in the comprehension of facial expressions of emotion was explored in the present study by analysing alpha band variation within the right and left cortical sides. Second, the behavioural activation system (BAS) and behavioural inhibition system (BIS) were considered as explicative factors to verify the effect of a motivational/emotional variable on alpha activity. A total of 19 participants looked at a wide range of facial expressions of emotion (anger, fear, surprise, disgust, happiness, sadness, and neutral) in random order. The results demonstrated that anterior frontal sites were more active than central and parietal sites in response to facial stimuli. Moreover, right- and left-side responses varied as a function of emotion type, with increased right frontal activity for negative, aversive emotions and increased left frontal activity for positive emotion. Finally, whereas higher-BIS participants generated more right-hemisphere activation for some negative emotions (such as fear, anger, surprise, and disgust), BAS participants were more responsive to positive emotion (happiness) within the left hemisphere. The motivational significance of facial expressions was considered to elucidate cortical differences in participants' responses to emotion types.
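
    A small sketch of the dependent measure in studies like this: alpha-band (8-13 Hz) power estimated with Welch's method, plus a conventional frontal asymmetry index. The sampling rate, placeholder channels, and log-difference index are assumed conventions, not details from the paper.

    ```python
    import numpy as np
    from scipy.signal import welch
    from scipy.integrate import trapezoid

    FS = 250.0  # assumed sampling rate (Hz)

    def alpha_power(eeg, fs=FS, band=(8.0, 13.0)):
        freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return trapezoid(psd[mask], freqs[mask])  # integrated alpha power

    # Frontal asymmetry: ln(right) - ln(left). Alpha is inversely related to
    # activation, so larger values indicate relatively greater left activity.
    f3, f4 = np.random.randn(2, int(30 * FS))     # placeholder left/right channels
    asym = np.log(alpha_power(f4)) - np.log(alpha_power(f3))
    print(asym)
    ```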

  7. The Relative Power of an Emotion's Facial Expression, Label, and Behavioral Consequence to Evoke Preschoolers' Knowledge of Its Cause

    Science.gov (United States)

    Widen, Sherri C.; Russell, James A.

    2004-01-01

    Lay people and scientists alike assume that, especially for young children, facial expressions are a strong cue to another's emotion. We report a study in which children (N=120; 3-4 years) described events that would cause basic emotions (surprise, fear, anger, disgust, sadness) presented as its facial expression, as its label, or as its…

  8. Effects of Early Neglect Experience on Recognition and Processing of Facial Expressions: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Victoria Doretto

    2018-01-01

    Full Text Available Background: Child neglect is highly prevalent and associated with a series of biological and social consequences. Early neglect may alter the recognition of emotional faces, but its precise impact remains unclear. We aim to review and analyze data from the recent literature on the recognition and processing of facial expressions in individuals with a history of childhood neglect. Methods: We conducted a systematic review using the PubMed, PsycINFO, SciELO and EMBASE databases, searching for studies from the past 10 years. Results: In total, 14 studies were selected and critically reviewed. Heterogeneity was detected across methods and sample frames, and results were mixed across studies. Different forms of alteration to the perception of facial expressions were found in 12 studies. Alterations affected the recognition and processing of both positive and negative emotions, but in emotional face processing alterations toward negative emotions predominated. Conclusions: This is the first review to examine specifically the effects of early neglect experience as a prevalent condition of child maltreatment. The results of this review are inconclusive due to methodological diversity, the use of distinct instruments, and differences in the composition of the samples. Despite these limitations, some studies support our hypothesis that individuals with a history of early neglect may present alterations in the ability to perceive facial expressions of emotions. The article brings relevant information that can help in the development of more effective therapeutic strategies to reduce the impact of neglect on the cognitive and emotional development of the child.

  9. Facial Emotion Recognition in Children with High Functioning Autism and Children with Social Phobia

    Science.gov (United States)

    Wong, Nina; Beidel, Deborah C.; Sarver, Dustin E.; Sims, Valerie

    2012-01-01

    Recognizing facial affect is essential for effective social functioning. This study examines emotion recognition abilities in children aged 7-13 years with High Functioning Autism (HFA = 19), Social Phobia (SP = 17), or typical development (TD = 21). Findings indicate that all children identified certain emotions more quickly (e.g., happy [less…

  10. Facial Expression at Retrieval Affects Recognition of Facial Identity

    Directory of Open Access Journals (Sweden)

    Wenfeng eChen

    2015-06-01

    Full Text Available It is well known that memory can be modulated by emotional stimuli at the time of encoding and consolidation. For example, happy faces create better identity recognition than faces with certain other expressions. However, the influence of facial expression at the time of retrieval remains unknown in the literature. To separate the potential influence of expression at retrieval from its effects at earlier stages, we had participants learn neutral faces but manipulated facial expression at the time of memory retrieval in a standard old/new recognition task. The results showed a clear effect of facial expression: happy test faces were identified more successfully than angry test faces. This effect is unlikely to be due to greater image similarity between the neutral learning face and the happy test face, because image analysis showed that the happy test faces are in fact less similar to the neutral learning faces than the angry test faces are. In the second experiment, we investigated whether this emotional effect is influenced by the expression at the time of learning. We employed angry or happy faces as learning stimuli, and angry, happy, and neutral faces as test stimuli. The results showed that the emotional effect at retrieval is robust across different encoding conditions with happy or angry expressions. These findings indicate that emotional expressions affect the retrieval process in identity recognition, and that identity recognition does not rely on an emotional association between learning and test faces.

  11. Intact mirror mechanisms for automatic facial emotions in children and adolescents with autism spectrum disorder.

    Science.gov (United States)

    Schulte-Rüther, Martin; Otte, Ellen; Adigüzel, Kübra; Firk, Christine; Herpertz-Dahlmann, Beate; Koch, Iring; Konrad, Kerstin

    2017-02-01

    It has been suggested that an early deficit in the human mirror neuron system (MNS) is an important feature of autism. Recent findings related to simple hand and finger movements do not support a general dysfunction of the MNS in autism. Studies investigating facial actions (e.g., emotional expressions) have been more consistent but have mostly relied on passive observation tasks. We used a new variant of a compatibility task for the assessment of automatic facial mimicry responses that allowed for simultaneous control of attention to facial stimuli. We used facial electromyography in 18 children and adolescents with Autism spectrum disorder (ASD) and 18 typically developing controls (TDCs). We observed a robust compatibility effect in ASD, that is, the execution of a facial expression was facilitated if a congruent facial expression was observed. Time course analysis of RT distributions and comparison to a classic compatibility task (symbolic Simon task) revealed that the facial compatibility effect appeared early and increased with time, suggesting fast and sustained activation of motor codes during observation of facial expressions. We observed a negative correlation of the compatibility effect with age across participants and in ASD, and a positive correlation between self-rated empathy and congruency for smiling faces in TDCs but not in ASD. This pattern of results suggests that basic motor mimicry is intact in ASD, but is not associated with complex social cognitive abilities such as emotion understanding and empathy. Autism Res 2017, 10: 298-310. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  12. Caricaturing facial expressions.

    Science.gov (United States)

    Calder, A J; Rowland, D; Young, A W; Nimmo-Smith, I; Keane, J; Perrett, D I

    2000-08-14

    The physical differences between facial expressions (e.g. fear) and a reference norm (e.g. a neutral expression) were altered to produce photographic-quality caricatures. In Experiment 1, participants rated caricatures of fear, happiness and sadness for the intensity of these three emotions; a second group of participants rated how 'face-like' the caricatures appeared. With increasing levels of exaggeration the caricatures were rated as more emotionally intense, but less 'face-like'. Experiment 2 demonstrated a similar relationship between emotional intensity and level of caricature for six different facial expressions. Experiments 3 and 4 compared intensity ratings of facial expression caricatures prepared relative to a selection of reference norms - a neutral expression, an average expression, or a different facial expression (e.g. anger caricatured relative to fear). Each norm produced a linear relationship between caricature and rated intensity of emotion; this finding is inconsistent with two-dimensional models of the perceptual representation of facial expression. An exemplar-based multidimensional model is proposed as an alternative account.
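
    The geometric idea in this abstract reduces to moving facial landmarks along the difference vector between an expression and its reference norm; a minimal sketch, with hypothetical landmark arrays:

    ```python
    import numpy as np

    def caricature(expr_pts: np.ndarray, norm_pts: np.ndarray, k: float) -> np.ndarray:
        """expr_pts/norm_pts: (n_points, 2) corresponding face landmarks.
        k = 1 reproduces the expression, k > 1 exaggerates its differences
        from the norm, and 0 < k < 1 yields an anti-caricature."""
        return norm_pts + k * (expr_pts - norm_pts)

    # e.g. a +50% caricature of fear relative to a neutral norm:
    # fear_150 = caricature(fear_pts, neutral_pts, k=1.5)
    ```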

  13. Botulinum toxin-induced facial muscle paralysis affects amygdala responses to the perception of emotional expressions: preliminary findings from an A-B-A design

    OpenAIRE

    Kim, M Justin; Neta, Maital; Davis, F Caroline; Ruberry, Erika J; Dinescu, Diana; Heatherton, Todd F; Stotland, Mitchell A; Whalen, Paul J

    2014-01-01

    Background It has long been suggested that feedback signals from facial muscles influence emotional experience. The recent surge in use of botulinum toxin (BTX) to induce temporary muscle paralysis offers a unique opportunity to directly test this "facial feedback hypothesis." Previous research shows that the lack of facial muscle feedback due to BTX-induced paralysis influences subjective reports of emotional experience, as well as brain activity associated with the imitation of emotional fa...

  14. Discovering cultural differences (and similarities) in facial expressions of emotion.

    Science.gov (United States)

    Chen, Chaona; Jack, Rachael E

    2017-10-01

    Understanding the cultural commonalities and specificities of facial expressions of emotion remains a central goal of Psychology. However, recent progress has been stayed by dichotomous debates (e.g. nature versus nurture) that have created silos of empirical and theoretical knowledge. Now, an emerging interdisciplinary scientific culture is broadening the focus of research to provide a more unified and refined account of facial expressions within and across cultures. Specifically, data-driven approaches allow a wider, more objective exploration of face movement patterns that provide detailed information ontologies of their cultural commonalities and specificities. Similarly, a wider exploration of the social messages perceived from face movements diversifies knowledge of their functional roles (e.g. the 'fear' face used as a threat display). Together, these new approaches promise to diversify, deepen, and refine knowledge of facial expressions, and deliver the next major milestones for a functional theory of human social communication that is transferable to social robotics. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Can Gaze Avoidance Explain Why Individuals with Asperger's Syndrome Can't Recognise Emotions from Facial Expressions?

    Science.gov (United States)

    Sawyer, Alyssa C. P.; Williamson, Paul; Young, Robyn L.

    2012-01-01

    Research has shown that individuals with Autism Spectrum Disorders (ASD) have difficulties recognising emotions from facial expressions. Since eye contact is important for accurate emotion recognition, and individuals with ASD tend to avoid eye contact, this tendency for gaze aversion has been proposed as an explanation for the emotion recognition…

  16. Emotional voices in context: a neurobiological model of multimodal affective information processing.

    Science.gov (United States)

    Brück, Carolin; Kreifelts, Benjamin; Wildgruber, Dirk

    2011-12-01

    Just as eyes are often considered a gateway to the soul, the human voice offers a window through which we gain access to our fellow human beings' minds - their attitudes, intentions and feelings. Whether in talking or singing, crying or laughing, sighing or screaming, the sheer sound of a voice communicates a wealth of information that, in turn, may serve the observant listener as a valuable guidepost in social interaction. But how do human beings extract information from the tone of a voice? In an attempt to answer this question, the present article reviews empirical evidence detailing the cerebral processes that underlie our ability to decode emotional information from vocal signals. The review will focus primarily on two prominent classes of vocal emotion cues: laughter and speech prosody (i.e. the tone of voice while speaking). Following a brief introduction, behavioral as well as neuroimaging data will be summarized that allow us to outline the cerebral mechanisms associated with the decoding of emotional voice cues, as well as the influence of various context variables (e.g. co-occurring facial and verbal emotional signals, attention focus, person-specific parameters such as gender and personality) on the respective processes. Building on the presented evidence, a cerebral network model will be introduced that proposes a differential contribution of various cortical and subcortical brain structures to the processing of emotional voice signals both in isolation and in the context of accompanying (facial and verbal) emotional cues. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Emotional voices in context: A neurobiological model of multimodal affective information processing

    Science.gov (United States)

    Brück, Carolin; Kreifelts, Benjamin; Wildgruber, Dirk

    2011-12-01

    Just as eyes are often considered a gateway to the soul, the human voice offers a window through which we gain access to our fellow human beings' minds - their attitudes, intentions and feelings. Whether in talking or singing, crying or laughing, sighing or screaming, the sheer sound of a voice communicates a wealth of information that, in turn, may serve the observant listener as a valuable guidepost in social interaction. But how do human beings extract information from the tone of a voice? In an attempt to answer this question, the present article reviews empirical evidence detailing the cerebral processes that underlie our ability to decode emotional information from vocal signals. The review will focus primarily on two prominent classes of vocal emotion cues: laughter and speech prosody (i.e. the tone of voice while speaking). Following a brief introduction, behavioral as well as neuroimaging data will be summarized that allow us to outline the cerebral mechanisms associated with the decoding of emotional voice cues, as well as the influence of various context variables (e.g. co-occurring facial and verbal emotional signals, attention focus, person-specific parameters such as gender and personality) on the respective processes. Building on the presented evidence, a cerebral network model will be introduced that proposes a differential contribution of various cortical and subcortical brain structures to the processing of emotional voice signals both in isolation and in the context of accompanying (facial and verbal) emotional cues.

  18. How facial expressions in a Rett syndrome population are recognised and interpreted by those around them as conveying emotions

    DEFF Research Database (Denmark)

    Bergström-Isacsson, Märith; Lagerkvist, Bengt; Holck, Ulla

    2013-01-01

    The aim was to investigate whether the Facial Action Coding System (FACS) could be used to identify facial expressions in Rett syndrome (RTT), and to differentiate between those that expressed emotions and those that were elicited by abnormal brainstem activation. The sample comprised 29 participants with RTT and 11 children with a normal developmental pattern, exposed to six different musical stimuli during non-invasive registration of autonomic brainstem functions. The results indicate that FACS makes it possible both to identify facial expressions and to differentiate between those that stem from emotions and those caused by abnormal brainstem activation.

  19. The Effect of Gender and Age Differences on the Recognition of Emotions from Facial Expressions

    DEFF Research Database (Denmark)

    Schneevogt, Daniela; Paggio, Patrizia

    2016-01-01

    Recent studies have demonstrated gender and cultural differences in the recognition of emotions in facial expressions. However, most studies were conducted on American subjects. In this paper, we explore the generalizability of several findings to a non-American culture in the form of Danish subjects. We conduct an emotion recognition task followed by two stereotype questionnaires with different genders and age groups. While recent findings (Krems et al., 2015) suggest that women are biased to see anger in neutral facial expressions posed by females, in our sample both genders assign higher ratings of anger to all emotions expressed by females. Furthermore, we demonstrate an effect of gender on the fear-surprise confusion observed by Tomkins and McCarter (1964): females overpredict fear, while males overpredict surprise.

  20. Writ Large on Your Face: Observing Emotions Using Automatic Facial Analysis

    Directory of Open Access Journals (Sweden)

    Dieckmann Anja

    2014-05-01

    Full Text Available Emotions affect all of our daily decisions and, of course, they also influence our evaluations of brands, products and advertisements. But what exactly do consumers feel when they watch a TV commercial, visit a website or when they interact with a brand in different ways? Measuring such emotions is not an easy task. In the past, the effectiveness of marketing material was evaluated mostly by subsequent surveys. Now, with the emergence of neuroscientific approaches like EEG, the measurement of real-time reactions is possible, for instance, when watching a commercial. However, most neuroscientific procedures are fairly invasive and irritating. For an EEG, for instance, numerous electrodes need to be placed on the participant's scalp. Furthermore, data analysis is highly complex. Scientific expertise is necessary for interpretation, so the procedure remains a black box to most practitioners and the results are still rather controversial. By contrast, automatic facial analysis provides similar information without having to wire study participants. In addition, the results of such analyses are intuitive and easy to interpret even for laypeople. These convincing advantages led GfK Company to decide on facial analysis and to develop a tool suitable for measuring emotional responses to marketing stimuli, making it easily applicable in marketing research practice.

  1. Decoding of Emotion through Facial Expression, Prosody and Verbal Content in Children and Adolescents with Asperger's Syndrome

    Science.gov (United States)

    Lindner, Jennifer L.; Rosen, Lee A.

    2006-01-01

    This study examined differences in the ability to decode emotion through facial expression, prosody, and verbal content between 14 children with Asperger's Syndrome (AS) and 16 typically developing peers. The ability to decode emotion was measured by the Perception of Emotion Test (POET), which portrayed the emotions of happy, angry, sad, and…

  2. Speed and accuracy of facial expression classification in avoidant personality disorder: a preliminary study.

    Science.gov (United States)

    Rosenthal, M Zachary; Kim, Kwanguk; Herr, Nathaniel R; Smoski, Moria J; Cheavens, Jennifer S; Lynch, Thomas R; Kosson, David S

    2011-10-01

    The aim of this preliminary study was to examine whether individuals with avoidant personality disorder (APD) could be characterized by deficits in the classification of dynamically presented facial emotional expressions. Using a community sample of adults with APD (n = 17) and non-APD controls (n = 16), the speed and accuracy of facial emotional expression recognition were investigated in a task that morphs facial expressions from neutral to prototypical expressions (Multi-Morph Facial Affect Recognition Task; Blair, Colledge, Murray, & Mitchell, 2001). Results indicated that individuals with APD were significantly more likely than controls to make errors when classifying fully expressed fear. However, no differences were found between groups in the speed to correctly classify facial emotional expressions. This is among the first studies to investigate facial emotion processing in a sample of individuals with APD, and the findings point to an underlying deficit in processing social cues that may be involved in the maintenance of APD.
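
    For illustration, the morphing manipulation used in tasks like the Multi-Morph can be approximated by linear blending from a neutral face to the prototype expression; real morphing also warps shape, so plain cross-fading is a simplification.

    ```python
    import numpy as np

    def morph_frames(neutral, prototype, n_steps=10):
        """neutral/prototype: aligned float images of identical shape.
        Returns frames running from fully neutral to the full expression."""
        return [(1 - a) * neutral + a * prototype
                for a in np.linspace(0.0, 1.0, n_steps)]

    # demo: ten frames between two random "faces"
    frames = morph_frames(np.random.rand(128, 128), np.random.rand(128, 128))
    ```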

  3. Difficulty identifying feelings and automatic activation in the fusiform gyrus in response to facial emotion.

    Science.gov (United States)

    Eichmann, Mischa; Kugel, Harald; Suslow, Thomas

    2008-12-01

    Difficulties in identifying and differentiating one's emotions are a central characteristic of alexithymia. In the present study, automatic activation of the fusiform gyrus to facial emotion was investigated as a function of alexithymia as assessed by the 20-item Toronto Alexithymia Scale. During 3 Tesla fMRI scanning, pictures of faces bearing sad, happy, and neutral expressions masked by neutral faces were presented to 22 healthy adults who also responded to the Toronto Alexithymia Scale. The fusiform gyrus was selected as the region of interest, and voxel values of this region were extracted, summarized as means, and tested among the different conditions (sad, happy, and neutral faces). Masked sad facial emotions were associated with greater bilateral activation of the fusiform gyrus than masked neutral faces. The subscale, Difficulty Identifying Feelings, was negatively correlated with the neural response of the fusiform gyrus to masked sad faces. The correlation results suggest that automatic hyporesponsiveness of the fusiform gyrus to negative emotion stimuli may reflect problems in recognizing one's emotions in everyday life.

  4. Children's Scripts for Social Emotions: Causes and Consequences Are More Central than Are Facial Expressions

    Science.gov (United States)

    Widen, Sherri C.; Russell, James A.

    2010-01-01

    Understanding and recognition of emotions relies on emotion concepts, which are narrative structures (scripts) specifying facial expressions, causes, consequences, label, etc. organized in a temporal and causal order. Scripts and their development are revealed by examining which components better tap which concepts at which ages. This study…

  5. That "poker face" just might lose you the game! The impact of expressive suppression and mimicry on sensitivity to facial expressions of emotion.

    Science.gov (United States)

    Schneider, Kristin G; Hempel, Roelie J; Lynch, Thomas R

    2013-10-01

    Successful interpersonal functioning often requires both the ability to mask inner feelings and the ability to accurately recognize others' expressions--but what if effortful control of emotional expressions impacts the ability to accurately read others? In this study, we examined the influence of self-controlled expressive suppression and mimicry on facial affect sensitivity--the speed with which one can accurately identify gradually intensifying facial expressions of emotion. Muscle activity of the brow (corrugator, related to anger), upper lip (levator, related to disgust), and cheek (zygomaticus, related to happiness) were recorded using facial electromyography while participants randomized to one of three conditions (Suppress, Mimic, and No-Instruction) viewed a series of six distinct emotional expressions (happiness, sadness, fear, anger, surprise, and disgust) as they morphed from neutral to full expression. As hypothesized, individuals instructed to suppress their own facial expressions showed impairment in facial affect sensitivity. Conversely, mimicry of emotion expressions appeared to facilitate facial affect sensitivity. Results suggest that it is difficult for a person to be able to simultaneously mask inner feelings and accurately "read" the facial expressions of others, at least when these expressions are at low intensity. The combined behavioral and physiological data suggest that the strategies an individual selects to control his or her own expression of emotion have important implications for interpersonal functioning.

  6. Emotional Incongruence of Facial Expression and Voice Tone Investigated with Event-Related Brain Potentials of Infants

    Directory of Open Access Journals (Sweden)

    Kota Arai

    2011-10-01

    Full Text Available Human emotions are perceived from multi-modal information including facial expression and voice tone. We aimed to investigate the development of the neural mechanism for cross-modal perception of emotions. We presented congruent and incongruent combinations of facial expression (happy) and voice tone (happy or angry), and measured EEG to analyze event-related brain potentials in 8-10-month-old infants and adults. Ten repetitions of 10 trials were presented in random order to each participant. Half of the participants performed 20% congruent (happy face with happy voice) and 80% incongruent (happy face with angry voice) trials, and the others performed 80% congruent and 20% incongruent trials. We employed the oddball paradigm but did not instruct participants to count a target. The oddball (infrequent) stimulus increased the amplitude of P2 and delayed its latency in infants, in comparison with the frequent stimulus. When the oddball stimulus was also emotionally incongruent, P2 amplitude was increased further and its latency further delayed, relative to the oddball but emotionally congruent stimulus. However, we did not find differences in P2 amplitude or latency between conditions for adults. These results suggest that 8-10-month-old infants already have a neural basis for detecting emotional incongruence between facial expression and voice tone.
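
    A minimal sketch of the ERP measures compared here: average the epochs of a condition, then take P2 amplitude and latency as the largest positive deflection in a search window. The 150-300 ms window and the sampling rate are illustrative assumptions, not the authors' values.

    ```python
    import numpy as np

    FS = 500.0  # assumed sampling rate (Hz)

    def p2_measures(epochs, fs=FS, window=(0.150, 0.300)):
        """epochs: (n_trials, n_samples) baseline-corrected segments,
        time-locked so that sample 0 is stimulus onset."""
        erp = epochs.mean(axis=0)                  # condition-average waveform
        i0, i1 = int(window[0] * fs), int(window[1] * fs)
        peak = i0 + int(np.argmax(erp[i0:i1]))
        return erp[peak], peak / fs                # amplitude, latency (s)

    # demo with synthetic epochs (80 trials, 0.8 s each)
    amp, lat = p2_measures(np.random.randn(80, 400))
    ```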

  7. Different underlying mechanisms for face emotion and gender processing during feature-selective attention: Evidence from event-related potential studies.

    Science.gov (United States)

    Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei

    2017-05-01

    Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed over different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature-selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically, as indexed by the SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Understanding Emotions from Standardized Facial Expressions in Autism and Normal Development

    Science.gov (United States)

    Castelli, Fulvia

    2005-01-01

    The study investigated the recognition of standardized facial expressions of emotion (anger, fear, disgust, happiness, sadness, surprise) at a perceptual level (experiment 1) and at a semantic level (experiments 2 and 3) in children with autism (N= 20) and normally developing children (N= 20). Results revealed that children with autism were as…

  9. Recognition of Facial Expressions of Emotion in Adults with Down Syndrome

    Science.gov (United States)

    Virji-Babul, Naznin; Watt, Kimberley; Nathoo, Farouk; Johnson, Peter

    2012-01-01

    Research on facial expressions in individuals with Down syndrome (DS) has been conducted using photographs. Our goal was to examine the effect of motion on perception of emotional expressions. Adults with DS, adults with typical development matched for chronological age (CA), and children with typical development matched for developmental age (DA)…

  10. Sex differences in emotion recognition: Evidence for a small overall female superiority on facial disgust.

    Science.gov (United States)

    Connolly, Hannah L; Lefevre, Carmen E; Young, Andrew W; Lewis, Gary J

    2018-05-21

    Although it is widely believed that females outperform males in the ability to recognize other people's emotions, this conclusion is not well supported by the extant literature. The current study sought to provide a strong test of the female superiority hypothesis by investigating sex differences in emotion recognition for five basic emotions using stimuli well-calibrated for individual differences assessment, across two expressive domains (face and body), and in a large sample (N = 1,022: Study 1). We also assessed the stability and generalizability of our findings with two independent replication samples (N = 303: Study 2, N = 634: Study 3). In Study 1, we observed that females were superior to males in recognizing facial disgust and sadness. In contrast, males were superior to females in recognizing bodily happiness. The female superiority for recognition of facial disgust was replicated in Studies 2 and 3, and this observation also extended to an independent stimulus set in Study 2. No other sex differences were stable across studies. These findings provide evidence for the presence of sex differences in emotion recognition ability, but show that these differences are modest in magnitude and appear to be limited to facial disgust. We discuss whether this sex difference may reflect human evolutionary imperatives concerning reproductive fitness and child care. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  11. Perceptually Valid Facial Expressions for Character-Based Applications

    Directory of Open Access Journals (Sweden)

    Ali Arya

    2009-01-01

    Full Text Available This paper addresses the problem of creating facial expressions of mixed emotions in a perceptually valid way. The research has been done in the context of “game-like” health and education applications aimed at studying social competency and facial expression awareness in autistic children, as well as native language learning, but the results can be applied to many other applications, such as games that need dynamic facial expressions or tools for automating the creation of facial animations. Most existing methods for creating facial expressions of mixed emotions use operations like averaging to create the combined effect of two universal emotions. Such methods may be mathematically justifiable but are not necessarily valid from a perceptual point of view. The research reported here starts with user experiments aimed at understanding how people combine facial actions to express mixed emotions, and how viewers perceive a set of facial actions in terms of underlying emotions. Using the results of these experiments and a three-dimensional emotion model, we associate facial actions with dimensions and regions in the emotion space, and create a facial expression based on the location of the mixed emotion in the three-dimensional space. We call these regionalized facial actions “facial expression units.”
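
    One loose way to read the final step described above: associate each basic emotion's expression units with a location in a three-dimensional emotion space and blend them by proximity to the mixed emotion's location. The axes, anchor coordinates, and inverse-distance weighting below are illustrative assumptions, not the paper's model.

    ```python
    import numpy as np

    ANCHORS = {  # assumed (valence, arousal, dominance) coordinates
        "happiness": np.array([0.8, 0.5, 0.4]),
        "sadness":   np.array([-0.7, -0.4, -0.3]),
        "fear":      np.array([-0.6, 0.7, -0.6]),
    }

    def blend_weights(point, anchors=ANCHORS, eps=1e-6):
        """Inverse-distance weights over each emotion's expression units."""
        inv = {k: 1.0 / (np.linalg.norm(point - v) + eps) for k, v in anchors.items()}
        total = sum(inv.values())
        return {k: w / total for k, w in inv.items()}

    # a bittersweet mixture midway between happiness and sadness
    print(blend_weights((ANCHORS["happiness"] + ANCHORS["sadness"]) / 2))
    ```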

  12. Inferior Frontal Gyrus Activity Triggers Anterior Insula Response to Emotional Facial Expressions

    NARCIS (Netherlands)

    Jabbi, Mbemba; Keysers, Christian

    2008-01-01

    The observation of movies of facial expressions of others has been shown to recruit areas similar to those involved in experiencing one's own emotions: the inferior frontal gyrus (IFG), the anterior insula and adjacent frontal operculum (IFO). The causal link between activity in these 2 regions…

  13. Production of Emotional Facial Expressions in European American, Japanese, and Chinese Infants.

    Science.gov (United States)

    Camras, Linda A.; And Others

    1998-01-01

    European American, Japanese, and Chinese 11-month-olds participated in emotion-inducing laboratory procedures. Facial responses were scored with BabyFACS, an anatomically based coding system. Overall, Chinese infants were less expressive than European American and Japanese infants, suggesting that differences in expressivity between European…

  14. Functional variation of the dopamine D2 receptor gene is associated with emotional control as well as brain activity and connectivity during emotion processing in humans.

    Science.gov (United States)

    Blasi, Giuseppe; Lo Bianco, Luciana; Taurisano, Paolo; Gelao, Barbara; Romano, Raffaella; Fazio, Leonardo; Papazacharias, Apostolos; Di Giorgio, Annabella; Caforio, Grazia; Rampino, Antonio; Masellis, Rita; Papp, Audrey; Ursini, Gianluca; Sinibaldi, Lorenzo; Popolizio, Teresa; Sadee, Wolfgang; Bertolino, Alessandro

    2009-11-25

    Personality traits related to emotion processing are, at least in part, heritable and genetically determined. Dopamine D(2) receptor signaling is involved in modulation of emotional behavior and activity of associated brain regions such as the amygdala and the prefrontal cortex. An intronic single nucleotide polymorphism within the D(2) receptor gene (DRD2) (rs1076560, guanine > thymine or G > T) shifts splicing of the two protein isoforms (D(2) short, mainly presynaptic, and D(2) long) and has been associated with modulation of memory performance and brain activity. Here, our aim was to investigate the association of DRD2 rs1076560 genotype with personality traits of emotional stability and with brain physiology during processing of emotionally relevant stimuli. DRD2 genotype and Big Five Questionnaire scores were evaluated in 134 healthy subjects demonstrating that GG subjects have reduced "emotion control" compared with GT subjects. Functional magnetic resonance imaging in a sample of 24 individuals indicated greater amygdala activity during implicit processing and greater dorsolateral prefrontal cortex (DLPFC) response during explicit processing of facial emotional stimuli in GG subjects compared with GT. Other results also demonstrate an interaction between DRD2 genotype and facial emotional expression on functional connectivity of both amygdala and dorsolateral prefrontal regions with overlapping medial prefrontal areas. Moreover, rs1076560 genotype is associated with differential relationships between amygdala/DLPFC functional connectivity and emotion control scores. These results suggest that genetically determined D(2) signaling may explain part of personality traits related to emotion processing and individual variability in specific brain responses to emotionally relevant inputs.

  15. Right Hemisphere Dominance for Emotion Processing in Baboons

    Science.gov (United States)

    Wallez, Catherine; Vauclair, Jacques

    2011-01-01

    Asymmetries of emotional facial expressions in humans offer reliable indexes for inferring brain lateralization and have mostly revealed right-hemisphere dominance. Studies concerned with oro-facial asymmetries in nonhuman primates have largely shown a left-sided asymmetry in chimpanzees, marmosets and macaques. The presence of asymmetrical oro-facial…

  16. P2-35: The KU Facial Expression Database: A Validated Database of Emotional and Conversational Expressions

    Directory of Open Access Journals (Sweden)

    Haenah Lee

    2012-10-01

    Full Text Available Facial expressions are one of the most important means of nonverbal communication, conveying both emotional and conversational content. For investigating this large space of expressions we recently developed a large database containing dynamic emotional and conversational expressions in Germany (the MPI facial expression database). As facial expressions crucially depend on the cultural context, however, a similar resource is needed for studies outside of Germany. Here, we introduce and validate a new, extensive Korean facial expression database containing dynamic emotional and conversational information. Ten individuals performed 62 expressions following a method-acting protocol, in which each person was asked to imagine themselves in one of 62 corresponding everyday scenarios and to react accordingly. To validate this database, we conducted two experiments: 20 participants were asked to name the appropriate expression for each of the 62 everyday scenarios shown as text. Ten additional participants were asked to name each of the 62 expression videos from 10 actors, in addition to rating their naturalness. All naming answers were then rated as valid or invalid. Scenario validation yielded 89% valid answers, showing that the scenarios are effective in eliciting appropriate expressions. Video sequences were judged as natural with an average of 66% valid answers. This is an excellent result considering that videos were seen without any conversational context and that 62 expressions were to be recognized. These results validate our Korean database and, as they also parallel the German validation results, will enable detailed cross-cultural comparisons of the complex space of emotional and conversational expressions.

  17. Emotional processing in patients with mild cognitive impairment: the influence of the valence and intensity of emotional stimuli.

    Science.gov (United States)

    Sarabia-Cobo, Carmen M; García-Rodríguez, Beatriz; Navas, M José; Ellgring, Heiner

    2015-10-15

    We studied the ability of individuals with mild cognitive impairment (MCI) to process emotional facial expressions (EFEs). To date, no systematic study has addressed how variation in intensity affects the recognition of different types of EFEs in such subjects. Two groups of 50 elderly subjects each, 50 healthy individuals and 50 with MCI, completed a task that involved identifying 180 EFEs prepared using virtual models. Two features of the EFEs were considered: their valence (operationalized as six basic emotions) and five levels of intensity. At all levels of intensity, elderly individuals with MCI were significantly worse at identifying each EFE than healthy subjects. Some emotions were easier to identify than others, with happiness proving the easiest to identify and disgust the hardest, and intensity influenced the identification of the EFEs (the stronger the intensity, the greater the number of correct identifications). Overall, elderly individuals with MCI had a poorer capacity to process EFEs, suggesting that cognitive ability modulates the processing of emotions, where features of such stimuli (e.g., valence and intensity) also seem to play a prominent role. Thus, the neurological substrates involved in emotional processing appear to be affected by MCI. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Production of emotional facial expressions in European American, Japanese, and Chinese infants.

    Science.gov (United States)

    Camras, L A; Oster, H; Campos, J; Campos, R; Ujiie, T; Miyake, K; Wang, L; Meng, Z

    1998-07-01

    European American, Japanese, and Chinese 11-month-olds participated in emotion-inducing laboratory procedures. Facial responses were scored with BabyFACS, an anatomically based coding system. Overall, Chinese infants were less expressive than European American and Japanese infants. On measures of smiling and crying, Chinese infants scored lower than European American infants, whereas Japanese infants were similar to the European American infants or fell between the two other groups. Results suggest that differences in expressivity between European American and Chinese infants are more robust than those between European American and Japanese infants and that Chinese and Japanese infants can differ significantly. Cross-cultural differences were also found for some specific brow, cheek, and midface facial actions (e.g., brows lowered). These are discussed in terms of current controversies about infant affective facial expressions.

  19. Perceptual integration of kinematic components in the recognition of emotional facial expressions.

    Science.gov (United States)

    Chiovetto, Enrico; Curio, Cristóbal; Endres, Dominik; Giese, Martin

    2018-04-01

    According to a long-standing hypothesis in motor control, complex body motion is organized in terms of movement primitives, reducing massively the dimensionality of the underlying control problems. For body movements, this low-dimensional organization has been convincingly demonstrated by the learning of low-dimensional representations from kinematic and EMG data. In contrast, the effective dimensionality of dynamic facial expressions is unknown, and dominant analysis approaches have been based on heuristically defined facial "action units," which reflect contributions of individual face muscles. We determined the effective dimensionality of dynamic facial expressions by learning of a low-dimensional model from 11 facial expressions. We found an amazingly low dimensionality with only two movement primitives being sufficient to simulate these dynamic expressions with high accuracy. This low dimensionality is confirmed statistically, by Bayesian model comparison of models with different numbers of primitives, and by a psychophysical experiment that demonstrates that expressions, simulated with only two primitives, are indistinguishable from natural ones. In addition, we find statistically optimal integration of the emotion information specified by these primitives in visual perception. Taken together, our results indicate that facial expressions might be controlled by a very small number of independent control units, permitting very low-dimensional parametrization of the associated facial expression.
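
    The dimensionality check at the core of this abstract can be illustrated with plain PCA on synthetic motion data built from two latent primitives; the paper itself uses more sophisticated model learning and Bayesian model comparison, so PCA is only a simpler stand-in.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(2)
    t = np.linspace(0.0, 1.0, 500)
    # two latent movement primitives driving 30 facial motion parameters
    primitives = np.stack([np.sin(2 * np.pi * t), np.cos(4 * np.pi * t)], axis=1)
    X = primitives @ rng.normal(size=(2, 30)) + 0.05 * rng.normal(size=(500, 30))

    pca = PCA().fit(X)
    print("variance explained by 2 components:",
          pca.explained_variance_ratio_[:2].sum())   # close to 1.0
    ```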

  20. Mapping the impairment in decoding static facial expressions of emotion in prosopagnosia.

    Science.gov (United States)

    Fiset, Daniel; Blais, Caroline; Royer, Jessica; Richoz, Anne-Raphaëlle; Dugas, Gabrielle; Caldara, Roberto

    2017-08-01

    Acquired prosopagnosia is characterized by a deficit in face recognition due to diverse brain lesions, but interestingly most prosopagnosic patients suffering from posterior lesions use the mouth instead of the eyes for face identification. Whether this bias is present for the recognition of facial expressions of emotion has not yet been addressed. We tested PS, a pure case of acquired prosopagnosia with bilateral occipitotemporal lesions anatomically sparing the regions dedicated to facial expression recognition. PS used mostly the mouth to recognize facial expressions even when the eye area was the most diagnostic. Moreover, PS directed most of her fixations towards the mouth. Her impairment was still largely present when she was instructed to look at the eyes, or when she was forced to look at them. Control participants showed a performance comparable to PS's when only the lower part of the face was available. These observations suggest that the deficits observed in PS with static images are not solely attentional, but are rooted at the level of facial information use. This study corroborates neuroimaging findings suggesting that the Occipital Face Area might play a critical role in extracting facial features that are integrated for both face identification and facial expression recognition in static images. © The Author (2017). Published by Oxford University Press.