WorldWideScience

Sample records for emotional face processing

  1. Processing of emotional faces in social phobia

    Directory of Open Access Journals (Sweden)

    Nicole Kristjansen Rosenberg

    2011-02-01

Full Text Available Previous research has found that individuals with social phobia differ from controls in their processing of emotional faces. For instance, people with social phobia show increased attention to briefly presented threatening faces. However, when exposure times are increased, the direction of this attentional bias is less clear. Studies investigating eye movements have found both increased and decreased attention to threatening faces in socially anxious participants. The current study investigated eye movements to emotional faces in eight patients with social phobia and 34 controls. Three tasks with different exposure durations were used, allowing an investigation of the time course of attention. At the early time interval, patients showed a complex pattern of both vigilance toward and avoidance of threatening faces. At the longest time interval, patients avoided the eyes of sad, disgusted, and neutral faces more than controls did, whereas there were no group differences for angry faces.

  2. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    OpenAIRE

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People are able to simultaneously process multiple dimensions of facial properties. Facial processing models are based on the processing of facial properties. This paper examined the processing of facial emotion, face race and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interfered with face race in all the tasks. The interaction of face race and face gend...

  3. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    Science.gov (United States)

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621

  5. Emotionally anesthetized: media violence induces neural changes during emotional face processing

    OpenAIRE

    Stockdale, Laura A.; Morrison, Robert G.; Kmiecik, Matthew J.; Garbarino, James; Silton, Rebecca L.

    2015-01-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others’ emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five particip...

  6. Disrupted neural processing of emotional faces in psychopathy.

    Science.gov (United States)

    Contreras-Rodríguez, Oren; Pujol, Jesus; Batalla, Iolanda; Harrison, Ben J; Bosque, Javier; Ibern-Regàs, Immaculada; Hernández-Ribas, Rosa; Soriano-Mas, Carles; Deus, Joan; López-Solà, Marina; Pifarré, Josep; Menchón, José M; Cardoner, Narcís

    2014-04-01

Psychopaths show a reduced ability to recognize emotional facial expressions, which may disturb the development of interpersonal relationships and successful social adaptation. Behavioral hypotheses point toward an association between emotion recognition deficits in psychopathy and amygdala dysfunction. Our prediction was that amygdala dysfunction would combine deficient activation with disturbances in functional connectivity with cortical regions of the face-processing network. Twenty-two psychopaths and 22 control subjects were assessed, and functional magnetic resonance maps were generated to identify both brain activation and task-induced functional connectivity using psychophysiological interaction analysis during an emotional face-matching task. Results showed significant amygdala activation in control subjects only, but differences between study groups did not reach statistical significance. In contrast, psychopaths showed significantly increased activation in visual and prefrontal areas, with the latter activation being associated with psychopaths' affective-interpersonal disturbances. Psychophysiological interaction analyses revealed a reciprocal reduction in functional connectivity between the left amygdala and visual and prefrontal cortices. Our results suggest that emotional stimulation may evoke a relevant cortical response in psychopaths, but that a disruption exists in the processing of emotional faces involving the reciprocal functional interaction between the amygdala and neocortex, consistent with the notion of a failure to integrate emotion into cognition in psychopathic individuals.

  7. Neurophysiological evidence (ERPs) for hemispheric processing of facial expressions of emotions: Evidence from whole face and chimeric face stimuli.

    Science.gov (United States)

    Damaskinou, Nikoleta; Watling, Dawn

    2018-05-01

    This study was designed to investigate the patterns of electrophysiological responses of early emotional processing at frontocentral sites in adults and to explore whether adults' activation patterns show hemispheric lateralization for facial emotion processing. Thirty-five adults viewed full face and chimeric face stimuli. After viewing two faces, sequentially, participants were asked to decide which of the two faces was more emotive. The findings from the standard faces and the chimeric faces suggest that emotion processing is present during the early phases of face processing in the frontocentral sites. In particular, sad emotional faces are processed differently than neutral and happy (including happy chimeras) faces in these early phases of processing. Further, there were differences in the electrode amplitudes over the left and right hemisphere, particularly in the early temporal window. This research provides supporting evidence that the chimeric face test is a test of emotion processing that elicits right hemispheric processing.

  8. Grounding Context in Face Processing: Color, Emotion and Gender

    Directory of Open Access Journals (Sweden)

    Sandrine eGil

    2015-03-01

Full Text Available In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (versus green, mixed red/green, and achromatic) background, known to be valenced, on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task of facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was completed by collecting subjective ratings for each colored background in the form of five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers the effect was modulated by both the nature of the ambiguous emotion and the decoder's gender. Overall, our findings offer evidence that color and gender share a common valence-based dimension.

  9. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study.

    Science.gov (United States)

    Zhishuai, Jin; Hong, Liu; Daxing, Wu; Pin, Zhang; Xuejing, Lu

    2017-01-01

    Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  10. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study

    Directory of Open Access Journals (Sweden)

    Jin Zhishuai

    2017-01-01

Full Text Available Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  11. Emotionally anesthetized: media violence induces neural changes during emotional face processing.

    Science.gov (United States)

    Stockdale, Laura A; Morrison, Robert G; Kmiecik, Matthew J; Garbarino, James; Silton, Rebecca L

    2015-10-01

Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others' emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby require fewer cognitive resources to inhibit behavior.

  12. Social and emotional relevance in face processing: Happy faces of future interaction partners enhance the LPP

    Directory of Open Access Journals (Sweden)

    Florian eBublatzky

    2014-07-01

Full Text Available Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERPs) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. Social relevance was manipulated by presenting pictures of two specific face actors as future interaction partners (meet condition), whereas two other face actors remained non-relevant. As a further control condition, all stimuli were presented without specific task instructions (passive viewing condition). A within-subject design (Facial Expression x Relevance x Task) was implemented, in which randomly ordered face stimuli of four actors (2 women) from the KDEF were presented for 1 s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of instructed social relevance. Whereas the meet condition was accompanied by unspecific effects regardless of relevance (P1, EPN), viewing potential interaction partners was associated with increased LPP amplitudes. The LPP was specifically enhanced for happy facial expressions of the future interaction partners. This underscores that social relevance can impact face processing at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories.

  13. Passive and motivated perception of emotional faces: qualitative and quantitative changes in the face processing network.

    Directory of Open Access Journals (Sweden)

    Laurie R Skelly

    Full Text Available Emotionally expressive faces are processed by a distributed network of interacting sub-cortical and cortical brain regions. The components of this network have been identified and described in large part by the stimulus properties to which they are sensitive, but as face processing research matures interest has broadened to also probe dynamic interactions between these regions and top-down influences such as task demand and context. While some research has tested the robustness of affective face processing by restricting available attentional resources, it is not known whether face network processing can be augmented by increased motivation to attend to affective face stimuli. Short videos of people expressing emotions were presented to healthy participants during functional magnetic resonance imaging. Motivation to attend to the videos was manipulated by providing an incentive for improved recall performance. During the motivated condition, there was greater coherence among nodes of the face processing network, more widespread correlation between signal intensity and performance, and selective signal increases in a task-relevant subset of face processing regions, including the posterior superior temporal sulcus and right amygdala. In addition, an unexpected task-related laterality effect was seen in the amygdala. These findings provide strong evidence that motivation augments co-activity among nodes of the face processing network and the impact of neural activity on performance. These within-subject effects highlight the necessity to consider motivation when interpreting neural function in special populations, and to further explore the effect of task demands on face processing in healthy brains.

  14. Behavioural and neurophysiological evidence for face identity and face emotion processing in animals

    Science.gov (United States)

    Tate, Andrew J; Fischer, Hanno; Leigh, Andrea E; Kendrick, Keith M

    2006-01-01

    Visual cues from faces provide important social information relating to individual identity, sexual attraction and emotional state. Behavioural and neurophysiological studies on both monkeys and sheep have shown that specialized skills and neural systems for processing these complex cues to guide behaviour have evolved in a number of mammals and are not present exclusively in humans. Indeed, there are remarkable similarities in the ways that faces are processed by the brain in humans and other mammalian species. While human studies with brain imaging and gross neurophysiological recording approaches have revealed global aspects of the face-processing network, they cannot investigate how information is encoded by specific neural networks. Single neuron electrophysiological recording approaches in both monkeys and sheep have, however, provided some insights into the neural encoding principles involved and, particularly, the presence of a remarkable degree of high-level encoding even at the level of a specific face. Recent developments that allow simultaneous recordings to be made from many hundreds of individual neurons are also beginning to reveal evidence for global aspects of a population-based code. This review will summarize what we have learned so far from these animal-based studies about the way the mammalian brain processes the faces and the emotions they can communicate, as well as associated capacities such as how identity and emotion cues are dissociated and how face imagery might be generated. It will also try to highlight what questions and advances in knowledge still challenge us in order to provide a complete understanding of just how brain networks perform this complex and important social recognition task. PMID:17118930

  15. Interactions between facial emotion and identity in face processing: evidence based on redundancy gains.

    Science.gov (United States)

    Yankouskaya, Alla; Booth, David A; Humphreys, Glyn

    2012-11-01

    Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.

  16. The electrophysiological effects of the serotonin 1A receptor agonist buspirone in emotional face processing.

    Science.gov (United States)

    Bernasconi, Fosco; Kometer, Michael; Pokorny, Thomas; Seifritz, Erich; Vollenweider, Franz X

    2015-04-01

Emotional face processing is critically modulated by the serotonergic system, and serotonin (5-HT) receptor agonists impair emotional face processing. However, the specific contribution of the 5-HT1A receptor remains poorly understood. Here we investigated the spatiotemporal brain mechanisms underpinning the modulation of emotional face processing induced by buspirone, a partial 5-HT1A receptor agonist. In a psychophysical emotional face discrimination task, we observed that discrimination of fearful versus neutral faces was reduced, but not of happy versus neutral faces. Electrical neuroimaging analyses were applied to visual evoked potentials elicited by emotional face images after placebo and buspirone administration. Buspirone modulated response strength (i.e., global field power) in the interval 230-248 ms after stimulus onset. Distributed source estimation over this time interval revealed that buspirone decreased the neural activity evoked by fearful faces in the right dorsolateral prefrontal cortex. These results indicate temporal and valence-specific effects of buspirone on the neuronal correlates of emotional face processing. Furthermore, the reduced neural activity in the dorsolateral prefrontal cortex in response to fearful faces suggests reduced attention to fearful faces. Collectively, these findings provide new insights into the role of 5-HT1A receptors in emotional face processing and have implications for affective disorders that are characterized by increased attention to negative stimuli.

  17. Electrophysiological correlates of emotional face processing in typically developing adults and adults with high functioning Autism

    OpenAIRE

    Barrie, Jennifer Nicole

    2012-01-01

    Emotional expressions have been found to affect various event-related potentials (ERPs). Furthermore, socio-emotional functioning is altered in individuals with autism, and a growing body of neuroimaging and electrophysiological evidence substantiates underlying neural differences for face processing in this population. However, relatively few studies have examined the time-course of emotional face processing in autism. This study examined how implicit (not the intended focus of attention) ve...

  18. Testing the effects of expression, intensity and age on emotional face processing in ASD.

    Science.gov (United States)

    Luyster, Rhiannon J; Bick, Johanna; Westerlund, Alissa; Nelson, Charles A

    2017-06-21

Individuals with autism spectrum disorder (ASD) commonly show global deficits in the processing of facial emotion, including impairments in emotion recognition and slowed processing of emotional faces. Growing evidence has suggested that these challenges may increase with age, perhaps due to minimal improvement with age in individuals with ASD. In the present study, we explored the role of age, emotion type and emotion intensity in face processing for individuals with and without ASD. Twelve-year-olds and 18- to 22-year-olds with and without ASD participated. No significant diagnostic group differences were observed on behavioral measures of emotion processing for younger versus older individuals with and without ASD. However, there were significant group differences in neural responses to emotional faces. Relative to typically developing (TD) individuals, at 12 years of age and during adulthood, individuals with ASD showed a slower N170 to emotional faces. While the TD group's P1 latency was significantly shorter in adults than in 12-year-olds, there was no significant age-related difference in P1 latency among individuals with ASD. Findings point to potential differences in the maturation of cortical networks that support visual processing (whether of faces or of stimuli more broadly) among individuals with and without ASD between late childhood and adulthood. Finally, associations between ERP amplitudes and behavioral responses on emotion processing tasks suggest possible neural markers for emotional and behavioral deficits among individuals with ASD.

  19. Young Adults with Autism Spectrum Disorder Show Early Atypical Neural Activity during Emotional Face Processing

    Directory of Open Access Journals (Sweden)

    Rachel C. Leung

    2018-02-01

Full Text Available Social cognition is impaired in autism spectrum disorder (ASD). The ability to perceive and interpret affect is integral to successful social functioning and has an extended developmental course. However, the neural mechanisms underlying emotional face processing in ASD are unclear. Using magnetoencephalography (MEG), the present study explored neural activation during implicit emotional face processing in young adults with and without ASD. Twenty-six young adults with ASD and 26 healthy controls were recruited. Participants indicated the location of a scrambled pattern (target) that was presented alongside a happy or angry face. Emotion-related activation sources for each emotion were estimated using the Empirical Bayes Beamformer (pcorr ≤ 0.001) in Statistical Parametric Mapping 12 (SPM12). Emotional faces elicited elevated fusiform, amygdala and anterior insula activity and reduced anterior cingulate cortex (ACC) activity in adults with ASD relative to controls. Within-group comparisons revealed that angry vs. happy faces elicited distinct neural activity in typically developing adults; there was no such distinction in young adults with ASD. Our data suggest that difficulties in affect processing in ASD reflect atypical recruitment of traditional emotional processing areas. These early differences may contribute to difficulties in deriving social reward from faces, ascribing salience to faces, and an immature threat processing system, which collectively could result in deficits in emotional face processing.

  20. Lateralized hybrid faces: evidence of a valence-specific bias in the processing of implicit emotions.

    Science.gov (United States)

    Prete, Giulia; Laeng, Bruno; Tommasi, Luca

    2014-01-01

It is well known that hemispheric asymmetries exist for the analysis of both low-level visual information (such as spatial frequency) and high-level visual information (such as emotional expressions). In this study, we assessed which of these factors underlies perceptual laterality effects with "hybrid faces": a type of stimulus that allows testing for unaware processing of emotional expressions, in which the emotion is displayed in the low-frequency information while an image of the same face with a neutral expression is superimposed on it. Although hybrid faces are perceived as neutral, the emotional information modulates observers' social judgements. In the present study, participants were asked to assess the friendliness of hybrid faces displayed tachistoscopically, either centrally or laterally to fixation. We found a clear influence of the hidden emotions with lateral presentations as well. Happy faces were rated as more friendly, and angry faces as less friendly, than neutral faces. In general, hybrid faces were evaluated as less friendly when they were presented in the left visual field/right hemisphere than in the right visual field/left hemisphere. These results extend the validity of the valence hypothesis to the specific domain of unaware (subcortical) emotion processing.

  1. Social anhedonia is associated with neural abnormalities during face emotion processing.

    Science.gov (United States)

    Germine, Laura T; Garrido, Lucia; Bruce, Lori; Hooker, Christine

    2011-10-01

    Human beings are social organisms with an intrinsic desire to seek and participate in social interactions. Social anhedonia is a personality trait characterized by a reduced desire for social affiliation and reduced pleasure derived from interpersonal interactions. Abnormally high levels of social anhedonia prospectively predict the development of schizophrenia and contribute to poorer outcomes for schizophrenia patients. Despite the strong association between social anhedonia and schizophrenia, the neural mechanisms that underlie individual differences in social anhedonia have not been studied and are thus poorly understood. Deficits in face emotion recognition are related to poorer social outcomes in schizophrenia, and it has been suggested that face emotion recognition deficits may be a behavioral marker for schizophrenia liability. In the current study, we used functional magnetic resonance imaging (fMRI) to see whether there are differences in the brain networks underlying basic face emotion processing in a community sample of individuals low vs. high in social anhedonia. We isolated the neural mechanisms related to face emotion processing by comparing face emotion discrimination with four other baseline conditions (identity discrimination of emotional faces, identity discrimination of neutral faces, object discrimination, and pattern discrimination). Results showed a group (high/low social anhedonia) × condition (emotion discrimination/control condition) interaction in the anterior portion of the rostral medial prefrontal cortex, right superior temporal gyrus, and left somatosensory cortex. As predicted, high (relative to low) social anhedonia participants showed less neural activity in face emotion processing regions during emotion discrimination as compared to each control condition. The findings suggest that social anhedonia is associated with abnormalities in networks responsible for basic processes associated with social cognition, and provide a

  2. Asymmetric Engagement of Amygdala and Its Gamma Connectivity in Early Emotional Face Processing

    Science.gov (United States)

    Liu, Tai-Ying; Chen, Yong-Sheng; Hsieh, Jen-Chuen; Chen, Li-Fen

    2015-01-01

The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (activity lasting around 20–30 ms), whereas activity in the left amygdala (lasting around 50–60 ms) was sustained longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions and that the left amygdala might be involved in decoding or evaluating expressive faces during early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, reflecting its evolutionary significance for survival. These findings demonstrate the asymmetric engagement of the bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry. PMID:25629899

  3. Age-related differences in event-related potentials for early visual processing of emotional faces.

    Science.gov (United States)

    Hilimire, Matthew R; Mienaltowski, Andrew; Blanchard-Fields, Fredda; Corballis, Paul M

    2014-07-01

With advancing age, processing resources are shifted away from negative emotional stimuli and toward positive ones. Here, we explored this 'positivity effect' using event-related potentials (ERPs). Participants identified the presence or absence of a visual probe that appeared over photographs of emotional faces. The ERPs elicited by the onsets of angry, sad, happy and neutral faces were recorded. We examined the frontocentral emotional positivity (FcEP), which is defined as a positive deflection in the waveforms elicited by emotional expressions relative to neutral faces early on in the time course of the ERP. The FcEP is thought to reflect enhanced early processing of emotional expressions. The results show that within the first 130 ms young adults show an FcEP to negative emotional expressions, whereas older adults show an FcEP to positive emotional expressions. These findings provide additional evidence that the age-related positivity effect in emotion processing can be traced to automatic processes that are evident very early in the processing of emotional facial expressions.
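
The FcEP described above is, in essence, a difference wave: the emotional-face ERP minus the neutral-face ERP, averaged over frontocentral channels and summarized within an early time window. A minimal sketch with synthetic data (the channel count, sampling rate, and window bounds are assumptions for illustration, not values taken from the study):

```python
import numpy as np

# Hypothetical illustration of an FcEP-style difference wave. Assumes
# condition-averaged ERP arrays of shape (n_channels, n_samples).
rng = np.random.default_rng(0)
srate = 500                      # samples per second (assumed)
n_channels, n_samples = 4, 200   # 4 frontocentral channels, 400 ms epoch
erp_emotional = rng.normal(0.0, 1.0, (n_channels, n_samples))
erp_neutral = rng.normal(0.0, 1.0, (n_channels, n_samples))

# Difference wave: emotional minus neutral, averaged over channels
diff_wave = (erp_emotional - erp_neutral).mean(axis=0)

# Mean amplitude of the difference wave in the first 130 ms
window = slice(0, int(0.130 * srate))   # samples covering 0-130 ms
fcep_amplitude = diff_wave[window].mean()
print(f"FcEP mean amplitude (0-130 ms): {fcep_amplitude:.3f} µV")
```

A positive `fcep_amplitude` for a given group and emotion would correspond to the positivity the authors report.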

  4. Neuropsychology of facial expressions. The role of consciousness in processing emotional faces

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2012-04-01

Full Text Available Neuropsychological studies have identified distinct brain correlates dedicated to analyzing facial expressions of emotion. Some cerebral circuits appear specific to emotional face comprehension as a function of conscious vs. unconscious processing of emotional information. Moreover, the emotional content of faces (i.e., positive vs. negative; more or less arousing) may activate specific cortical networks. Among others, recent studies have examined the contribution of the two hemispheres to face comprehension as a function of emotion type (mainly the positive vs. negative distinction) and of task (comprehending vs. producing facial expressions). An overview of ERP (event-related potential) findings is then proposed in order to clarify how an observer processes a face and renders it a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces under conscious vs. unconscious processing.

  5. The Effect of Self-Referential Expectation on Emotional Face Processing.

    Directory of Open Access Journals (Sweden)

    Mel McKendrick

    Full Text Available The role of self-relevance has been somewhat neglected in static face processing paradigms but may be important in understanding how emotional faces impact on attention, cognition and affect. The aim of the current study was to investigate the effect of self-relevant primes on processing emotional composite faces. Sentence primes created an expectation of the emotion of the face before sad, happy, neutral or composite face photos were viewed. Eye movements were recorded and subsequent responses measured the cognitive and affective impact of the emotion expressed. Results indicated that primes did not guide attention, but impacted on judgments of valence intensity and self-esteem ratings. Negative self-relevant primes led to the most negative self-esteem ratings, although the effect of the prime was qualified by salient facial features. Self-relevant expectations about the emotion of a face and subsequent attention to a face that is congruent with these expectations strengthened the affective impact of viewing the face.

  6. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    Science.gov (United States)

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  7. The Impact of Top-Down Prediction on Emotional Face Processing in Social Anxiety

    Directory of Open Access Journals (Sweden)

    Guangming Ran

    2017-07-01

Full Text Available There is evidence that people with social anxiety show abnormal processing of emotional faces. To investigate the impact of top-down prediction on emotional face processing in social anxiety, brain responses of participants with high social anxiety (HSA) and low social anxiety (LSA) were recorded while they performed a variant of the emotional task, using high temporal resolution event-related potential techniques. Behaviorally, we found an effect of prediction, with higher accuracy for predictable than unpredictable faces. Furthermore, HSA participants, but not LSA participants, recognized angry faces more accurately than happy faces. For the P100 and P200 components, HSA participants showed enhanced brain activity for angry faces compared to happy faces, suggesting hypervigilance to angry faces. Importantly, HSA participants exhibited larger N170 amplitudes at right-hemisphere electrodes than LSA participants when they observed unpredictable angry faces, but not when the angry faces were predictable. This probably reflects top-down prediction compensating for a deficiency in building a holistic face representation in HSA participants.

  8. Interdependent mechanisms for processing gender and emotion:The special status of angry male faces

    Directory of Open Access Journals (Sweden)

    Daniel A Harris

    2016-07-01

Full Text Available While some models of how various attributes of a face are processed have posited that face features, invariant physical cues such as gender or ethnicity as well as variant social cues such as emotion, may be processed independently (e.g., Bruce & Young, 1986), other models suggest a more distributed representation and interdependent processing (e.g., Haxby, Hoffman, & Gobbini, 2000). Here we use a contingent adaptation paradigm to investigate if mechanisms for processing the gender and emotion of a face are interdependent and symmetric across the happy-angry emotional continuum and regardless of the gender of the face. We simultaneously adapted participants to angry female faces and happy male faces (Experiment 1) or to happy female faces and angry male faces (Experiment 2). In Experiment 1 we found evidence for contingent adaptation, with simultaneous aftereffects in opposite directions: male faces were biased towards angry while female faces were biased towards happy. Interestingly, in the complementary Experiment 2 we did not find evidence for contingent adaptation, with both male and female faces biased towards angry. Our results highlight that evidence for contingent adaptation and the underlying interdependent face processing mechanisms that would allow for contingent adaptation may only be evident for certain combinations of face features. Such limits may be especially important in the case of social cues given how maladaptive it may be to stop responding to threatening information, with male angry faces considered to be the most threatening. The underlying neuronal mechanisms that could account for such asymmetric effects in contingent adaptation remain to be elucidated.

  9. Putting the face in context: Body expressions impact facial emotion processing in human infants

    Directory of Open Access Journals (Sweden)

    Purva Rajhans

    2016-06-01

Full Text Available Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.

  10. Putting the face in context: Body expressions impact facial emotion processing in human infants.

    Science.gov (United States)

    Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias

    2016-06-01

Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.

  11. Human sex differences in emotional processing of own-race and other-race faces.

    Science.gov (United States)

    Ran, Guangming; Chen, Xu; Pan, Yangu

    2014-06-18

There is evidence that women and men show differences in the perception of affective facial expressions. However, none of the previous studies directly investigated sex differences in emotional processing of own-race and other-race faces. The current study addressed this issue using high time resolution event-related potential techniques. In total, data from 25 participants (13 women and 12 men) were analyzed. It was found that women showed increased N170 amplitudes to negative White faces compared with negative Chinese faces over the right hemisphere electrodes. This result suggests that women show enhanced sensitivity to other-race faces showing negative emotions (fear or disgust), which may have evolutionary significance. However, the current data showed that men had increased N170 amplitudes to happy Chinese versus happy White faces over the left hemisphere electrodes, indicating that men show enhanced sensitivity to own-race faces showing positive emotions (happiness). In this respect, men might use past pleasant emotional experiences to boost recognition of own-race faces.

12. Face and emotion expression processing and the serotonin transporter polymorphism 5-HTTLPR/rs25531.

    Science.gov (United States)

    Hildebrandt, A; Kiy, A; Reuter, M; Sommer, W; Wilhelm, O

    2016-06-01

Face cognition, including face identity and facial expression processing, is a crucial component of socio-emotional abilities, characterizing humans as highly developed social beings. However, for these trait domains, molecular genetic studies investigating gene-behavior associations based on well-founded phenotype definitions are still rare. We examined the relationship between the 5-HTTLPR/rs25531 polymorphisms - related to serotonin reuptake - and the ability to perceive and recognize faces and emotional expressions in human faces. To this end, we conducted structural equation modeling on data from 230 young adults, obtained by using a comprehensive, multivariate task battery with maximal effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphisms with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception. Carriers of two long (L) alleles outperformed carriers of one or two S alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating discriminant validity of the relationships. We discuss the implications and possible neural mechanisms underlying these novel findings.

  13. Are patients with schizophrenia impaired in processing non-emotional features of human faces?

    Directory of Open Access Journals (Sweden)

    Hayley eDarke

    2013-08-01

Full Text Available It is known that individuals with schizophrenia exhibit signs of impaired face processing; however, the exact perceptual and cognitive mechanisms underlying these deficits are yet to be elucidated. One possible source of confusion in the current literature is the methodological and conceptual inconsistencies that can arise from the varied treatment of different aspects of face processing relating to emotional and non-emotional aspects of face perception. This review aims to disentangle the literature by focusing on the performance of patients with schizophrenia in a range of tasks that required processing of non-emotional features of face stimuli (e.g., identity or gender). We also consider the performance of patients on non-face stimuli that share common elements such as familiarity (e.g., cars) and social relevance (e.g., gait). We conclude by exploring whether observed deficits are best considered as face-specific and note that further investigation is required to properly assess the potential contribution of more generalised attentional or perceptual impairments.

  14. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    Science.gov (United States)

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on the moderate emotions; to date, few studies have been conducted to investigate the explicit and implicit processes of peak emotions. In the current study, we used transiently peak intense expression images of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and the face-body compounds, and eye-tracking movement was recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous, and the body cues influenced facial emotion recognition. Furthermore, eye movement records showed that the participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, the subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious emotion perception of peak facial expressions. The results showed that winning face prime facilitated reaction to winning body target, whereas losing face prime inhibited reaction to winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, revised subliminal affective priming task and a strict awareness test were used to examine the validity of unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction time to both winning body targets and losing body targets was influenced by the invisibly peak facial expression primes, which indicated the
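
A subliminal priming effect of the kind reported in Experiment 2A is typically quantified as the reaction-time cost of incongruent prime-target pairs relative to congruent ones. A toy illustration with fabricated reaction times (the values are invented for the sketch, not the study's data):

```python
# Illustrative computation of a subliminal affective priming effect:
# facilitation appears as faster responses on congruent trials.
rt_congruent = [512, 498, 530, 505]      # e.g., winning face -> winning body (ms)
rt_incongruent = [548, 561, 539, 554]    # e.g., losing face -> winning body (ms)

def mean(xs):
    return sum(xs) / len(xs)

# Positive values indicate that incongruent primes slowed responses
priming_effect = mean(rt_incongruent) - mean(rt_congruent)
print(f"priming effect: {priming_effect:.1f} ms")
```

A reliably positive effect under conditions where primes are not consciously detected is the usual behavioral signature of unconscious perception.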

  15. Amygdala Hyperactivation During Face Emotion Processing in Unaffected Youth at Risk for Bipolar Disorder

    Science.gov (United States)

    Olsavsky, Aviva K.; Brotman, Melissa A.; Rutenberg, Julia G.; Muhrer, Eli J.; Deveney, Christen M.; Fromm, Stephen J.; Towbin, Kenneth; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Objective: Youth at familial risk for bipolar disorder (BD) show deficits in face emotion processing, but the neural correlates of these deficits have not been examined. This preliminary study tests the hypothesis that, relative to healthy comparison (HC) subjects, both BD subjects and youth at risk for BD (i.e., those with a first-degree BD…

  16. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Directory of Open Access Journals (Sweden)

    Teresa A Victor

Full Text Available Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants, unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  17. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Science.gov (United States)

    Victor, Teresa A; Furey, Maura L; Fromm, Stephen J; Bellgowan, Patrick S F; Öhman, Arne; Drevets, Wayne C

    2012-01-01

    Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. Unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  18. Processing Distracting Non-face Emotional Images: No Evidence of an Age-Related Positivity Effect.

    Science.gov (United States)

    Madill, Mark; Murray, Janice E

    2017-01-01

Cognitive aging may be accompanied by increased prioritization of social and emotional goals that enhance positive experiences and emotional states. The socioemotional selectivity theory suggests this may be achieved by giving preference to positive information and avoiding or suppressing negative information. Although there is some evidence of a positivity bias in controlled attention tasks, it remains unclear whether a positivity bias extends to the processing of affective stimuli presented outside focused attention. In two experiments, we investigated age-related differences in the effects of to-be-ignored non-face affective images on target processing. In Experiment 1, 27 older (64-90 years) and 25 young adults (19-29 years) made speeded valence judgments about centrally presented positive or negative target images taken from the International Affective Picture System. To-be-ignored distractor images were presented above and below the target image and were either positive, negative, or neutral in valence. The distractors were considered task relevant because they shared emotional characteristics with the target stimuli. Both older and young adults responded slower to targets when distractor valence was incongruent with target valence relative to when distractors were neutral. Older adults responded faster to positive than to negative targets but did not show increased interference effects from positive distractors. In Experiment 2, affective distractors were task irrelevant as the target was a three-digit array and did not share emotional characteristics with the distractors. Twenty-six older (63-84 years) and 30 young adults (18-30 years) gave speeded responses on a digit disparity task while ignoring the affective distractors positioned in the periphery. Task performance was not influenced by the task-irrelevant affective images in either age group. In keeping with the socioemotional selectivity theory, these findings suggest that older adults preferentially

  19. [Abnormal processing characteristics to basic emotional faces in the early phase in children with autism spectrum disorder].

    Science.gov (United States)

    Lin, Qiong-Xi; Wu, Gui-Hua; Zhang, Ling; Wang, Zeng-Jian; Pan, Ning; Xu, Cai-Juan; Jing, Jin; Jin, Yu

    2018-02-01

To explore the recognition ability and abnormal early-phase processing characteristics for basic emotional faces in children with autism spectrum disorder (ASD), photos of static Chinese faces with four basic emotions (fearful, happy, angry and sad) were used as stimuli. Twenty-five ASD children and twenty-two age- and gender-matched typically developing children (normal controls) were asked to match the emotional faces with words. Event-related potential (ERP) data were recorded concurrently. In normal controls, N170 latencies for total emotion and fearful faces were shorter in the left temporal region than in the right (P<0.05), but this pattern was not observed in the ASD children. Further, N170 latencies in the left temporal region of ASD children were longer than those of normal controls for total emotion, fearful and happy faces (P<0.05), and their N170 latencies in the right temporal region tended to be longer than those of normal controls for angry and fearful faces. The holistic perception of emotional faces in the early cognitive processing phase in ASD children is slower than that of normal controls. The lateralized response in the early phase of recognizing emotional faces may be aberrant in children with ASD.

  20. Altered Functional Subnetwork During Emotional Face Processing: A Potential Intermediate Phenotype for Schizophrenia.

    Science.gov (United States)

    Cao, Hengyi; Bertolino, Alessandro; Walter, Henrik; Schneider, Michael; Schäfer, Axel; Taurisano, Paolo; Blasi, Giuseppe; Haddad, Leila; Grimm, Oliver; Otto, Kristina; Dixson, Luanna; Erk, Susanne; Mohnke, Sebastian; Heinz, Andreas; Romanczuk-Seiferth, Nina; Mühleisen, Thomas W; Mattheisen, Manuel; Witt, Stephanie H; Cichon, Sven; Noethen, Markus; Rietschel, Marcella; Tost, Heike; Meyer-Lindenberg, Andreas

    2016-06-01

    Although deficits in emotional processing are prominent in schizophrenia, it has been difficult to identify neural mechanisms related to the genetic risk for this highly heritable illness. Prior studies have not found consistent regional activation or connectivity alterations in first-degree relatives compared with healthy controls, suggesting that a more comprehensive search for connectomic biomarkers is warranted. To identify a potential systems-level intermediate phenotype linked to emotion processing in schizophrenia and to examine the psychological association, task specificity, test-retest reliability, and clinical validity of the identified phenotype. The study was performed in university research hospitals from June 1, 2008, through December 31, 2013. We examined 58 unaffected first-degree relatives of patients with schizophrenia and 94 healthy controls with an emotional face-matching functional magnetic resonance imaging paradigm. Test-retest reliability was analyzed with an independent sample of 26 healthy participants. A clinical association study was performed in 31 patients with schizophrenia and 45 healthy controls. Data analysis was performed from January 1 to September 30, 2014. Conventional amygdala activity and seeded connectivity measures, graph-based global and local network connectivity measures, Spearman rank correlation, intraclass correlation, and gray matter volumes. Among the 152 volunteers included in the relative-control sample, 58 were unaffected first-degree relatives of patients with schizophrenia (mean [SD] age, 33.29 [12.56]; 38 were women), and 94 were healthy controls without a first-degree relative with mental illness (mean [SD] age, 32.69 [10.09] years; 55 were women). A graph-theoretical connectivity approach identified significantly decreased connectivity in a subnetwork that primarily included the limbic cortex, visual cortex, and subcortex during emotional face processing (cluster-level P corrected for familywise error =
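
The graph-theoretical connectivity approach mentioned above starts from a network built out of regional time series before any global or local measures are computed. A minimal sketch of that generic pipeline (correlate ROI time series, threshold the matrix, take node degree) on synthetic data; the ROI count, threshold, and data are arbitrary assumptions, not the study's actual method details:

```python
import numpy as np

# Sketch of a graph-based connectivity measure on synthetic ROI time series.
rng = np.random.default_rng(1)
n_rois, n_timepoints = 10, 120
ts = rng.normal(size=(n_rois, n_timepoints))      # ROI-by-time data

corr = np.corrcoef(ts)                            # ROI-by-ROI correlation matrix
np.fill_diagonal(corr, 0.0)                       # ignore self-connections

adjacency = (np.abs(corr) > 0.2).astype(int)      # binarize at a chosen threshold
degree = adjacency.sum(axis=1)                    # node degree per ROI
global_connectivity = degree.mean()               # simple global summary
print(f"mean degree across {n_rois} ROIs: {global_connectivity:.2f}")
```

Group differences in such summaries (globally, or restricted to a subnetwork of nodes) are the kind of statistic the relative-control comparison rests on.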

  1. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion.

    Science.gov (United States)

    Guo, Kun; Soornack, Yoshi; Settle, Rebecca

    2018-03-05

Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution up to 48 × 64 pixels or increasing image blur up to 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity rating, increased reaction time and fixation duration, and stronger central fixation bias which was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent with less deterioration impact on happy and surprise expressions, suggesting this distortion-invariant facial expression perception might be achieved through the categorical model involving a non-linear configural combination of local facial features.
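
The two distortions manipulated in these experiments can be approximated with simple array operations: resolution reduction by averaging pixel blocks (e.g., down to 48 × 64 pixels) and blur by low-pass filtering. A rough sketch on a synthetic image; the source image size and the box filter are placeholders standing in for whatever stimuli and filter the authors actually used:

```python
import numpy as np

# Synthetic grayscale "face" image, 192x256 pixels (invented size)
rng = np.random.default_rng(2)
face = rng.random((192, 256))

# Resolution reduction: average non-overlapping 4x4 pixel blocks -> 48x64
low_res = face.reshape(48, 4, 64, 4).mean(axis=(1, 3))

# Crude low-pass blur: 5-point moving average applied along rows, then columns
kernel = np.ones(5) / 5
blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, face)
blurred = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, blurred)

print(low_res.shape, blurred.shape)
```

Block averaging preserves the mean luminance while discarding high spatial frequencies, which is why moderate reductions can leave expression judgments largely intact.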

  2. Spatiotemporal brain dynamics of emotional face processing modulations induced by the serotonin 1A/2A receptor agonist psilocybin.

    Science.gov (United States)

    Bernasconi, Fosco; Schmidt, André; Pokorny, Thomas; Kometer, Michael; Seifritz, Erich; Vollenweider, Franz X

    2014-12-01

Emotional face processing is critically modulated by the serotonergic system. For instance, emotional face processing is impaired by acute psilocybin administration, a serotonin (5-HT) 1A and 2A receptor agonist. However, the spatiotemporal brain mechanisms underlying these modulations are poorly understood. Here, we investigated the spatiotemporal brain dynamics underlying psilocybin-induced modulations during emotional face processing. Electrical neuroimaging analyses were applied to visual evoked potentials in response to emotional faces, following psilocybin and placebo administration. Our results indicate a first time period of strength (i.e., Global Field Power) modulation over the 168-189 ms poststimulus interval, induced by psilocybin. A second time period of strength modulation was identified over the 211-242 ms poststimulus interval. Source estimations over these 2 time periods further revealed decreased activity in response to both neutral and fearful faces within limbic areas, including amygdala and parahippocampal gyrus, and the right temporal cortex over the 168-189 ms interval, and reduced activity in response to happy faces within limbic and right temporo-occipital brain areas over the 211-242 ms interval. Our results indicate a selective and temporally dissociable effect of psilocybin on the neuronal correlates of emotional face processing, consistent with a modulation of the top-down control.
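
The strength measure used here, Global Field Power, is conventionally defined as the standard deviation of the signal across all electrodes at each time point. A minimal illustration on synthetic data; the electrode count is assumed, and a 1 ms sample spacing is chosen so the 168-189 ms window maps directly onto sample indices:

```python
import numpy as np

# Synthetic evoked data: 64 electrodes x 500 post-stimulus samples (1 kHz)
rng = np.random.default_rng(3)
eeg = rng.normal(size=(64, 500))

# Global Field Power: spatial standard deviation at each time point
gfp = eeg.std(axis=0)

# Mean GFP in the 168-189 ms window reported in the abstract
window = slice(168, 190)
print(f"mean GFP, 168-189 ms: {gfp[window].mean():.3f}")
```

Comparing this per-timepoint curve between drug and placebo conditions is what identifies intervals of "strength modulation" independently of where on the scalp the signal is strongest.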

  3. More than words (and faces): evidence for a Stroop effect of prosody in emotion word processing.

    Science.gov (United States)

    Filippi, Piera; Ocklenburg, Sebastian; Bowling, Daniel L; Heege, Larissa; Güntürkün, Onur; Newen, Albert; de Boer, Bart

    2017-08-01

    Humans typically combine linguistic and nonlinguistic information to comprehend emotions. We adopted an emotion identification Stroop task to investigate how different channels interact in emotion communication. In experiment 1, synonyms of "happy" and "sad" were spoken with happy and sad prosody. Participants had more difficulty ignoring prosody than ignoring verbal content. In experiment 2, synonyms of "happy" and "sad" were spoken with happy and sad prosody, while happy or sad faces were displayed. Accuracy was lower when two channels expressed an emotion that was incongruent with the channel participants had to focus on, compared with the cross-channel congruence condition. When participants were required to focus on verbal content, accuracy was significantly lower also when prosody was incongruent with verbal content and face. This suggests that prosody biases emotional verbal content processing, even when conflicting with verbal content and face simultaneously. Implications for multimodal communication and language evolution studies are discussed.

  4. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers

    Directory of Open Access Journals (Sweden)

    Laura A. Thomas

    2014-04-01

    Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N = 20), SMD (N = 18), and HV (N = 22) during “Aware” and “Non-aware” priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms), and shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders.

  5. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers.

    Science.gov (United States)

    Thomas, Laura A; Brotman, Melissa A; Bones, Brian L; Chen, Gang; Rosen, Brooke H; Pine, Daniel S; Leibenluft, Ellen

    2014-04-01

    Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N=20), SMD (N=18), and HV (N=22) during "Aware" and "Non-aware" priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms), and shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders.

  6. Oxytocin and social pretreatment have similar effects on processing of negative emotional faces in healthy adult males

    Directory of Open Access Journals (Sweden)

    Anna eKis

    2013-08-01

    Oxytocin has been shown to affect several aspects of human social cognition, including facial emotion processing. There is also evidence that social stimuli (such as eye contact) can effectively modulate endogenous oxytocin levels. In the present study we directly tested whether intranasal oxytocin administration and pre-treatment with social stimuli had similar effects on face processing at the behavioural level. Subjects (N = 52 healthy adult males) were presented with a set of faces with expressions of different valence (negative, neutral, positive) following different types of pretreatment (oxytocin, OT, or placebo, PL; social interaction, Soc, or no social interaction, NSoc; N = 13 in each group) and were asked to rate all faces for perceived emotion and trustworthiness. On the next day, subjects’ recognition memory was tested on a set of neutral faces, and they again rated each face for trustworthiness and emotion. Subjects in both the OT and the Soc pretreatment groups (as compared to the PL and NSoc groups) gave higher emotion and trustworthiness scores for faces with negative emotional expression. Moreover, 24 h later, subjects in the OT and Soc groups (unlike those in the control groups) gave lower trustworthiness scores for previously negative faces than for faces previously seen as emotionally neutral or positive. In sum, these results provide the first direct evidence of the similar effects of intranasal oxytocin administration and social stimulation on the perception of negative facial emotions, as well as on the delayed recall of negative emotional information.

  7. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces.

    Science.gov (United States)

    Guan, Lili; Zhao, Yufang; Wang, Yige; Chen, Yujie; Yang, Juan

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another's face; self-face also elicits an enhanced P3 amplitude compared to another's face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  8. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    Directory of Open Access Journals (Sweden)

    Lili Guan

    2017-08-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  9. Emotion Words: Adding Face Value.

    Science.gov (United States)

    Fugate, Jennifer M B; Gendron, Maria; Nakashima, Satoshi F; Barrett, Lisa Feldman

    2017-06-12

    Despite a growing number of studies suggesting that emotion words affect perceptual judgments of emotional stimuli, little is known about how emotion words affect perceptual memory for emotional faces. In Experiments 1 and 2 we tested how emotion words (compared with control words) affected participants' abilities to select a target emotional face from among distractor faces. Participants were generally more likely to false alarm to distractor emotional faces when primed with an emotion word congruent with the face (compared with a control word). Moreover, participants showed both decreased sensitivity (d') to discriminate between target and distractor faces, as well as altered response biases (c; more likely to answer "yes") when primed with an emotion word (compared with a control word). In Experiment 3 we showed that emotion words had more of an effect on perceptual memory judgments when the structural information in the target face was limited, as well as when participants were only able to categorize the face with a partially congruent emotion word. The overall results are consistent with the idea that emotion words affect the encoding of emotional faces in perceptual memory.
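    The sensitivity (d′) and response-bias (c) measures reported in this abstract are the standard signal-detection quantities: z-transformed hit rate minus z-transformed false-alarm rate, and the negated mean of the two z-scores. A generic sketch of that computation — the trial counts below are hypothetical, not data from the study, and the 0.5-per-cell (log-linear) correction is one common convention, not necessarily the one the authors used:

    ```python
    from statistics import NormalDist

    def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
        """Sensitivity (d') and response bias (c) from raw trial counts,
        using the log-linear correction (add 0.5 to each cell) so that
        hit/false-alarm rates of exactly 0 or 1 stay finite."""
        z = NormalDist().inv_cdf  # inverse of the standard normal CDF
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        d_prime = z(hit_rate) - z(fa_rate)
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # c < 0: liberal ("yes") bias
        return d_prime, criterion

    # Hypothetical counts for one participant: 40 target and 40 distractor trials
    d, c = dprime_and_criterion(hits=30, misses=10,
                                false_alarms=12, correct_rejections=28)
    ```

    On this pattern of results, the priming effect the authors describe would show up as a lower d′ (targets and distractors harder to tell apart) together with a more negative c (a shift toward answering "yes").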

  10. Influence of spatial frequency and emotion expression on face processing in patients with panic disorder.

    Science.gov (United States)

    Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan

    2016-06-01

    Deficits in facial emotion processing are a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad, high, and low spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to low spatial frequency (LSF) compared to high spatial frequency (HSF) stimuli, whereas healthy controls demonstrated significant broad spatial frequency (BSF) dependent processing in P100 amplitude. VPP amplitude was significantly increased for HSF and BSF, compared to LSF stimuli, in panic disorder. EPN amplitude differed significantly between HSF and BSF, and between LSF and BSF processing, in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information. However, in later processing, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that this unique spatial frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder.

  11. Consciousness and arousal effects on emotional face processing as revealed by brain oscillations. A gamma band analysis.

    Science.gov (United States)

    Balconi, Michela; Lucchiari, Claudio

    2008-01-01

    It remains an open question whether a single brain operation or psychological function for facial emotion decoding can be assigned to a certain type of oscillatory activity. Gamma band activity (GBA) offers an adequate tool for studying cortical activation patterns during emotional face information processing. In the present study brain oscillations were analyzed in response to facial expressions of emotion. Specifically, GBA modulation was measured while twenty subjects looked at emotional (angry, fearful, happy, and sad) or neutral faces in two different conditions: subliminal (10 ms) vs supraliminal (150 ms) stimulation (100 target-mask pairs for each condition). The results showed that both consciousness and the significance of the stimulus in terms of arousal can modulate power synchronization (ERD decrease) during the 150-350 ms time range: an early oscillatory event showed its peak at about 200 ms post-stimulus. GBA was enhanced more by supraliminal than subliminal elaboration, as well as more by high-arousal (anger and fear) than low-arousal (happiness and sadness) emotions. Finally, a left-posterior dominance for conscious elaboration was found, whereas the right hemisphere was discriminant in the emotional processing of faces in comparison with neutral faces.
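    The event-related desynchronization (ERD) referred to above is conventionally quantified as the percent decrease of band power in a post-stimulus window relative to a reference window (Pfurtscheller & Lopes da Silva). A generic single-channel sketch — the band edges, windows, and synthetic signal below are illustrative assumptions, not the study's parameters:

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def erd_percent(signal, fs, band=(30.0, 45.0), ref=(0.0, 0.4), event=(0.65, 0.85)):
        """Event-related desynchronization in a frequency band: percent
        band-power decrease in an event window relative to a reference
        window. Positive values indicate ERD, negative values ERS."""
        nyq = fs / 2.0
        b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
        # Instantaneous band power via the analytic signal envelope
        power = np.abs(hilbert(filtfilt(b, a, signal))) ** 2
        t = np.arange(len(signal)) / fs
        p_ref = power[(t >= ref[0]) & (t < ref[1])].mean()
        p_event = power[(t >= event[0]) & (t < event[1])].mean()
        return 100.0 * (p_ref - p_event) / p_ref

    # Synthetic example: a 40 Hz oscillation whose amplitude halves after
    # t = 0.6 s, so band power in the event window drops to about 25%
    fs = 500
    t = np.arange(0, 1.0, 1.0 / fs)
    amp = np.where(t < 0.6, 1.0, 0.5)
    sig = amp * np.sin(2 * np.pi * 40 * t)
    erd = erd_percent(sig, fs)
    ```

    In practice ERD/ERS is computed per trial and averaged, but the power-ratio logic is the same as in this single-trial sketch.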

  12. Emotional face processing and flat affect in schizophrenia: functional and structural neural correlates.

    Science.gov (United States)

    Lepage, M; Sergerie, K; Benoit, A; Czechowska, Y; Dickie, E; Armony, J L

    2011-09-01

    There is a general consensus in the literature that schizophrenia causes difficulties with facial emotion perception and discrimination. Functional brain imaging studies have observed reduced limbic activity during facial emotion perception but few studies have examined the relation to flat affect severity. A total of 26 people with schizophrenia and 26 healthy controls took part in this event-related functional magnetic resonance imaging study. Sad, happy and neutral faces were presented in a pseudo-random order and participants indicated the gender of the face presented. Manual segmentation of the amygdala was performed on a structural T1 image. Both the schizophrenia group and the healthy control group rated the emotional valence of facial expressions similarly. Both groups exhibited increased brain activity during the perception of emotional faces relative to neutral ones in multiple brain regions, including multiple prefrontal regions bilaterally, the right amygdala, right cingulate cortex and cuneus. Group comparisons, however, revealed increased activity in the healthy group in the anterior cingulate, right parahippocampal gyrus and multiple visual areas. In schizophrenia, the severity of flat affect correlated significantly with neural activity in several brain areas including the amygdala and parahippocampal region bilaterally. These results suggest that many of the brain regions involved in emotional face perception, including the amygdala, are equally recruited in both schizophrenia and controls, but flat affect can also moderate activity in some other brain regions, notably in the left amygdala and parahippocampal gyrus bilaterally. There were no significant group differences in the volume of the amygdala.

  13. Neural correlates of top-down processing in emotion perception: an ERP study of emotional faces in white noise versus noise-alone stimuli.

    Science.gov (United States)

    Lee, Kyu-Yong; Lee, Tae-Ho; Yoon, So-Jeong; Cho, Yang Seok; Choi, June-Seek; Kim, Hyun Taek

    2010-06-14

    In the present study, we investigated the neural correlates underlying the perception of emotion in response to facial stimuli, in order to elucidate the extent to which emotional perception is affected by top-down processes. Subjects performed a forced, two-choice emotion discrimination task on ambiguous visual stimuli consisting of emotional faces embedded in different levels of visual white noise, including white noise-alone stimuli. ERP recordings and behavioral responses were analyzed according to the four response categories: hit, miss, false alarm, and correct rejection. We observed enlarged EPN and LPP amplitudes when subjects reported seeing fearful faces, and a typical emotional EPN response in the white noise-alone conditions even when fearful faces were not presented. These two ERP components, whose characteristic modulation reflects emotional processing, tracked the type of emotion each subject subjectively perceived. The results suggest that top-down modulations might be indispensable for emotional perception, which consists of two distinct stages of stimulus processing in the brain.

  14. Cortical deficits of emotional face processing in adults with ADHD: its relation to social cognition and executive function.

    Science.gov (United States)

    Ibáñez, Agustin; Petroni, Agustin; Urquina, Hugo; Torrente, Fernando; Torralva, Teresa; Hurtado, Esteban; Guex, Raphael; Blenkmann, Alejandro; Beltrachini, Leandro; Muravchik, Carlos; Baez, Sandra; Cetkovich, Marcelo; Sigman, Mariano; Lischinsky, Alicia; Manes, Facundo

    2011-01-01

    Although it has been shown that adults with attention-deficit hyperactivity disorder (ADHD) have impaired social cognition, no previous study has reported the brain correlates of face valence processing. This study looked for behavioral, neuropsychological, and electrophysiological markers of emotion processing for faces (N170) in adult ADHD compared to controls matched by age, gender, educational level, and handedness. We designed an event-related potential (ERP) study based on a dual valence task (DVT), in which faces and words were presented to test the effects of stimulus type (faces, words, or face-word stimuli) and valence (positive versus negative). Individual signatures of cognitive functioning in participants with ADHD and controls were assessed with a comprehensive neuropsychological evaluation, including executive functioning (EF) and theory of mind (ToM). Compared to controls, the adult ADHD group showed deficits in N170 emotion modulation for facial stimuli. These N170 impairments were observed in the absence of any deficit in facial structural processing, suggesting a specific ADHD impairment in early facial emotion modulation. Cortical current density mapping localized the main neural source of the N170 to the posterior section of the fusiform gyrus (maximal in the left hemisphere for words and in the right hemisphere for faces and simultaneous stimuli). Neural generators of the N170 (fusiform gyrus) were reduced in ADHD. In those patients, N170 emotion processing was associated with performance on an emotional inference ToM task, and the N170 from simultaneous stimuli was associated with EF, especially working memory. This is the first report to reveal an adult ADHD-specific impairment in the cortical modulation of emotion for faces and an association between N170 cortical measures and ToM and EF.

  15. The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects

    Science.gov (United States)

    Barber, Anjuli L. A.; Randi, Dania; Müller, Corsin A.; Huber, Ludwig

    2016-01-01

    Of all non-human animals, dogs are very likely the best decoders of human behavior. In addition to a high sensitivity to human attentive status and to ostensive cues, they are able to distinguish between individual human faces and even between human facial expressions. However, so far little is known about how they process human faces and to what extent this is influenced by experience. Here we present an eye-tracking study with dogs from two different living environments with varying experience with humans: pet and lab dogs. The dogs were shown pictures of familiar and unfamiliar human faces expressing four different emotions. The results, extracted from several different eye-tracking measurements, revealed pronounced differences in the face processing of pet and lab dogs, indicating an influence of the amount of exposure to humans. In addition, there was some evidence for influences of both the familiarity and the emotional expression of the face, and strong evidence for a left gaze bias. These findings, together with recent evidence for the dog's ability to discriminate human facial expressions, indicate that dogs are sensitive to some emotions expressed in human faces.

  16. Effects of Acute Alcohol Consumption on the Processing of Emotion in Faces: Implications for Understanding Alcohol-Related Aggression

    Science.gov (United States)

    Attwood, Angela S.; Munafò, Marcus R.

    2016-01-01

    The negative consequences of chronic alcohol abuse are well known, but heavy episodic consumption ("binge drinking") is also associated with significant personal and societal harms. Aggressive tendencies are increased after alcohol consumption, but the mechanisms underlying these changes are not fully understood. While effects on behavioural control are likely to be important, other effects may be involved given the widespread action of alcohol. Altered processing of social signals is associated with changes in social behaviours, including aggression, but until recently there has been little research investigating the effects of acute alcohol consumption on these outcomes. Recent work investigating the effects of acute alcohol on emotional face processing has suggested reduced sensitivity to submissive signals (sad faces) and an increased perceptual bias towards provocative signals (angry faces) after alcohol consumption, which may play a role in alcohol-related aggression. Here we discuss a putative mechanism that may explain how alcohol consumption influences emotional processing and subsequent aggressive responding, via disruption of orbitofrontal cortex (OFC)-amygdala connectivity. While the importance of emotional processing for social behaviours is well established, research into acute alcohol consumption and emotional processing is still in its infancy. Further research is needed, and we outline a research agenda to address gaps in the literature.

  17. Association of Irritability and Anxiety With the Neural Mechanisms of Implicit Face Emotion Processing in Youths With Psychopathology.

    Science.gov (United States)

    Stoddard, Joel; Tseng, Wan-Ling; Kim, Pilyoung; Chen, Gang; Yi, Jennifer; Donahue, Laura; Brotman, Melissa A; Towbin, Kenneth E; Pine, Daniel S; Leibenluft, Ellen

    2017-01-01

    Psychiatric comorbidity complicates clinical care and confounds efforts to elucidate the pathophysiology of commonly occurring symptoms in youths. To our knowledge, few studies have simultaneously assessed the effect of 2 continuously distributed traits on brain-behavior relationships in children with psychopathology. To determine shared and unique effects of 2 major dimensions of child psychopathology, irritability and anxiety, on neural responses to facial emotions during functional magnetic resonance imaging. Cross-sectional functional magnetic resonance imaging study in a large, well-characterized clinical sample at a research clinic at the National Institute of Mental Health. The referred sample included youths ages 8 to 17 years: 93 youths with anxiety, disruptive mood dysregulation, and/or attention-deficit/hyperactivity disorders, and 22 healthy youths. The child's irritability and anxiety were rated by both parent and child on the Affective Reactivity Index and the Screen for Child Anxiety Related Disorders, respectively. Using functional magnetic resonance imaging, neural response was measured across the brain during gender labeling of varying intensities of angry, happy, or fearful face emotions. In mixed-effects analyses, the shared and unique effects of irritability and anxiety were tested on amygdala functional connectivity and activation to face emotions. The mean (SD) age of participants was 13.2 (2.6) years; of the 115 included, 64 were male. Irritability and/or anxiety influenced amygdala connectivity to the prefrontal and temporal cortex. Specifically, irritability and anxiety jointly influenced left amygdala to left medial prefrontal cortex connectivity during face emotion viewing (F4,888 = 9.20), and they influenced differences in neural response to face emotions in several areas (F2,888 ≥ 13.45). These findings suggest emotion dysregulation when very anxious and irritable youth process threat-related faces; activation in the ventral visual circuitry suggests a mechanism…

  18. Steroids facing emotions

    NARCIS (Netherlands)

    Putman, P.L.J.

    2006-01-01

    The studies reported in this thesis were performed to gain a better understanding of the motivational mediators of selective attention and memory for emotionally relevant stimuli, and of the roles that some steroid hormones play in the regulation of human motivation and emotion.

  19. Face processing in chronic alcoholism: a specific deficit for emotional features.

    Science.gov (United States)

    Maurage, P; Campanella, S; Philippot, P; Martin, S; de Timary, P

    2008-04-01

    It is well established that chronic alcoholism is associated with a deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific to emotions or due to a more general impairment in visual or facial processing. This study was designed to clarify this issue using multiple control tasks and the subtraction method. Eighteen patients suffering from chronic alcoholism and 18 matched healthy control subjects were asked to perform several tasks evaluating (1) basic visuo-spatial and facial identity processing; (2) simple reaction times; and (3) identification of complex facial features (namely age, emotion, gender, and race). Accuracy and reaction times were recorded. Alcoholic patients had a preserved performance for visuo-spatial and facial identity processing, but their performance was impaired for visuo-motor abilities and for the detection of complex facial aspects. More importantly, the subtraction method showed that alcoholism is associated with a specific EFE decoding deficit, still present when visuo-motor slowing is controlled for. These results offer a post hoc confirmation of earlier data showing an EFE decoding deficit in alcoholism by strongly suggesting a specificity of this deficit for emotions. This may have implications for clinical situations, where emotional impairments are frequently observed among alcoholic subjects.

  20. Facing the Problem: Impaired Emotion Recognition During Multimodal Social Information Processing in Borderline Personality Disorder.

    Science.gov (United States)

    Niedtfeld, Inga; Defiebre, Nadine; Regenbogen, Christina; Mier, Daniela; Fenske, Sabrina; Kirsch, Peter; Lis, Stefanie; Schmahl, Christian

    2017-04-01

    Previous research has revealed alterations and deficits in facial emotion recognition in patients with borderline personality disorder (BPD). During interpersonal communication in daily life, social signals such as speech content, variation in prosody, and facial expression need to be considered simultaneously. We hypothesized that deficits in higher level integration of social stimuli contribute to difficulties in emotion recognition in BPD, and heightened arousal might explain this effect. Thirty-one patients with BPD and thirty-one healthy controls were asked to identify emotions in short video clips, which were designed to represent different combinations of the three communication channels: facial expression, speech content, and prosody. Skin conductance was recorded as a measure of sympathetic arousal, while controlling for state dissociation. Patients with BPD showed lower mean accuracy scores than healthy control subjects in all conditions comprising emotional facial expressions. This was true for the condition with facial expression only, and for the combination of all three communication channels. Electrodermal responses were enhanced in BPD only in response to auditory stimuli. In line with the major body of facial emotion recognition studies, we conclude that deficits in the interpretation of facial expressions lead to the difficulties observed in multimodal emotion processing in BPD.

  1. The effect of emotionally valenced eye region images on visuocortical processing of surprised faces.

    Science.gov (United States)

    Li, Shuaixia; Li, Ping; Wang, Wei; Zhu, Xiangru; Luo, Wenbo

    2018-05-01

    In this study, we presented pictorial representations of happy, neutral, and fearful expressions projected in the eye regions to determine whether the eye region alone is sufficient to produce a context effect. Participants were asked to judge the valence of surprised faces that had been preceded by a picture of an eye region. Behavioral results showed that affective ratings of surprised faces were context dependent. Prime-related ERPs showed that happy eyes elicited a larger P1 than neutral and fearful eyes, likely due to the recognition advantage provided by a happy expression. Target-related ERPs showed that surprised faces in the context of fearful and happy eyes elicited a dramatically larger C1 than those in the neutral context, reflecting modulation by predictions during the earliest stages of face processing. N170 amplitudes were larger in the neutral and fearful eye contexts than in the happy context, suggesting that faces were being integrated with contextual threat information. The P3 component exhibited enhanced brain activity in response to faces preceded by happy and fearful eyes compared with neutral eyes, indicating that motivated attention processing may be involved at this stage. Altogether, these results indicate for the first time that the influence of isolated eye regions on the perception of surprised faces involves preferential processing at the early stages and elaborate processing at the late stages. Moreover, higher cognitive processes such as prediction and attention can modulate face processing from the earliest stages in a top-down manner.

  2. Processing of masked and unmasked emotional faces under different attentional conditions: an electrophysiological investigation.

    Directory of Open Access Journals (Sweden)

    Marzia eDel Zotto

    2015-10-01

    In order to investigate the interactions between non-spatial selective attention, awareness, and emotion processing, we carried out an ERP study using a backward masking paradigm, in which angry, fearful, happy, and neutral facial expressions were presented while participants attempted to detect the presence of one or the other category of facial expressions in the different experimental blocks. ERP results showed that negative emotions enhanced an early N170 response over temporal-occipital leads in both masked and unmasked conditions, independently of selective attention. A later effect arising at the P2 was linked to awareness. Finally, selective attention was found to affect the N2 and N3 components over occipito-parietal leads. Our findings reveal that (i) the initial processing of facial expressions arises prior to attention and awareness; (ii) attention and awareness give rise to temporally distinct periods of activation independently of the type of emotion, with only a partial degree of overlap; and (iii) selective attention appears to be influenced by the emotional nature of the stimuli, which in turn impinges on unconscious processing at a very early stage. This study confirms previous reports that negative facial expressions can be processed rapidly, in the absence of visual awareness and independently of selective attention. On the other hand, attention and awareness may operate in a synergistic way, depending on task demands.

  3. Face Emotion Processing in Depressed Children and Adolescents with and without Comorbid Conduct Disorder

    Science.gov (United States)

    Schepman, Karen; Taylor, Eric; Collishaw, Stephan; Fombonne, Eric

    2012-01-01

    Studies of adults with depression point to characteristic neurocognitive deficits, including differences in processing facial expressions. Few studies have examined face processing in juvenile depression, or taken account of other comorbid disorders. Three groups were compared: depressed children and adolescents with conduct disorder (n = 23),…

  4. P2-27: Electrophysiological Correlates of Conscious and Unconscious Processing of Emotional Faces in Individuals with High and Low Autistic Traits

    Directory of Open Access Journals (Sweden)

    Svjetlana Vukusic

    2012-10-01

    Full Text Available LeDoux (1996, The Emotional Brain) has suggested that subconscious presentation of fearful emotional information is relayed to the amygdala along a rapid subcortical route. Rapid emotion processing is important because it alerts other parts of the brain to emotionally salient information. It also produces immediate reflexive responses to threatening stimuli, in contrast to slower conscious appraisal, which is of important adaptive survival value. Current theoretical models of autism spectrum disorders (ASD) have linked impairments in the processing of emotional information to amygdala dysfunction. It can be suggested that the impairment in face processing found in autism may be the result of impaired rapid subconscious processing of emotional information, which does not make faces socially salient. Previous studies examined subconscious processing of emotional stimuli with backward masking paradigms, using very brief presentations of emotional face stimuli followed by a mask. We used an event-related potential (ERP) study within a backward masking paradigm with subjects with low and high autistic tendencies as measured by the Autism Spectrum Quotient (AQ) questionnaire. The time course of processing of fearful and happy facial expressions and an emotionally neutral face was investigated during subliminal (16 ms) and supraliminal (166 ms) stimulus presentation. The task consisted of an explicit categorization of emotional and neutral faces. We looked at ERP components N2, P3a, and also N170 for differences between subjects with low (<19) and high (>19) AQ.

  5. Priming the Secure Attachment Schema Affects the Emotional Face Processing Bias in Attachment Anxiety: An fMRI Research

    Directory of Open Access Journals (Sweden)

    Xu Chen

    2017-04-01

    Full Text Available Our study explored how priming with a secure base schema affects the processing of emotional facial stimuli in individuals with attachment anxiety. We enrolled 42 undergraduate students between 18 and 27 years of age, and divided them into two groups: attachment anxiety and attachment secure. All participants were primed under two conditions: secure priming, using references to the partner, and neutral priming, using neutral references. We performed repeated attachment security priming combined with a dual-task paradigm and functional magnetic resonance imaging. Participants’ reaction times in responding to the facial stimuli were also measured. Attachment security priming can facilitate an individual’s processing of positive emotional faces; for instance, the presentation of the partner’s name was associated with stronger activity in a wide range of brain regions and faster reaction times for positive facial expressions in the subjects. The current finding of higher activity in left-hemisphere regions for secure priming rather than neutral priming is consistent with the prediction that attachment security priming triggers the spread of activation of a positive emotional state. However, the difference in brain activity during processing of both positive and negative emotional facial stimuli between the two priming conditions appeared in the attachment anxiety group alone. This study indicates that the effect of attachment secure priming on the processing of emotional facial stimuli could be mediated by chronic attachment anxiety. In addition, it highlights the association between higher-order processes of the attachment system (secure attachment schema priming) and the early-stage information processing system (attention), given the increased attention toward the effects of the secure base schema on the processing of emotion- and attachment-related information among the insecure population. Thus, the following study has…

  6. Detecting and categorizing fleeting emotions in faces.

    Science.gov (United States)

    Sweeny, Timothy D; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A

    2013-02-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d' analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PsycINFO Database Record (c) 2013 APA, all rights reserved.
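    The d′ analysis mentioned in this record compares hit and false-alarm rates on the z-scale of the standard normal distribution. A minimal sketch in Python, using only the standard library; the rates below are hypothetical illustrations, not the study's data:

    ```python
    from statistics import NormalDist

    def d_prime(hit_rate: float, fa_rate: float) -> float:
        """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate).
        Rates must lie strictly between 0 and 1 (extreme rates are usually
        corrected, e.g. with a log-linear adjustment, before computing d')."""
        z = NormalDist().inv_cdf  # inverse CDF of the standard normal
        return z(hit_rate) - z(fa_rate)

    # Hypothetical rates for one categorization block; d' = 0 marks chance.
    print(round(d_prime(0.85, 0.20), 2))
    ```

    A d′ near zero for fearful-versus-angry blocks, alongside clearly positive values for angry-versus-happy blocks, is the pattern of "consistently poor" categorization the abstract describes.
    
    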

  7. Detecting and Categorizing Fleeting Emotions in Faces

    Science.gov (United States)

    Sweeny, Timothy D.; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A.

    2013-01-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d′ analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PMID:22866885

  8. A dimensional approach to determine common and specific neurofunctional markers for depression and social anxiety during emotional face processing.

    Science.gov (United States)

    Luo, Lizhu; Becker, Benjamin; Zheng, Xiaoxiao; Zhao, Zhiying; Xu, Xiaolei; Zhou, Feng; Wang, Jiaojian; Kou, Juan; Dai, Jing; Kendrick, Keith M

    2018-02-01

    Major depression disorder (MDD) and anxiety disorder are both prevalent and debilitating. High rates of comorbidity between MDD and social anxiety disorder (SAD) suggest common pathological pathways, including aberrant neural processing of interpersonal signals. In patient populations, the determination of common and distinct neurofunctional markers of MDD and SAD is often hampered by confounding factors, such as generally elevated anxiety levels and disorder-specific brain structural alterations. This study employed a dimensional disorder approach to map neurofunctional markers associated with levels of depression and social anxiety symptoms in a cohort of 91 healthy subjects using an emotional face processing paradigm. Examining linear associations between levels of depression and social anxiety, while controlling for trait anxiety revealed that both were associated with exaggerated dorsal striatal reactivity to fearful and sad expression faces respectively. Exploratory analysis revealed that depression scores were positively correlated with dorsal striatal functional connectivity during processing of fearful faces, whereas those of social anxiety showed a negative association during processing of sad faces. No linear relationships between levels of depression and social anxiety were observed during a facial-identity matching task or with brain structure. Together, the present findings indicate that dorsal striatal neurofunctional alterations might underlie aberrant interpersonal processing associated with both increased levels of depression and social anxiety. © 2017 Wiley Periodicals, Inc.

  9. Serotonergic modulation of face-emotion recognition

    Directory of Open Access Journals (Sweden)

    C.M. Del-Ben

    2008-04-01

    Full Text Available Facial expressions of basic emotions have been widely used to investigate the neural substrates of emotion processing, but little is known about the exact meaning of subjective changes provoked by perceiving facial expressions. Our assumption was that fearful faces would be related to the processing of potential threats, whereas angry faces would be related to the processing of proximal threats. Experimental studies have suggested that serotonin modulates the brain processes underlying defensive responses to environmental threats, facilitating risk assessment behavior elicited by potential threats and inhibiting fight or flight responses to proximal threats. In order to test these predictions about the relationship between fearful and angry faces and defensive behaviors, we carried out a review of the literature about the effects of pharmacological probes that affect 5-HT-mediated neurotransmission on the perception of emotional faces. The hypothesis that angry faces would be processed as a proximal threat and that, as a consequence, their recognition would be impaired by an increase in 5-HT function was not supported by the results reviewed. In contrast, most of the studies that evaluated the behavioral effects of serotonin challenges showed that increased 5-HT neurotransmission facilitates the recognition of fearful faces, whereas its decrease impairs the same performance. These results agree with the hypothesis that fearful faces are processed as potential threats and that 5-HT enhances this brain processing.

  10. A note on age differences in mood-congruent versus mood-incongruent emotion processing in faces

    Directory of Open Access Journals (Sweden)

    Manuel C. Voelkle

    2014-06-01

    Full Text Available This article addresses four interrelated research questions: (1) Does experienced mood affect emotion perception in faces, and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20–31 years; middle-aged: 44–55 years; older adults: 70–81 years) were asked to provide multidimensional emotion ratings of a total of 1,026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle, Ebner, Lindenberger, & Riediger, 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when being in a positive mood and less likely to do so when being in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  11. A note on age differences in mood-congruent vs. mood-incongruent emotion processing in faces.

    Science.gov (United States)

    Voelkle, Manuel C; Ebner, Natalie C; Lindenberger, Ulman; Riediger, Michaela

    2014-01-01

    (1) Does experienced mood affect emotion perception in faces and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20-31 years; middle-aged: 44-55 years; older adults: 70-81 years) were asked to provide multidimensional emotion ratings of a total of 1026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle et al., 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular older adults were more likely to perceive happiness in faces when being in a positive mood and less likely to do so when being in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  12. Fusiform gyrus dysfunction is associated with perceptual processing efficiency to emotional faces in adolescent depression: a model-based approach

    Directory of Open Access Journals (Sweden)

    Tiffany Cheing Ho

    2016-02-01

    Full Text Available While the extant literature has focused on major depressive disorder (MDD) as being characterized by abnormalities in processing affective stimuli (e.g., facial expressions), little is known regarding which specific aspects of cognition influence the evaluation of affective stimuli, and what the underlying neural correlates are. To investigate these issues, we assessed 26 adolescents diagnosed with MDD and 37 well-matched healthy controls (HCL) who completed an emotion identification task of dynamically morphing faces during functional magnetic resonance imaging (fMRI). We analyzed the behavioral data using a sequential sampling model of response time (RT) commonly used to elucidate aspects of cognition in binary perceptual decision-making tasks: the Linear Ballistic Accumulator (LBA) model. Using a hierarchical Bayesian estimation method, we obtained group-level and individual-level estimates of LBA parameters on the facial emotion identification task. While the MDD and HCL groups did not differ in mean RT, accuracy, or group-level estimates of perceptual processing efficiency (i.e., the drift rate parameter of the LBA), the MDD group showed significantly reduced responses in the left fusiform gyrus compared to the HCL group during the facial emotion identification task. Furthermore, within the MDD group, fMRI signal in the left fusiform gyrus during affective face processing was significantly associated with greater individual-level estimates of perceptual processing efficiency. Our results therefore suggest that affective processing biases in adolescents with MDD are characterized by greater perceptual processing efficiency of affective visual information in sensory brain regions responsible for the early processing of visual information. The theoretical, methodological, and clinical implications of our results are discussed.
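    The Linear Ballistic Accumulator used in this record models each binary decision as a race between independent linear evidence accumulators. A minimal forward simulation in Python; all parameter values (threshold b, start-point range A, drift-rate noise s, non-decision time t0, and the two drift rates) are illustrative assumptions, not the estimates from the study:

    ```python
    import random

    def lba_trial(drifts, b=1.0, A=0.5, s=0.3, t0=0.2, rng=random):
        """One LBA trial: each accumulator starts at Uniform(0, A), rises
        linearly at a rate drawn from Normal(v, s), and the first to reach
        threshold b determines the response. Returns (choice index, RT)."""
        best_choice, best_rt = None, float("inf")
        for i, v in enumerate(drifts):
            d = rng.gauss(v, s)
            if d <= 0:  # this accumulator never reaches threshold on this trial
                continue
            start = rng.uniform(0, A)
            rt = t0 + (b - start) / d
            if rt < best_rt:
                best_choice, best_rt = i, rt
        return best_choice, best_rt

    rng = random.Random(0)
    # Accumulator 0 has the higher mean drift rate, i.e. higher
    # perceptual processing efficiency for the correct response.
    trials = [lba_trial([1.2, 0.6], rng=rng) for _ in range(2000)]
    acc = sum(c == 0 for c, _ in trials) / len(trials)
    print(f"simulated accuracy for the high-drift response: {acc:.2f}")
    ```

    Raising the drift rate of the correct accumulator increases accuracy and shortens RTs, which is why the drift rate serves as the model's index of perceptual processing efficiency; the study's hierarchical Bayesian step estimates these parameters from data rather than simulating them.
    
    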

  13. Different underlying mechanisms for face emotion and gender processing during feature-selective attention: Evidence from event-related potential studies.

    Science.gov (United States)

    Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei

    2017-05-01

    Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed over different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature-selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically, as reflected in the SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Recognition of Face and Emotional Facial Expressions in Autism

    Directory of Open Access Journals (Sweden)

    Muhammed Tayyib Kadak

    2013-03-01

    Full Text Available Autism is a genetically transmitted neurodevelopmental disorder characterized by severe and permanent deficits in many areas of interpersonal relations, such as communication, social interaction and emotional responsiveness. Patients with autism have deficits in face recognition, eye contact and recognition of emotional expression. Both the recognition of faces and of emotional facial expressions depends on face processing. Structural and functional impairment in the fusiform gyrus, amygdala, superior temporal sulcus and other brain regions leads to deficits in the recognition of faces and facial emotion. Studies therefore suggest that face processing deficits result in problems in the areas of social interaction and emotion in autism. Studies have revealed that children with autism have problems in recognizing facial expressions and use the mouth region more than the eye region. It has also been shown that autistic patients interpret ambiguous expressions as negative emotions. In autism, deficits in various stages of face processing, such as detection of gaze, face identity and recognition of emotional expression, have been identified so far. Social interaction impairments in autistic spectrum disorders originate from face processing deficits during the periods of infancy, childhood and adolescence. Recognition of faces and of emotional facial expressions may be affected either automatically, by orienting towards faces after birth, or by “learning” processes in developmental periods such as identity and emotion processing. This article aimed to review the neurobiological basis of face processing and the recognition of emotional facial expressions during normal development and in autism.

  15. It Is Not Just in Faces! Processing of Emotion and Intention from Biological Motion in Psychiatric Disorders

    Directory of Open Access Journals (Sweden)

    Łukasz Okruszek

    2018-02-01

    Full Text Available Social neuroscience offers a wide range of techniques that may be applied to study the social cognitive deficits that may underlie reduced social functioning—a common feature across many psychiatric disorders. At the same time, a significant proportion of research in this area has been conducted using paradigms that utilize static displays of faces or eyes. The use of point-light displays (PLDs) offers a viable alternative for studying recognition of emotion or intention inference while minimizing the amount of information presented to participants. This mini-review aims to summarize studies that have used PLDs to study emotion and intention processing in schizophrenia (SCZ), affective disorders, anxiety and personality disorders, eating disorders and neurodegenerative disorders. Two main conclusions can be drawn from the reviewed studies: first, the social cognitive problems found in most of the psychiatric samples using PLDs were of smaller magnitude than those found in studies presenting social information using faces or voices. Second, even though the information presented in PLDs is extremely limited, presentation of these types of stimuli is sufficient to elicit the disorder-specific social cognitive biases (e.g., mood-congruent bias in depression, increased threat perception in anxious individuals, aberrant body size perception in eating disorders) documented using other methodologies. Taken together, these findings suggest that point-light stimuli may be a useful method of studying social information processing in psychiatry. At the same time, some limitations of using this methodology are also outlined.

  16. Emotional faces and the default mode network.

    Science.gov (United States)

    Sreenivas, S; Boehm, S G; Linden, D E J

    2012-01-11

    The default-mode network (DMN) of the human brain has become a central topic of cognitive neuroscience research. Although alterations in its resting state activity and in its recruitment during tasks have been reported for several mental and neurodegenerative disorders, its role in emotion processing has received relatively little attention. We investigated brain responses to different categories of emotional faces with functional magnetic resonance imaging (fMRI) and found deactivation in ventromedial prefrontal cortex (VMPFC), posterior cingulate gyrus (PC), and cuneus. This deactivation was modulated by emotional category and was less prominent for happy than for sad faces. These deactivated areas along the midline conformed to areas of the DMN. We also observed emotion-dependent deactivation of the left middle frontal gyrus, which is not a classical component of the DMN. Conversely, several areas in a fronto-parietal network commonly linked with attention were differentially activated by emotion categories. Functional connectivity patterns, as obtained by correlation of activation levels, also varied between emotions. VMPFC, PC or cuneus served as hubs between the DMN-type areas and the fronto-parietal network. These data support recent suggestions that the DMN is not a unitary system but differentiates according to task and even type of stimulus. The emotion-specific differential pattern of DMN deactivation may be explored further in patients with mood disorder, where the quest for biological markers of emotional biases is still ongoing. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  17. Modulation of the composite face effect by unintended emotion cues.

    Science.gov (United States)

    Gray, Katie L H; Murphy, Jennifer; Marsh, Jade E; Cook, Richard

    2017-04-01

    When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers 'fuse' the two halves together, perceptually. The illusory distortion induced by task-irrelevant ('distractor') halves hinders participants' judgements about task-relevant ('target') halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, observers frequently perceive emotion in ostensibly neutral faces, contrary to the intentions of experimenters. This study sought to determine whether this 'perceived emotion' influences the composite-face effect. In our first experiment, we confirmed that the composite effect grows stronger as the strength of distractor emotion increased. Critically, effects of distractor emotion were induced by weak emotion intensities, and were incidental insofar as emotion cues hindered image matching, not emotion labelling per se. In Experiment 2, we found a correlation between the presence of perceived emotion in a set of ostensibly neutral distractor regions sourced from commonly used face databases, and the strength of illusory distortion they induced. In Experiment 3, participants completed a sequential matching composite task in which half of the distractor regions were rated high and low for perceived emotion, respectively. Significantly stronger composite effects were induced by the high-emotion distractor halves. These convergent results suggest that perceived emotion increases the strength of the composite-face effect induced by supposedly emotionless faces. These findings have important implications for the study of holistic face processing in typical and atypical populations.

  18. Social Anxiety Under Load: The Effects of Perceptual Load in Processing Emotional Faces

    Directory of Open Access Journals (Sweden)

    Sandra Cristina Soares

    2015-04-01

    Full Text Available Previous studies in the social anxiety arena have shown an impaired attentional control system, similar to that found in trait anxiety. However, the effect of task demands on social anxiety in socially threatening stimuli, such as angry faces, remains unseen. In the present study, fifty-four university students scoring high and low in the Social Interaction and Performance Anxiety and Avoidance Scale (SIPAAS) questionnaire participated in a target letter discrimination task while task-irrelevant face stimuli (angry, disgust, happy, and neutral) were simultaneously presented. The results showed that high (compared to low) socially anxious individuals were more prone to distraction by task-irrelevant stimuli, particularly under high perceptual load conditions. More importantly, for such individuals, the accuracy proportions for angry faces significantly differed between the low and high perceptual load conditions, which is discussed in light of current evolutionary models of social anxiety.

  19. Social anxiety under load: the effects of perceptual load in processing emotional faces.

    Science.gov (United States)

    Soares, Sandra C; Rocha, Marta; Neiva, Tiago; Rodrigues, Paulo; Silva, Carlos F

    2015-01-01

    Previous studies in the social anxiety arena have shown an impaired attentional control system, similar to that found in trait anxiety. However, the effect of task demands on social anxiety in socially threatening stimuli, such as angry faces, remains unseen. In the present study, 54 university students scoring high and low in the Social Interaction and Performance Anxiety and Avoidance Scale (SIPAAS) questionnaire, participated in a target letter discrimination task while task-irrelevant face stimuli (angry, disgust, happy, and neutral) were simultaneously presented. The results showed that high (compared to low) socially anxious individuals were more prone to distraction by task-irrelevant stimuli, particularly under high perceptual load conditions. More importantly, for such individuals, the accuracy proportions for angry faces significantly differed between the low and high perceptual load conditions, which is discussed in light of current evolutionary models of social anxiety.

  20. Hemispheric contributions to the processing of emotion in chimeric faces : behavioural and electrophysiological evidence

    OpenAIRE

    Geiger, Anja

    2005-01-01

    The face in general and facial expressions in particular have always been a focus of sociological, biological and psychological interest, with numerous different scientific approaches and objectives, investigating the expression and perception of facial affect or the social interaction of transmitting facial information between poser and perceiver. This study, dealing with two different aspects of facial expression, namely intensity and efficiency of facial expression, focuses on hemispheric c...

  1. Social anxiety under load: the effects of perceptual load in processing emotional faces

    OpenAIRE

    Soares, Sandra C.; Rocha, Marta; Neiva, Tiago; Rodrigues, Paulo; Silva, Carlos F.

    2015-01-01

    Previous studies in the social anxiety arena have shown an impaired attentional control system, similar to that found in trait anxiety. However, the effect of task demands on social anxiety in socially threatening stimuli, such as angry faces, remains unseen. In the present study, 54 university students scoring high and low in the Social Interaction and Performance Anxiety and Avoidance Scale (SIPAAS) questionnaire, participated in a target letter discrimination task while task-irrelevant fac...

  2. Love withdrawal is related to heightened processing of faces with emotional expressions and incongruent emotional feedback : Evidence from ERPs

    NARCIS (Netherlands)

    Huffmeijer, Renske; Tops, Mattie; Alink, Lenneke R. A.; Bakermans-Kranenburg, Marian J.; van Ijzendoorn, Marinus H.

    Parental use of love withdrawal is thought to affect children's later psychological functioning because it creates a link between children's performance and relational consequences. To investigate whether love withdrawal is also associated with the underlying level of basic information processing in

  3. Shades of Emotion: What the Addition of Sunglasses or Masks to Faces Reveals about the Development of Facial Expression Processing

    Science.gov (United States)

    Roberson, Debi; Kikutani, Mariko; Doge, Paula; Whitaker, Lydia; Majid, Asifa

    2012-01-01

    Three studies investigated developmental changes in facial expression processing, between 3 years-of-age and adulthood. For adults and older children, the addition of sunglasses to upright faces caused an equivalent decrement in performance to face inversion. However, younger children showed "better" classification of expressions of faces wearing…

  4. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    OpenAIRE

    Lili Guan; Lili Guan; Yufang Zhao; Yige Wang; Yujie Chen; Juan Yang

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the ...

  5. State anxiety and emotional face recognition in healthy volunteers

    OpenAIRE

    Attwood, Angela S.; Easey, Kayleigh E.; Dalili, Michael N.; Skinner, Andrew L.; Woods, Andy; Crick, Lana; Ilett, Elizabeth; Penton-Voak, Ian S.; Munafò, Marcus R.

    2017-01-01

    High trait anxiety has been associated with detriments in emotional face processing. By contrast, relatively little is known about the effects of state anxiety on emotional face processing. We investigated the effects of state anxiety on recognition of emotional expressions (anger, sadness, surprise, disgust, fear and happiness) experimentally, using the 7.5% carbon dioxide (CO2) model to induce state anxiety, and in a large observational study. The experimental studies indicated reduced glob...

  6. 5-HTTLPR differentially predicts brain network responses to emotional faces

    DEFF Research Database (Denmark)

    Fisher, Patrick M; Grady, Cheryl L; Madsen, Martin K

    2015-01-01

    The effects of the 5-HTTLPR polymorphism on neural responses to emotionally salient faces have been studied extensively, focusing on amygdala reactivity and amygdala-prefrontal interactions. Despite compelling evidence that emotional face paradigms engage a distributed network of brain regions...... involved in emotion, cognitive and visual processing, less is known about 5-HTTLPR effects on broader network responses. To address this, we evaluated 5-HTTLPR differences in the whole-brain response to an emotional faces paradigm including neutral, angry and fearful faces using functional magnetic...... to fearful faces was significantly greater in S' carriers compared to LA LA individuals. These findings provide novel evidence for emotion-specific 5-HTTLPR effects on the response of a distributed set of brain regions including areas responsive to emotionally salient stimuli and critical components......

  7. Emotional Labor, Face and Guan xi

    Institute of Scientific and Technical Information of China (English)

    Tianwenling

    2017-01-01

    Emotional Labor, Face and Guan xi are all relevant to performance, appearance, and emotional feelings, which are essential elements in the workplace. In other words, not only front-line workers but all employees in an organization are faced with the three

  8. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    Directory of Open Access Journals (Sweden)

    Martin Wegrzyn

    Full Text Available Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing us to visualize the importance of different face areas for each expression. Overall, observers mostly relied on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.

  9. Mapping the emotional face. How individual face parts contribute to successful emotion recognition

    Science.gov (United States)

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing us to visualize the importance of different face areas for each expression. Overall, observers mostly relied on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921
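
    The tile-uncovering paradigm described above lends itself to a simple per-tile diagnostic map: compare recognition accuracy on trials where a given tile was uncovered at the moment of response against trials where it was still hidden. The sketch below is an illustrative reconstruction, not the authors' analysis code; the trial format and the difference-of-proportions score are assumptions.

```python
import numpy as np

def diagnostic_map(trials, n_tiles=48):
    """trials: list of (visible_tiles, correct) pairs, where visible_tiles
    is the set of tile indices uncovered when the observer answered
    (a hypothetical data format). Returns, per tile,
    P(correct | tile visible) - P(correct | tile hidden)."""
    vis_correct = np.zeros(n_tiles); vis_total = np.zeros(n_tiles)
    hid_correct = np.zeros(n_tiles); hid_total = np.zeros(n_tiles)
    for visible, correct in trials:
        for t in range(n_tiles):
            if t in visible:
                vis_total[t] += 1; vis_correct[t] += correct
            else:
                hid_total[t] += 1; hid_correct[t] += correct
    with np.errstate(invalid="ignore"):  # tiles never (in)visible give NaN
        return vis_correct / vis_total - hid_correct / hid_total

# Toy data: trials succeed whenever tile 0 is visible
trials = [({0, 1}, 1), ({0}, 1), ({1, 2}, 0), ({2}, 0)]
dv = diagnostic_map(trials, n_tiles=3)
print(dv)  # tile 0 has the highest diagnostic value, tile 2 the lowest
```

    Reshaping the 48 per-tile scores into the mask's grid would give the kind of importance map the study visualizes.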

  10. Method for Face-Emotion Retrieval Using A Cartoon Emotional Expression Approach

    Science.gov (United States)

    Kostov, Vlaho; Yanagisawa, Hideyoshi; Johansson, Martin; Fukuda, Shuichi

    A simple method for extracting emotion from a human face, as a form of non-verbal communication, was developed to cope with and optimize mobile communication in a globalized and diversified society. A cartoon-face-based model was developed and used to evaluate the emotional content of real faces. After a pilot survey, basic rules were defined and student subjects were asked to express emotion using the cartoon face. Their face samples were then analyzed using principal component analysis and the Mahalanobis distance method. Feature parameters considered to be related to emotions were extracted, and new cartoon faces (based on these parameters) were generated. The subjects evaluated the emotion of these cartoon faces again, and we confirmed that these parameters were suitable. To confirm how these parameters could be applied to real faces, we asked subjects to express the same emotions, which were then captured electronically. Simple image processing techniques were also developed to extract these features from real faces, and we then compared them with the cartoon face parameters. It is demonstrated via the cartoon face that emotions can be expressed with very small amounts of information; as a result, real and cartoon faces correspond to each other. It is also shown that emotion can be extracted from still and dynamic real face images using these cartoon-based features.
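
    The record above classifies facial feature parameters with principal component analysis and the Mahalanobis distance method. A minimal sketch of the distance-based classification step, assuming a shared covariance matrix; the two-dimensional feature space and all numbers are invented for illustration:

```python
import numpy as np

def mahalanobis_classify(sample, class_means, cov):
    """Assign a feature vector to the emotion class whose mean is
    nearest in (squared) Mahalanobis distance."""
    cov_inv = np.linalg.inv(cov)
    best_label, best_d = None, float("inf")
    for label, mu in class_means.items():
        diff = sample - mu
        d = float(diff @ cov_inv @ diff)  # squared Mahalanobis distance
        if d < best_d:
            best_label, best_d = label, d
    return best_label, best_d

# Hypothetical 2-D feature space (e.g. mouth curvature, eyebrow slope)
means = {"happy": np.array([1.0, 0.2]), "sad": np.array([-1.0, -0.3])}
cov = np.array([[0.5, 0.1],
                [0.1, 0.4]])
label, dist = mahalanobis_classify(np.array([0.9, 0.1]), means, cov)
print(label)  # happy
```

    In practice the features would first be projected onto the leading principal components before computing distances.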

  11. Emotional facial expressions reduce neural adaptation to face identity.

    Science.gov (United States)

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

    In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems sustained with emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record the evoked steady-state visual potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to the same when compared with different face identities. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.
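
    The record above reads ssVEP amplitudes at the 3 Hz presentation rate from the EEG. A common way to extract such an amplitude is to take the FFT of the epoch and read the bin at the driving frequency; the snippet below is a generic illustration on synthetic data, not the authors' pipeline (sampling rate and epoch length are arbitrary choices):

```python
import numpy as np

def ssvep_amplitude(signal, fs, target_hz):
    """Single-sided amplitude spectrum via FFT; return (frequency,
    amplitude) of the bin closest to the stimulation frequency."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) * 2.0 / n  # amplitude scaling
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    idx = int(np.argmin(np.abs(freqs - target_hz)))
    return freqs[idx], spectrum[idx]

# Synthetic 10-s epoch: a 3 Hz oscillation (amplitude 2) plus noise
fs = 250.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = 2.0 * np.sin(2 * np.pi * 3.0 * t) + 0.5 * rng.standard_normal(t.size)
f, amp = ssvep_amplitude(eeg, fs, 3.0)
print(f, amp)  # roughly 3.0 Hz and amplitude near 2
```

    Adaptation would then show up as a decline of this amplitude across repetitions of the same face identity.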

  12. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    Science.gov (United States)

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

    Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences.

  13. Alcoholism and dampened temporal limbic activation to emotional faces.

    Science.gov (United States)

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O'Reilly, Cara E; Howard, Julie A; Sawyer, Kayle; Harris, Gordon J

    2009-11-01

    Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be a part of the wide array of behavioral problems including disinhibition, concurring with previously documented interpersonal difficulties in this population. Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally or socially challenging situations.

  14. Seeing emotion with your ears: emotional prosody implicitly guides visual attention to faces.

    Directory of Open Access Journals (Sweden)

    Simon Rigoulot

    Full Text Available Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.

  15. Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Science.gov (United States)

    Rigoulot, Simon; Pell, Marc D.

    2012-01-01

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions. PMID:22303454
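
    The analysis above bins looks to each face by the three fixed time windows. The binning step can be sketched as follows; the tuple-based fixation format is an assumption for illustration, not the authors' data structure:

```python
# The study's three analysis windows, in milliseconds from array onset
WINDOWS = [(0, 1250), (1250, 2500), (2500, 5000)]

def looks_per_window(fixations):
    """fixations: list of (onset_ms, face_label) pairs (toy format).
    Returns one {face_label: count} dict per time window, counting
    fixations whose onset falls inside that window."""
    counts = [{} for _ in WINDOWS]
    for onset, face in fixations:
        for i, (lo, hi) in enumerate(WINDOWS):
            if lo <= onset < hi:
                counts[i][face] = counts[i].get(face, 0) + 1
    return counts

demo = [(100, "fear"), (900, "happy"), (1300, "fear"), (3000, "fear")]
print(looks_per_window(demo))
# -> [{'fear': 1, 'happy': 1}, {'fear': 1}, {'fear': 1}]
```

    Durations would be accumulated the same way, summing fixation lengths instead of incrementing counts.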

  16. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.

    Science.gov (United States)

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 were more affected by the emotional value of the music administered in the emotional go/no-go task, and that this bias was also apparent in responses to the non-target emotional face. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.

  17. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias

    Directory of Open Access Journals (Sweden)

    Sara Invitto

    2017-08-01

    Full Text Available Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 were more affected by the emotional value of the music administered in the emotional go/no-go task, and that this bias was also apparent in responses to the non-target emotional face. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.

  18. Matching faces with emotional expressions

    Directory of Open Access Journals (Sweden)

    Wenfeng eChen

    2011-08-01

    Full Text Available There is some evidence that faces with a happy expression are recognized better than faces with other expressions. However, little is known about whether this happy face advantage also applies to perceptual face matching, and whether similar differences exist among other expressions. Using a sequential matching paradigm, we systematically compared the effects of seven basic facial expressions on identity recognition. Identity matching was quickest when a pair of faces had an identical happy/sad/neutral expression, poorer when they had a fearful/surprise/angry expression, and poorest when they had a disgust expression. Faces with a happy/sad/fear/surprise expression were matched faster than those with an anger/disgust expression when the second face in a pair had a neutral expression. These results demonstrate that effects of facial expression on identity recognition are not limited to happy faces when a learned face is immediately tested. The results suggest different influences of expression in perceptual matching and long-term recognition memory.

  19. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing.

    Science.gov (United States)

    Balconi, Michela; Canavesio, Ylenia

    2016-01-01

    The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We propose that empathy influences both cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.

  20. The contribution of emotional empathy to approachability judgements assigned to emotional faces is context specific

    Directory of Open Access Journals (Sweden)

    Megan L Willis

    2015-08-01

    Full Text Available Previous research on approachability judgements has indicated that facial expressions modulate how these judgements are made, but the relationship between emotional empathy and context in this decision-making process has not yet been examined. This study examined the contribution of emotional empathy to approachability judgements assigned to emotional faces in different contexts. One hundred and twenty female participants completed the Questionnaire Measure of Emotional Empathy. Participants provided approachability judgements to faces displaying angry, disgusted, fearful, happy, neutral and sad expressions, in three different contexts – when evaluating whether they would approach another individual to: (1) receive help; (2) give help; or (3) when no contextual information was provided. In addition, participants were also required to rate perceived threat and emotional intensity, and to label facial expressions. Emotional empathy significantly predicted approachability ratings for specific emotions in each context, over and above the contribution of perceived threat and intensity, which were associated with emotional empathy. Higher emotional empathy predicted less willingness to approach people with angry and disgusted faces to receive help, and a greater willingness to approach people with happy faces to receive help. Higher emotional empathy also predicted a greater willingness to approach people with sad faces to offer help, and more willingness to approach people with happy faces when no contextual information was provided. These results highlight the important contribution of individual differences in emotional empathy in predicting how approachability judgements are assigned to facial expressions in context.

  1. Task-irrelevant emotion facilitates face discrimination learning.

    Science.gov (United States)

    Lorenzino, Martina; Caudek, Corrado

    2015-03-01

    We understand poorly how the ability to discriminate faces from one another is shaped by visual experience. The purpose of the present study is to determine whether face discrimination learning can be facilitated by facial emotions. To answer this question, we used a task-irrelevant perceptual learning paradigm because it closely mimics the learning processes that, in daily life, occur without a conscious intention to learn and without an attentional focus on specific facial features. We measured face discrimination thresholds before and after training. During the training phase (4 days), participants performed a contrast discrimination task on face images. They were not informed that we introduced (task-irrelevant) subtle variations in the face images from trial to trial. For the Identity group, the task-irrelevant features were variations along a morphing continuum of facial identity. For the Emotion group, the task-irrelevant features were variations along an emotional expression morphing continuum. The Control group did not undergo contrast discrimination learning and only performed the pre-training and post-training tests, with the same temporal gap between them as the other two groups. Results indicate that face discrimination improved, but only for the Emotion group. Participants in the Emotion group, moreover, showed face discrimination improvements also for stimulus variations along the facial identity dimension, even if these (task-irrelevant) stimulus features had not been presented during training. The present results highlight the importance of emotions for face discrimination learning.

  2. Men appear more lateralized when noticing emotion in male faces.

    Science.gov (United States)

    Rahman, Qazi; Anchassi, Tarek

    2012-02-01

    Empirical tests of the "right hemisphere dominance" versus "valence" theories of emotion processing are confounded by known sex differences in lateralization. Moreover, information about the sex of the person posing an emotion might be processed differently by men and women because of an adaptive male bias to notice expressions of threat and vigilance in other male faces. The purpose of this study was to investigate whether sex of poser and emotion displayed influenced lateralization in men and women by analyzing "laterality quotient" scores on a test which depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression. We found that men (N = 50) were significantly more lateralized for emotions indicative of vigilance and threat (happy, sad, angry, and surprised) in male faces relative to female faces and compared to women (N = 44). These data indicate that sex differences in functional cerebral lateralization for facial emotion may be specific to the emotion presented and the sex of face presenting it.
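
    The record above analyses "laterality quotient" scores from a chimeric-faces test. One common laterality index (an assumption here; the test's exact scoring convention may differ) is the normalized difference between left- and right-biased choices:

```python
def laterality_quotient(left_choices, right_choices):
    """Normalized laterality index: (L - R) / (L + R), ranging from
    -1 (fully right-biased) to +1 (fully left-biased). Illustrative
    convention only, not necessarily the one used in the study."""
    total = left_choices + right_choices
    if total == 0:
        raise ValueError("no responses recorded")
    return (left_choices - right_choices) / total

print(laterality_quotient(30, 10))  # 0.5: a leftward bias
```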

  3. Your emotion or mine: Labeling feelings alters emotional face perception- An ERP study on automatic and intentional affect labeling

    Directory of Open Access Journals (Sweden)

    Cornelia eHerbert

    2013-07-01

    Full Text Available Empirical evidence suggests that words are powerful regulators of emotion processing. Although a number of studies have used words as contextual cues for emotion processing, the role of what is being labeled by the words (i.e. one’s own emotion as compared to the emotion expressed by the sender is poorly understood. The present study reports results from two experiments which used ERP methodology to evaluate the impact of emotional faces and self- versus sender-related emotional pronoun-noun pairs (e.g. my fear vs. his fear as cues for emotional face processing. The influence of self- and sender-related cues on the processing of fearful, angry and happy faces was investigated in two contexts: an automatic (experiment 1 and intentional affect labeling task (experiment 2, along with control conditions of passive face processing. ERP patterns varied as a function of the label’s reference (self vs. sender and the intentionality of the labelling task (experiment 1 vs. experiment 2. In experiment 1, self-related labels increased the motivational relevance of the emotional faces in the time-window of the EPN component. Processing of sender-related labels improved emotion recognition specifically for fearful faces in the N170 time-window. Spontaneous processing of affective labels modulated later stages of face processing as well. Amplitudes of the late positive potential (LPP were reduced for fearful, happy, and angry faces relative to the control condition of passive viewing. During intentional regulation (experiment 2 amplitudes of the LPP were enhanced for emotional faces when subjects used the self-related emotion labels to label their own emotion during face processing, and they rated the faces as higher in arousal than the emotional faces that had been presented in the label sender’s emotion condition or the passive viewing condition. The present results argue in favor of a differentiated view of language-as-context for emotion processing.

  4. State-dependent alteration in face emotion recognition in depression.

    Science.gov (United States)

    Anderson, Ian M; Shippen, Clare; Juhasz, Gabriella; Chase, Diana; Thomas, Emma; Downey, Darragh; Toth, Zoltan G; Lloyd-Williams, Kathryn; Elliott, Rebecca; Deakin, J F William

    2011-04-01

    Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data after receiving a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.

  5. Differential emotion attribution to neutral faces of own and other races.

    Science.gov (United States)

    Hu, Chao S; Wang, Qiandong; Han, Tong; Weare, Ethan; Fu, Genyue

    2017-02-01

    Past research has demonstrated differential recognition of emotion on faces of different races. This paper reports the first study to explore differential emotion attribution to neutral faces of different races. Chinese and Caucasian adults viewed a series of Chinese and Caucasian neutral faces and judged their outward facial expression: neutral, positive, or negative. The results showed that both Chinese and Caucasian viewers perceived more Chinese faces than Caucasian faces as neutral. Nevertheless, Chinese viewers attributed positive emotion to Caucasian faces more than to Chinese faces, whereas Caucasian viewers attributed negative emotion to Caucasian faces more than to Chinese faces. Moreover, Chinese viewers attributed negative and neutral emotion to the faces of both races with no significant difference in frequency, whereas Caucasian viewers mostly attributed neutral emotion to the faces. These differences between Chinese and Caucasian viewers may be due to differences in visual experience, culture, racial stereotypes, or expectations about the experiment. We also used eye tracking among the Chinese participants to explore the relationship between face-processing strategy and emotion attribution to neutral faces. The results showed a significant interaction between emotion attribution and face race on face-processing strategy, as indexed by the fixation proportion on the eyes and saccade amplitude. Additionally, pupil size was larger while processing Caucasian faces than while processing Chinese faces.

  6. How stable is activation in the amygdala and prefrontal cortex in adolescence? A study of emotional face processing across three measurements

    NARCIS (Netherlands)

    van den Bulk, B.G.; Koolschijn, P.C.M.P.; Meens, P.H.F.; van Lang, N.D.J.; van der Wee, N.J.A.; Rombouts, S.A.R.B.; Vermeiren, R.R.J.M.; Crone, E.A.

    2013-01-01

    Prior developmental functional magnetic resonance imaging (fMRI) studies have demonstrated elevated activation patterns in the amygdala and prefrontal cortex (PFC) in response to viewing emotional faces. As adolescence is a time of substantial variability in mood and emotional responsiveness, the

  7. One Size Does Not Fit All: Face Emotion Processing Impairments in Semantic Dementia, Behavioural-Variant Frontotemporal Dementia and Alzheimer's Disease Are Mediated by Distinct Cognitive Deficits

    OpenAIRE

    Miller, Laurie A.; Hsieh, Sharpley; Lah, Suncica; Savage, Sharon; Hodges, John R.; Piguet, Olivier

    2011-01-01

    Patients with frontotemporal dementia (both behavioural variant [bvFTD] and semantic dementia [SD]) as well as those with Alzheimer's disease (AD) show deficits on tests of face emotion processing, yet the mechanisms underlying these deficits have rarely been explored. We compared groups of patients with bvFTD (n = 17), SD (n = 12) or AD (n = 20) to an age- and education-matched group of healthy control subjects (n = 36) on three face emotion processing tasks (Ekman 60, Emotion Matching and E...

  8. Enhanced amygdala reactivity to emotional faces in adults reporting childhood emotional maltreatment

    Science.gov (United States)

    van Tol, Marie-José; Demenescu, Liliana R.; van der Wee, Nic J. A.; Veltman, Dick J.; Aleman, André; van Buchem, Mark A.; Spinhoven, Philip; Penninx, Brenda W. J. H.; Elzinga, Bernet M.

    2013-01-01

    In the context of chronic childhood emotional maltreatment (CEM; emotional abuse and/or neglect), adequately responding to facial expressions is an important skill. Over time, however, this adaptive response may lead to a persistent vigilance for emotional facial expressions. The amygdala and the medial prefrontal cortex (mPFC) are key regions in face processing. However, the neurobiological correlates of face processing in adults reporting CEM are yet unknown. We examined amygdala and mPFC reactivity to emotional faces (Angry, Fearful, Sad, Happy, Neutral) vs scrambled faces in healthy controls and unmedicated patients with depression and/or anxiety disorders reporting CEM before the age of 16 years (n = 60), and controls and patients who report no childhood abuse (n = 75). We found that CEM was associated with enhanced bilateral amygdala reactivity to emotional faces in general, and independent of psychiatric status. Furthermore, we found no support for differential mPFC functioning, suggesting that amygdala hyper-responsivity to emotional facial perception in adults reporting CEM may be independent from top–down influences of the mPFC. These findings may be key in understanding the increased emotional sensitivity and interpersonal difficulties that have been reported in individuals with a history of CEM. PMID:22258799

  9. Computer-Assisted Face Processing Instruction Improves Emotion Recognition, Mentalizing, and Social Skills in Students with ASD

    Science.gov (United States)

    Rice, Linda Marie; Wall, Carla Anne; Fogel, Adam; Shic, Frederick

    2015-01-01

    This study examined the extent to which a computer-based social skills intervention called "FaceSay"™ was associated with improvements in affect recognition, mentalizing, and social skills of school-aged children with Autism Spectrum Disorder (ASD). "FaceSay"™ offers students simulated practice with eye gaze, joint attention,…

  10. Digitizing the moving face: asymmetries of emotion and gender

    Directory of Open Access Journals (Sweden)

    Ashish Desai

    2009-04-01

    movement under voluntary conditions. In males, movement asymmetries favoring the lower left side of the face occurred for most emotional expressions. For females, all emotions were symmetric over the lower face. Our findings with computer digitizing techniques support the hypothesis that there are gender differences in facial movement asymmetries during the expression of emotion. They further underscore the view that emotional processing may represent a more widely distributed system throughout the brain in women than in men, corresponding to previous reports that language processes are also less lateralized in women.

  11. Categorical Perception of emotional faces is not affected by aging

    Directory of Open Access Journals (Sweden)

    Mandy Rossignol

    2009-11-01

    Full Text Available Effects of normal aging on categorical perception (CP) of facial emotional expressions were investigated. One hundred healthy participants (20 to 70 years old; five age groups) had to identify morphed expressions ranging from neutrality to happiness, sadness and fear. We analysed percentages and latencies of correct recognition for non-morphed emotional expressions, percentages and latencies of emotional recognition for morphed faces, the locus of the boundaries along the different continua, and the number of intrusions. The results showed that unmorphed happy and fearful faces were better processed than unmorphed sad and neutral faces. For morphed faces, CP was confirmed, as latencies increased as a function of the distance between the displayed morph and the original unmorphed photograph. The locus of categorical boundaries was not affected by age. Aging did not alter the accuracy of recognition for original pictures, nor the emotional recognition of morphed faces or the rate of intrusions. However, response latencies increased with age for both unmorphed and morphed pictures. In conclusion, CP of facial expressions appears to be spared in aging.
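    The boundary-locus analysis this record describes — identification responses pooled along each morph continuum, with a category boundary located where responses cross 50% — can be sketched as follows. This is a hypothetical illustration, not the study's actual analysis code: the function name, morph levels, and simple grid-search logistic fit are all assumptions.

```python
import numpy as np

def fit_boundary(morph_levels, p_target):
    """Locate the category boundary on a morph continuum.

    Fits a logistic curve p = 1 / (1 + exp(-k * (x - x0))) to the proportion
    of 'target emotion' identifications by brute-force grid search, and
    returns x0: the morph level at which responses cross 50%, i.e. the
    categorical boundary.
    """
    x = np.asarray(morph_levels, dtype=float)
    p = np.asarray(p_target, dtype=float)
    best_sse, best_x0 = np.inf, None
    # Search candidate boundaries across the continuum and slopes k.
    for x0 in np.linspace(x.min(), x.max(), 201):
        for k in np.linspace(0.05, 2.0, 40):
            pred = 1.0 / (1.0 + np.exp(-k * (x - x0)))
            sse = np.sum((p - pred) ** 2)
            if sse < best_sse:
                best_sse, best_x0 = sse, x0
    return best_x0
```

    Comparing the fitted `x0` across age groups would then test whether the boundary locus shifts with age, as the study did.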

  12. Similar representations of emotions across faces and voices.

    Science.gov (United States)

    Kuhn, Lisa Katharina; Wydell, Taeko; Lavan, Nadine; McGettigan, Carolyn; Garrido, Lúcia

    2017-09-01

    [Correction Notice: An Erratum for this article was reported in Vol 17(6) of Emotion (see record 2017-18585-001). In the article, the copyright attribution was incorrectly listed and the Creative Commons CC-BY license disclaimer was incorrectly omitted from the author note. The correct copyright is "© 2017 The Author(s)" and the omitted disclaimer is below. All versions of this article have been corrected. "This article has been published under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Copyright for this article is retained by the author(s). Author(s) grant(s) the American Psychological Association the exclusive right to publish the article and identify itself as the original publisher."] Emotions are a vital component of social communication, carried across a range of modalities and via different perceptual signals such as specific muscle contractions in the face and in the upper respiratory system. Previous studies have found that emotion recognition impairments after brain damage depend on the modality of presentation: recognition from faces may be impaired whereas recognition from voices remains preserved, and vice versa. On the other hand, there is also evidence for shared neural activation during emotion processing in both modalities. In a behavioral study, we investigated whether there are shared representations in the recognition of emotions from faces and voices. We used a within-subjects design in which participants rated the intensity of facial expressions and nonverbal vocalizations for each of the 6 basic emotion labels. For each participant and each modality, we then computed a representation matrix with the intensity ratings of each emotion. These matrices allowed us to examine the patterns of confusions between emotions and to characterize the representations
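    The representation-matrix analysis this record describes — per-modality matrices of intensity ratings whose confusion patterns can then be compared across faces and voices — can be sketched as follows. This is a minimal illustrative sketch under assumed data structures (the function names, emotion list, and correlation-based comparison are assumptions, not the authors' code).

```python
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def representation_matrix(ratings):
    """Average intensity ratings into a (stimulus emotion x rated label) matrix.

    `ratings` maps (stimulus_emotion, rated_label) -> list of intensity
    ratings; off-diagonal cells capture confusions between emotions.
    """
    m = np.zeros((len(EMOTIONS), len(EMOTIONS)))
    for i, stim in enumerate(EMOTIONS):
        for j, label in enumerate(EMOTIONS):
            m[i, j] = np.mean(ratings[(stim, label)])
    return m

def modality_similarity(face_matrix, voice_matrix):
    """Pearson correlation between the two flattened rating matrices."""
    f, v = face_matrix.ravel(), voice_matrix.ravel()
    return np.corrcoef(f, v)[0, 1]
```

    A high correlation between a participant's face and voice matrices would indicate shared emotion representations across the two modalities.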

  13. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    Science.gov (United States)

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

    The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter was assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    Science.gov (United States)

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top+happy bottom), incongruent composite configurations (e.g., angry top+happy bottom), and isolated features (e.g., happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part but ignored information in the other non-target part. Results indicate that the recognition of happy and anger expressions is neither strictly holistic nor analytic. Both routes were involved, but with a different role for analytic and holistic information depending on the emotion type, and different weights of local features between happy and anger expressions. Dissociable neural pathways were engaged depending on emotional face configurations. In particular, regions within the face processing network differed in their sensitivity to holistic expression information, which predominantly activated fusiform, inferior occipital areas and amygdala when internal features were congruent (i.e. template matching), whereas more local analysis of independent features preferentially engaged STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and pulvinar when seen in isolated parts. Collectively, these findings suggest that facial emotion recognition recruits separate, but interactive dorsal and ventral routes within the face processing networks, whose engagement may be shaped by

  15. Emotional faces influence evaluation of natural and transformed food.

    Science.gov (United States)

    Manippa, Valerio; Padulo, Caterina; Brancucci, Alfredo

    2018-07-01

    Previous evidence showed a direct relationship between feeding behavior and emotions. Despite that, no studies have focused on the influence of emotional faces on food processing. In our study, participants were presented with 72 pairs of visual stimuli composed of a neutral, happy, or disgusted face (5000 ms duration in Experiment 1, adaptation; 150 ms in Experiment 2, priming) followed by a food stimulus (1500 ms). Food stimuli were grouped into pleasant foods, further divided into natural and transformed, and unpleasant rotten foods. The task consisted in judging the food valence (as 'pleasant' or 'unpleasant') by keypress. Results showed a different pattern of response based on the transformation level of the food. In general, the evaluation of natural foods was more rapid than that of transformed foods, possibly owing to their simplicity and their perception as healthier. In addition, transformed foods yielded responses incongruent with the preceding emotional face, whereas natural foods yielded responses congruent with it. These effects were independent of the duration of the emotional face (i.e., adaptation or priming paradigm) and may depend on the salience of pleasant food stimuli.

  16. Vicarious Social Touch Biases Gazing at Faces and Facial Emotions.

    Science.gov (United States)

    Schirmer, Annett; Ng, Tabitha; Ebstein, Richard P

    2018-02-01

    Research has suggested that interpersonal touch promotes social processing and other-concern, and that women may respond to it more sensitively than men. In this study, we asked whether this phenomenon would extend to third-party observers who experience touch vicariously. In an eye-tracking experiment, participants (N = 64, 32 men and 32 women) viewed prime and target images with the intention of remembering them. Primes comprised line drawings of dyadic interactions with and without touch. Targets comprised two faces shown side-by-side, with one being neutral and the other being happy or sad. Analysis of prime fixations revealed that faces in touch interactions attracted longer gazing than faces in no-touch interactions. In addition, touch enhanced gazing at the area of touch in women but not men. Analysis of target fixations revealed that touch priming increased looking at both faces immediately after target onset, and subsequently, at the emotional face in the pair. Sex differences in target processing were nonsignificant. Together, the present results imply that vicarious touch biases visual attention to faces and promotes emotion sensitivity. In addition, they suggest that, compared with men, women are more aware of tactile exchanges in their environment. As such, vicarious touch appears to share important qualities with actual physical touch. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  17. Emotion perception accuracy and bias in face-to-face versus cyberbullying.

    Science.gov (United States)

    Ciucci, Enrica; Baroncelli, Andrea; Nowicki, Stephen

    2014-01-01

    The authors investigated the association of traditional and cyber forms of bullying and victimization with emotion perception accuracy and emotion perception bias. Four basic emotions were considered (i.e., happiness, sadness, anger, and fear); 526 middle school students (280 females; M age = 12.58 years, SD = 1.16 years) were recruited, and emotionality was controlled. Results indicated no significant findings for girls. Boys with higher levels of traditional bullying did not show any deficit in perception accuracy of emotions, but they were prone to identify happiness and fear in faces when a different emotion was expressed; in addition, male cyberbullying was related to greater accuracy in recognizing fear. In terms of the victims, cyber victims had a global problem in recognizing emotions and a specific problem in processing anger and fear. It was concluded that emotion perception accuracy and bias were associated with bullying and victimization for boys not only in traditional settings but also in the electronic ones. Implications of these findings for possible intervention are discussed.

  18. Emotion elicitor or emotion messenger? Subliminal priming reveals two faces of facial expressions.

    Science.gov (United States)

    Ruys, Kirsten I; Stapel, Diederik A

    2008-06-01

    Facial emotional expressions can serve both as emotional stimuli and as communicative signals. The research reported here was conducted to illustrate how responses to both roles of facial emotional expressions unfold over time. As an emotion elicitor, a facial emotional expression (e.g., a disgusted face) activates a response that is similar to responses to other emotional stimuli of the same valence (e.g., a dirty, nonflushed toilet). As an emotion messenger, the same facial expression (e.g., a disgusted face) serves as a communicative signal by also activating the knowledge that the sender is experiencing a specific emotion (e.g., the sender feels disgusted). By varying the duration of exposure to disgusted, fearful, angry, and neutral faces in two subliminal-priming studies, we demonstrated that responses to faces as emotion elicitors occur prior to responses to faces as emotion messengers, and that both types of responses may unfold unconsciously.

  19. A face a mother could love: depression-related maternal neural responses to infant emotion faces.

    Science.gov (United States)

    Laurent, Heidemarie K; Ablow, Jennifer C

    2013-01-01

    Depressed mothers show negatively biased responses to their infants' emotional bids, perhaps due to faulty processing of infant cues. This study is the first to examine depression-related differences in mothers' neural response to their own infant's emotion faces, considering both effects of perinatal depression history and current depressive symptoms. Primiparous mothers (n = 22), half of whom had a history of major depressive episodes (with one episode occurring during pregnancy and/or postpartum), were exposed to images of their own and unfamiliar infants' joy and distress faces during functional neuroimaging. Group differences (depression vs. no-depression) and continuous effects of current depressive symptoms were tested in relation to neural response to own infant emotion faces. Compared to mothers with no psychiatric diagnoses, those with depression showed blunted responses to their own infant's distress faces in the dorsal anterior cingulate cortex. Mothers with higher levels of current symptomatology showed reduced responses to their own infant's joy faces in the orbitofrontal cortex and insula. Current symptomatology also predicted lower responses to own infant joy-distress in left-sided prefrontal and insula/striatal regions. These deficits in self-regulatory and motivational response circuits may help explain parenting difficulties in depressed mothers.

  20. A leftward bias however you look at it: Revisiting the emotional chimeric face task as a tool for measuring emotion lateralization.

    Science.gov (United States)

    R Innes, Bobby; Burt, D Michael; Birch, Yan K; Hausmann, Markus

    2015-12-28

    Left hemiface biases observed within the Emotional Chimeric Face Task (ECFT) support emotional face perception models whereby all expressions are preferentially processed by the right hemisphere. However, previous research using this task has not considered that the visible midline between hemifaces might engage atypical facial emotion processing strategies in upright or inverted conditions, nor controlled for left visual field (thus right hemispheric) visuospatial attention biases. This study used novel emotional chimeric faces (blended at the midline) to examine laterality biases for all basic emotions. Left hemiface biases were demonstrated across all emotional expressions and were reduced, but not reversed, for inverted faces. The ECFT bias in upright faces was significantly increased in participants with a large attention bias. These results support the theory that left hemiface biases reflect a genuine bias in emotional face processing, and this bias can interact with attention processes similarly localized in the right hemisphere.

  1. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    Science.gov (United States)

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus because the attentional strategies associated with promotion focus enhance performance on well-learned or innate tasks - such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced and better facial emotion recognition was observed in a promotion focus compared to a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy in a promotion focus (i.e., striving to make hits). A prevention focus had an impact neither on perceptual processing nor on facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  2. Emotion categorization does not depend on explicit face categorization

    NARCIS (Netherlands)

    Seirafi, M.; de Weerd, P.; de Gelder, B.

    2013-01-01

    Face perception and emotion recognition have been extensively studied in the past decade; however, the relation between them is still poorly understood. A traditional view is that successful emotional categorization requires categorization of the stimulus as a 'face', at least at the basic level.

  3. The time course of face processing: startle eyeblink response modulation by face gender and expression.

    Science.gov (United States)

    Duval, Elizabeth R; Lovelace, Christopher T; Aarant, Justin; Filion, Diane L

    2013-12-01

    The purpose of this study was to investigate the effects of both facial expression and face gender on startle eyeblink response patterns at varying lead intervals (300, 800, and 3500 ms) indicative of attentional and emotional processes. We aimed to determine whether responses to affective faces map onto the Defense Cascade Model (Lang et al., 1997) to better understand the stages of processing during affective face viewing. At 300 ms, there was an interaction between face expression and face gender with female happy and neutral faces and male angry faces producing inhibited startle. At 3500 ms, there was a trend for facilitated startle during angry compared to neutral faces. These findings suggest that affective expressions are perceived differently in male and female faces, especially at short lead intervals. Future studies investigating face processing should take both face gender and expression into account. © 2013.

  4. Improved emotional conflict control triggered by the processing priority of negative emotion.

    Science.gov (United States)

    Yang, Qian; Wang, Xiangpeng; Yin, Shouhang; Zhao, Xiaoyue; Tan, Jinfeng; Chen, Antao

    2016-04-18

    The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas associated with emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas coupled negatively with the dorsolateral prefrontal cortex (DLPFC). However, a significant emotional conflict was observed in the positive face context; this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN). Here, the DLPFC coupled negatively mainly with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operated differently for negative and positive faces: it was implemented more efficiently in the negative face condition, whereas it was more devoted to inhibiting internal interference in the positive face condition. This study thus provides a plausible mechanism of emotional conflict resolution, whereby the rapid pathway for negative emotion processing efficiently triggers control mechanisms to preventively resolve emotional conflict.

  5. Congruence of happy and sad emotion in music and faces modifies cortical audiovisual activation.

    Science.gov (United States)

    Jeong, Jeong-Won; Diwadkar, Vaibhav A; Chugani, Carla D; Sinsoongsud, Piti; Muzik, Otto; Behen, Michael E; Chugani, Harry T; Chugani, Diane C

    2011-02-14

    The powerful emotion inducing properties of music are well-known, yet music may convey differing emotional responses depending on environmental factors. We hypothesized that neural mechanisms involved in listening to music may differ when presented together with visual stimuli that conveyed the same emotion as the music when compared to visual stimuli with incongruent emotional content. We designed this study to determine the effect of auditory (happy and sad instrumental music) and visual stimuli (happy and sad faces) congruent or incongruent for emotional content on audiovisual processing using fMRI blood oxygenation level-dependent (BOLD) signal contrast. The experiment was conducted in the context of a conventional block-design experiment. A block consisted of three emotional ON periods, music alone (happy or sad music), face alone (happy or sad faces), and music combined with faces where the music excerpt was played while presenting either congruent emotional faces or incongruent emotional faces. We found activity in the superior temporal gyrus (STG) and fusiform gyrus (FG) to be differentially modulated by music and faces depending on the congruence of emotional content. There was a greater BOLD response in STG when the emotion signaled by the music and faces was congruent. Furthermore, the magnitude of these changes differed for happy congruence and sad congruence, i.e., the activation of STG when happy music was presented with happy faces was greater than the activation seen when sad music was presented with sad faces. In contrast, incongruent stimuli diminished the BOLD response in STG and elicited greater signal change in bilateral FG. Behavioral testing supplemented these findings by showing that subject ratings of emotion in faces were influenced by emotion in music. When presented with happy music, happy faces were rated as more happy (p=0.051) and sad faces were rated as less sad (p=0.030). 
When presented with sad music, happy faces were rated as less

  6. Laterality Biases to Chimeric Faces in Asperger Syndrome: What Is Right about Face-Processing?

    Science.gov (United States)

    Ashwin, Chris; Wheelwright, Sally; Baron-Cohen, Simon

    2005-01-01

    People show a left visual field (LVF) bias for faces, i.e., involving the right hemisphere of the brain. Lesion and neuroimaging studies confirm the importance of the right-hemisphere and suggest separable neural pathways for processing facial identity vs. emotions. We investigated the hemispheric processing of faces in adults with and without…

  7. Acute pharmacologically induced shifts in serotonin availability abolish emotion-selective responses to negative face emotions in distinct brain networks

    DEFF Research Database (Denmark)

    Grady, Cheryl Lynn; Siebner, Hartwig R; Hornboll, Bettina

    2013-01-01

    Pharmacological manipulation of serotonin availability can alter the processing of facial expressions of emotion. Using a within-subject design, we measured the effect of serotonin on the brain's response to aversive face emotions with functional MRI while 20 participants judged the gender...... of neutral, fearful and angry faces. In three separate and counterbalanced sessions, participants received citalopram (CIT) to raise serotonin levels, underwent acute tryptophan depletion (ATD) to lower serotonin, or were studied without pharmacological challenge (Control). An analysis designed to identify...

  8. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Science.gov (United States)

    Clayson, Peter E; Larson, Michael J

    2013-01-01

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.
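
    The conflict-adaptation analysis described above compares the congruency effect on the current trial as a function of previous-trial congruency. A minimal sketch of that computation on response times (trial data and function name are invented for illustration; this is not the authors' code):

```python
# Sketch of a conflict-adaptation (congruency sequence) analysis on RTs.
# Assumed trial structure: each trial is congruent or incongruent, and
# current-trial RTs are binned by previous-trial congruency.

def conflict_adaptation(trials):
    """trials: list of dicts with keys 'congruent' (bool) and 'rt' (ms).
    Returns (congruency_effect, adaptation_effect) in ms."""
    mean = lambda xs: sum(xs) / len(xs)
    bins = {('C', 'C'): [], ('C', 'I'): [], ('I', 'C'): [], ('I', 'I'): []}
    for prev, cur in zip(trials, trials[1:]):
        key = ('C' if prev['congruent'] else 'I',
               'C' if cur['congruent'] else 'I')
        bins[key].append(cur['rt'])
    # Overall congruency effect: incongruent minus congruent RTs.
    congruency = (mean(bins[('C', 'I')] + bins[('I', 'I')])
                  - mean(bins[('C', 'C')] + bins[('I', 'C')]))
    # Conflict adaptation: the congruency effect is smaller after
    # incongruent trials than after congruent trials.
    after_c = mean(bins[('C', 'I')]) - mean(bins[('C', 'C')])
    after_i = mean(bins[('I', 'I')]) - mean(bins[('I', 'C')])
    return congruency, after_c - after_i
```

    A positive adaptation effect indicates that the RT congruency effect shrank following incongruent trials, analogous to the adaptation the study reports for error rates and ERPs.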

  9. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Directory of Open Access Journals (Sweden)

    Peter E Clayson

    Full Text Available The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.

  10. Faces and bodies: perception and mimicry of emotionally congruent and incongruent facial and bodily expressions

    Directory of Open Access Journals (Sweden)

    Mariska eKret

    2013-02-01

    Full Text Available Traditional emotion theories stress the importance of the face in the expression of emotions but bodily expressions are becoming increasingly important. Here we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emotionally congruent and incongruent face-body compounds. Participants’ fixations were measured and their pupil size recorded with eye-tracking equipment, and their facial reactions measured with electromyography (EMG). The behavioral results support our prediction that the recognition of a facial expression is improved in the context of a matching posture and, importantly, also vice versa. From their facial reactions, it appeared that observers responded with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus activity) to happy facial expressions. As we predicted and found, angry and fearful cues from the face or the body attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion as facial expressions and that congruency between the emotional signals from the face and body improves recognition of the emotion.

  11. The complex duration perception of emotional faces: Effects of face direction

    Directory of Open Access Journals (Sweden)

    Katrin Martina Kliegl

    2015-03-01

    Full Text Available The perceived duration of emotional face stimuli strongly depends on the expressed emotion. However, emotional faces also differ with regard to a number of other features, such as gaze, face direction, or sex. Usually, these features have been controlled by using only pictures of female models with straight gaze and face direction. Doi and Shinohara (2009) reported that an overestimation of angry faces could only be found when the model’s gaze was oriented towards the observer. We aimed at replicating this effect for face direction. Moreover, we explored the effect of face direction on the duration perception of sad faces. Controlling for the sex of the face model and the participant, female and male participants rated the duration of neutral, angry and sad face stimuli of both sexes photographed from different perspectives in a bisection task. In line with current findings, we report a significant overestimation of angry compared to neutral face stimuli that was modulated by face direction. Moreover, the perceived duration of sad face stimuli did not differ from that of neutral faces and was not influenced by face direction. Furthermore, we found that faces of the opposite sex appeared to last longer than those of the same sex. This outcome is discussed with regard to stimulus parameters such as induced arousal, social relevance and an evolutionary context.
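
    In a temporal bisection task like the one above, observers classify each test duration as closer to a short or a long anchor, and the bisection point is the duration judged "long" half the time. A hedged sketch of that estimate (linear interpolation stands in for a full psychometric fit, and the function name is invented; this is not the authors' procedure):

```python
# Sketch of a bisection-point estimate from a duration bisection task.
# durations: sorted test durations (ms); prop_long: proportion of "long"
# responses at each duration. Interpolates where the proportion crosses 0.5.

def bisection_point(durations, prop_long):
    for (d0, p0), (d1, p1) in zip(zip(durations, prop_long),
                                  zip(durations[1:], prop_long[1:])):
        if p0 <= 0.5 <= p1:
            # Linear interpolation between the two bracketing durations.
            return d0 + (0.5 - p0) * (d1 - d0) / (p1 - p0)
    raise ValueError("0.5 not crossed within the tested range")
```

    A lower bisection point for angry than for neutral faces would correspond to the duration overestimation the study reports.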

  12. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    Directory of Open Access Journals (Sweden)

    Kris Evers

    2014-01-01

    Full Text Available Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  13. No differences in emotion recognition strategies in children with autism spectrum disorder: evidence from hybrid faces.

    Science.gov (United States)

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  14. Visual Afterimages of Emotional Faces in High Functioning Autism

    Science.gov (United States)

    Rutherford, M. D.; Troubridge, Erin K.; Walsh, Jennifer

    2012-01-01

    Fixating an emotional facial expression can create afterimages, such that subsequent faces are seen as having the opposite expression of that fixated. Visual afterimages have been used to map the relationships among emotion categories, and this method was used here to compare ASD and matched control participants. Participants adapted to a facial…

  15. What Is Going On? The Process of Generating Questions about Emotion and Social Cognition in Bipolar Disorder and Schizophrenia with Cartoon Situations and Faces

    Directory of Open Access Journals (Sweden)

    Bryan D. Fantie

    2018-04-01

    Full Text Available Regarding the notion of putative “best” practices in social neuroscience and science in general, we contend that following established procedures has advantages, but prescriptive uniformity in methodology can obscure flaws, bias thinking, stifle creativity, and restrict exploration. Generating hypotheses is at least as important as testing hypotheses. To illustrate this process, we describe the following exploratory study. Psychiatric patients have difficulties with social functioning that affect their quality of life adversely. To investigate these impediments, we compared the performances of patients with schizophrenia and those with bipolar disorder to healthy controls on a task that involved matching photographs of facial expressions to a faceless protagonist in each of a series of drawn cartoon emotion-related situations. These scenarios involved either a single character (Nonsocial) or multiple characters (Social). The Social scenarios were also Congruent, with everyone in the cartoon displaying the same emotion, or Noncongruent (with everyone displaying a different emotion than the protagonist should). In this preliminary study, both patient groups produced lower scores than controls (p < 0.001), but did not perform differently from each other. All groups performed best on the social-congruent items and worst on the social-noncongruent items (p < 0.001). Performance varied inversely with illness duration, but not symptom severity. Complete emotional, social, cognitive, or perceptual inability is unlikely because these patient groups could still do this task. Nevertheless, the differences we saw could be functionally meaningful and clinically significant and deserve further exploration. Therefore, we stress the need to continue developing novel, alternative ways to explore social cognition in patients with psychiatric disorders and to clarify which elements of the multidimensional process contribute to difficulties in daily functioning.

  16. Facial emotion recognition, face scan paths, and face perception in children with neurofibromatosis type 1.

    Science.gov (United States)

    Lewis, Amelia K; Porter, Melanie A; Williams, Tracey A; Bzishvili, Samantha; North, Kathryn N; Payne, Jonathan M

    2017-05-01

    This study aimed to investigate face scan paths and face perception abilities in children with Neurofibromatosis Type 1 (NF1) and how these might relate to emotion recognition abilities in this population. The authors investigated facial emotion recognition, face scan paths, and face perception in 29 children with NF1 compared to 29 chronological age-matched typically developing controls. Correlations between facial emotion recognition, face scan paths, and face perception in children with NF1 were examined. Children with NF1 displayed significantly poorer recognition of fearful expressions compared to controls, as well as a nonsignificant trend toward poorer recognition of anger. Although there was no significant difference between groups in time spent viewing individual core facial features (eyes, nose, mouth, and nonfeature regions), children with NF1 spent significantly less time than controls viewing the face as a whole. Children with NF1 also displayed significantly poorer face perception abilities than typically developing controls. Facial emotion recognition deficits were not significantly associated with aberrant face scan paths or face perception abilities in the NF1 group. These results suggest that impairments in the perception, identification, and interpretation of information from faces are important aspects of the social-cognitive phenotype of NF1. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Emotion recognition training using composite faces generalises across identities but not all emotions.

    Science.gov (United States)

    Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S

    2017-08-01

    Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where the same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities but indicated only partial generalisation across emotions. These findings suggest effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.

  18. Emotional Faces Capture Spatial Attention in 5-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Kit K. Elam

    2010-10-01

    Full Text Available Emotional facial expressions are important social cues that convey salient affective information. Infants, younger children, and adults all appear to orient spatial attention to emotional faces, with a particularly strong bias to fearful faces. Yet in young children it is unclear whether both happy and fearful faces capture attention. Given that the processing of emotional faces is believed by some to serve an evolutionarily adaptive purpose, attentional biases to both fearful and happy expressions would be expected in younger children. However, the extent to which this ability is present in young children, and whether it is genetically mediated, is untested. Therefore, the aims of the current study were to assess the spatial-attentional properties of emotional faces in young children, with a preliminary test of whether this effect was influenced by genetics. Five-year-old twin pairs performed a dot-probe task. The results suggest that children preferentially direct spatial attention to emotional faces, particularly right visual field faces. The results provide support for the notion that the direction of spatial attention to emotional faces serves an evolutionarily adaptive function and may be mediated by genetic mechanisms.
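
    In a dot-probe task like the one above, a probe replaces either the emotional or the neutral face of a pair, and faster probe detection at the emotional location indicates attention was drawn there. A minimal sketch of the standard bias score (function name invented; the study's actual scoring may differ):

```python
# Sketch of the conventional dot-probe attentional bias score:
# mean RT when the probe replaces the neutral face minus mean RT when it
# replaces the emotional face. Positive = attention biased toward emotion.

def dot_probe_bias(rts_probe_at_neutral, rts_probe_at_emotional):
    """Both arguments are lists of response times in ms."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rts_probe_at_neutral) - mean(rts_probe_at_emotional)
```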

  19. Short-term memory for emotional faces in dysphoria.

    Science.gov (United States)

    Noreen, Saima; Ridout, Nathan

    2010-07-01

    The study aimed to determine if the memory bias for negative faces previously demonstrated in depression and dysphoria generalises from long- to short-term memory. A total of 29 dysphoric (DP) and 22 non-dysphoric (ND) participants were presented with a series of faces and asked to identify the emotion portrayed (happiness, sadness, anger, or neutral affect). Following a delay, four faces were presented (the original plus three distractors) and participants were asked to identify the target face. Half of the trials assessed memory for facial emotion, and the remaining trials examined memory for facial identity. At encoding, no group differences were apparent. At memory testing, relative to ND participants, DP participants exhibited impaired memory for all types of facial emotion and for facial identity when the faces featured happiness, anger, or neutral affect, but not sadness. DP participants exhibited impaired identity memory for happy faces relative to angry, sad, and neutral, whereas ND participants exhibited enhanced facial identity memory when faces were angry. In general, memory for faces was not related to performance at encoding. However, in DP participants only, memory for sad faces was related to sadness recognition at encoding. The results suggest that the negative memory bias for faces in dysphoria does not generalise from long- to short-term memory.

  20. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias

    OpenAIRE

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emo...

  1. Investigating emotional top down modulation of ambiguous faces by single pulse TMS on early visual cortices

    Directory of Open Access Journals (Sweden)

    Zachary Adam Yaple

    2016-06-01

    Full Text Available Top-down processing is a mechanism in which memory, context and expectation are used to perceive stimuli. For this study we investigated how emotional content, induced by music mood, influences the perception of happy and sad emoticons. Using single-pulse TMS we stimulated the right occipital face area (rOFA), primary visual cortex (V1) and the vertex while subjects performed a face-detection task and listened to happy and sad music. At baseline, incongruent audio-visual pairings decreased performance, demonstrating that emotional context influences the perception of ambiguous faces. However, face-identification performance decreased during rOFA stimulation regardless of emotional content, and no effects were found for vertex or V1 stimulation. These results suggest that rOFA is important for processing faces regardless of emotion, and that early visual cortex activity may not integrate emotional auditory information with visual information during top-down emotional modulation of faces.

  2. Individual differences in emotion lateralisation and the processing of emotional information arising from social interactions.

    Science.gov (United States)

    Bourne, Victoria J; Watling, Dawn

    2015-01-01

    Previous research examining the possible association between emotion lateralisation and social anxiety has found conflicting results. In this paper two studies are presented to assess two aspects related to different features of social anxiety: fear of negative evaluation (FNE) and emotion regulation. Lateralisation for the processing of facial emotion was measured using the chimeric faces test. Individuals with greater FNE were more strongly lateralised to the right hemisphere for the processing of anger, happiness and sadness; and, for the processing of fearful faces the relationship was found for females only. Emotion regulation strategies were reduced to two factors: positive strategies and negative strategies. For males, but not females, greater reported use of negative emotion strategies is associated with stronger right hemisphere lateralisation for processing negative emotions. The implications for further understanding the neuropsychological processing of emotion in individuals with social anxiety are discussed.

  3. Effects of acute psychosocial stress on neural activity to emotional and neutral faces in a face recognition memory paradigm.

    Science.gov (United States)

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2014-12-01

    Previous studies have shown that acute psychosocial stress impairs recognition of declarative memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala which modulates memory processes in hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoking male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner, which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants, but did not impact recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus which was due to a stress-induced increase of neural activity to fearful and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus, when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with a stress-induced privileged processing of emotional stimuli.

  4. Emotion regulation in mothers and young children faced with trauma.

    Science.gov (United States)

    Pat-Horenczyk, Ruth; Cohen, S; Ziv, Y; Achituv, M; Asulin-Peretz, L; Blanchard, T R; Schiff, M; Brom, D

    2015-01-01

    The present study investigated maternal emotion regulation as mediating the association between maternal posttraumatic stress symptoms and children's emotional dysregulation in a community sample of 431 Israeli mothers and children exposed to trauma. Little is known about the specific pathways through which maternal posttraumatic symptoms and deficits in emotion regulation contribute to emotional dysregulation. Inspired by the intergenerational process of relational posttraumatic stress disorder (PTSD), in which posttraumatic distress is transmitted from mothers to children, we suggest an analogous concept of relational emotion regulation, by which maternal emotion regulation problems may contribute to child emotion regulation deficits. Child emotion regulation problems were measured using the Child Behavior Checklist-Dysregulation Profile (CBCL-DP; T.M. Achenbach & I. Rescorla, 2000), which is comprised of three subscales of the CBCL: Attention, Aggression, and Anxiety/Depression. Maternal PTSD symptoms were assessed by the Posttraumatic Diagnostic Scale (E.B. Foa, L. Cashman, L. Jaycox, & K. Perry, 1997) and maternal emotion regulation by the Difficulties in Emotion Regulation Scale (K.L. Gratz & L. Roemer, 2004). Results showed that the child's emotion regulation problems were associated with both maternal posttraumatic symptoms and maternal emotion dysregulation. Further, maternal emotion regulation mediated the association between maternal posttraumatic symptoms and the child's regulation deficits. These findings highlight the central role of mothers' emotion regulation skills in the aftermath of trauma as it relates to children's emotion regulation skills. The degree of mothers' regulatory skills in the context of posttraumatic stress symptoms reflects a key process through which the intergenerational transmission of trauma may occur. Study results have critical implications for planning and developing clinical interventions geared toward the treatment of

  5. Are neutral faces of children really emotionally neutral?

    OpenAIRE

    小松, 佐穂子; 箱田, 裕司; Komatsu, Sahoko; Hakoda, Yuji

    2012-01-01

    In this study, we investigated whether people recognize emotions from neutral faces of children (11 to 13 years old). We took facial images of 53 male and 54 female Japanese children who had been asked to keep a neutral facial expression. Then, we conducted an experiment in which 43 participants (19 to 34 years old) rated the strength of four emotions (happiness, surprise, sadness, and anger) for the facial images, using a 7-point scale. We found that (a) they rated both male and female face...

  6. Cognitive Biases for Emotional Faces in High- and Low-Trait Depressive Participants

    Directory of Open Access Journals (Sweden)

    Yi-Hsing Hsieh

    2004-10-01

    Full Text Available This study examined the association between trait depression and information-processing biases. Thirty participants were divided into high- and low-trait depressive groups based on the median of their depressive subscale scores according to the Basic Personality Inventory. Information-processing biases were measured using a deployment-of-attention task (DOAT) and a recognition memory task (RMT). For the DOAT, participants saw one emotional face paired with a neutral face of the same person, and then were forced to choose on which face the color patch had first occurred. The percentage of participants' choices favoring the happy, angry, or sad faces represented the selective attentional bias score for each emotion, respectively. For the RMT, participants rated different types of emotional faces and subsequently discriminated old faces from new faces. The memory strength for each type of face was calculated from hit and false-positive rates, based on signal detection theory. Compared with the low-trait depressive group, the high-trait depressive group showed a negative cognitive style: an enhanced recognition memory for sad faces and a weakened inhibition of attending to sad faces, suggesting that those with a high depressive trait may be vulnerable to interpersonal withdrawal.
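
    The memory-strength measure mentioned above is typically d' from signal detection theory, computed from hit and false-positive rates. A minimal sketch of that computation (the log-linear correction shown is one common guard against rates of exactly 0 or 1; the abstract does not specify which correction, if any, the authors applied):

```python
# Sketch of the signal-detection memory-strength computation: d' is the
# difference of the z-transformed hit rate and false-alarm rate.
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    # Log-linear correction: add 0.5 to each cell so rates avoid 0 and 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return z(hit_rate) - z(fa_rate)
```

    A d' of 0 means old faces are indistinguishable from new ones; higher values indicate stronger memory for that face type.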

  7. About-face on face recognition ability and holistic processing.

    Science.gov (United States)

    Richler, Jennifer J; Floyd, R Jackie; Gauthier, Isabel

    2015-01-01

    Previous work found a small but significant relationship between holistic processing measured with the composite task and face recognition ability measured by the Cambridge Face Memory Test (CFMT; Duchaine & Nakayama, 2006). Surprisingly, recent work using a different measure of holistic processing (Vanderbilt Holistic Face Processing Test [VHPT-F]; Richler, Floyd, & Gauthier, 2014) and a larger sample found no evidence for such a relationship. In Experiment 1 we replicate this unexpected result, finding no relationship between holistic processing (VHPT-F) and face recognition ability (CFMT). A key difference between the VHPT-F and other holistic processing measures is that unique face parts are used on each trial in the VHPT-F, unlike in other tasks where a small set of face parts repeat across the experiment. In Experiment 2, we test the hypothesis that correlations between the CFMT and holistic processing tasks are driven by stimulus repetition that allows for learning during the composite task. Consistent with our predictions, CFMT performance was correlated with holistic processing in the composite task when a small set of face parts repeated over trials, but not when face parts did not repeat. A meta-analysis confirms that relationships between the CFMT and holistic processing depend on stimulus repetition. These results raise important questions about what is being measured by the CFMT, and challenge current assumptions about why faces are processed holistically.

  8. Serotonergic neurotransmission in emotional processing

    DEFF Research Database (Denmark)

    Laursen, Helle Ruff; Henningsson, Susanne; Macoveanu, Julian

    2016-01-01

    ,4-methylene-dioxymethamphetamine [MDMA]) induces alterations in serotonergic neurotransmission that are comparable to those observed in a depleted state. In this functional magnetic resonance imaging (fMRI) study, we investigated the responsiveness of the amygdala to emotional face stimuli in recreational...... ecstasy users as a model of long-term serotonin depletion. Fourteen ecstasy users and 12 non-using controls underwent fMRI to measure the regional neural activity elicited in the amygdala by male or female faces expressing anger, disgust, fear, sadness, or no emotion. During fMRI, participants made a sex...... judgement on each face stimulus. Positron emission tomography with (11)C-DASB was additionally performed to assess serotonin transporter (SERT) binding in the brain. In the ecstasy users, SERT binding correlated negatively with amygdala activity, and accumulated lifetime intake of ecstasy tablets...

  9. A new method for face detection in colour images for emotional bio-robots

    Institute of Scientific and Technical Information of China (English)

    HAPESHI; Kevin

    2010-01-01

    Emotional bio-robots have become a hot research topic over the last two decades. Although some progress has been made in the research, design, and development of emotional bio-robots, few can be used in practical applications. The study of emotional bio-robots demands multi-disciplinary cooperation, involving computer science, artificial intelligence, 3D computation, engineering system modelling, analysis and simulation, bionics engineering, automatic control, and image processing and pattern recognition. Among these, face detection belongs to image processing and pattern recognition. An emotional robot must be able to recognize various objects; in particular, it is very important for a bio-robot to recognize human faces in an image. In this paper, a face detection method is proposed for identifying human faces in colour images using a human skin model and an eye detection method. First, the method detects skin regions in the input colour image after normalizing its luminance. Then, all face candidates are identified using eye detection. Compared with existing algorithms, this method relies only on the colour and geometry of the human face rather than on training datasets. Experimental results show that the method is effective and fast, and that it can be applied to the development of an emotional bio-robot after further improvements in speed and accuracy.
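The first two stages of the pipeline described above (luminance normalization followed by rule-based skin segmentation, before eye-based verification of face candidates) can be sketched as follows. The thresholds and function names are illustrative assumptions, not the paper's actual values:

```python
import numpy as np

def normalize_luminance(rgb):
    """Gray-world luminance normalization: rescale each channel so the
    image's per-channel averages are balanced before skin detection."""
    avg = rgb.reshape(-1, 3).mean(axis=0)
    scale = avg.mean() / np.maximum(avg, 1e-6)
    return np.clip(rgb * scale, 0, 255)

def skin_mask(rgb):
    """Rule-based RGB skin classifier (illustrative thresholds).
    Expects a float array of shape (H, W, 3) with values in 0..255."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return ((r > 95) & (g > 40) & (b > 20)
            & (r > g) & (r > b)
            & (np.abs(r - g) > 15))

# A full pipeline would next group skin pixels into connected regions
# and keep only regions whose geometry admits a plausible eye pair.
```

Because the rules use only colour and simple geometry, no training data is needed, which matches the paper's stated advantage over learned detectors.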

  10. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    OpenAIRE

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to the contradictions. In addition, an attenuated saliency of the eye region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid faces.

  11. Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa.

    Science.gov (United States)

    Vesker, Michael; Bahn, Daniela; Kauschke, Christina; Tschense, Monika; Degé, Franziska; Schwarzer, Gudrun

    2018-01-01

    In order to assess how the perception of audible speech and facial expressions influence one another in the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-year-olds), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar categorization task, but with faces acting as visual primes and emotion words acting as auditory targets. The results of Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words versus positive words, and that 6-year-old children are particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Meanwhile, the results of Experiment 2 did not show any congruency effects for priming by facial expressions. Thus, audible emotion words seem to exert an influence on the emotional categorization of faces, while faces do not seem to influence the categorization of emotion words in a significant way.

  12. Assessment of incongruent emotions in face and voice

    NARCIS (Netherlands)

    Takagi, S.; Tabei, K.-I.; Huis in 't Veld, E.M.J.; de Gelder, B.

    2013-01-01

    Information derived from facial and vocal nonverbal expressions plays an important role in social communication in the real and virtual worlds. In the present study, we investigated cultural differences between Japanese and Dutch participants in the multisensory perception of emotion. We used a face

  13. Child's recognition of emotions in robot's face and body

    NARCIS (Netherlands)

    Cohen, I.; Looije, R.; Neerincx, M.A.

    2011-01-01

    Social robots can comfort and support children who have to cope with chronic diseases. In previous studies, a "facial robot", the iCat, proved to show well-recognized emotional expressions that are important in social interactions. The question is if a mobile robot without a face, the Nao, can

  14. The recognition of emotional expression in prosopagnosia: decoding whole and part faces.

    Science.gov (United States)

    Stephan, Blossom Christa Maree; Breen, Nora; Caine, Diana

    2006-11-01

    Prosopagnosia is currently viewed within the constraints of two competing theories of face recognition, one highlighting the analysis of features, the other focusing on configural processing of the whole face. This study investigated the role of feature analysis versus whole-face configural processing in the recognition of facial expression. A prosopagnosic patient, SC, made expression decisions from whole and incomplete (eyes-only and mouth-only) faces in which features had been obscured. SC was impaired at recognizing some (e.g., anger, sadness, and fear), but not all (e.g., happiness), emotional expressions from the whole face. Analyses of his performance on incomplete faces indicated that his recognition of some expressions actually improved relative to his performance in the whole-face condition. We argue that in SC, interference from damaged configural processes seems to override an intact ability to utilize part-based or local feature cues.

  15. Positive emotion impedes emotional but not cognitive conflict processing.

    Science.gov (United States)

    Zinchenko, Artyom; Obermeier, Christian; Kanske, Philipp; Schröger, Erich; Kotz, Sonja A

    2017-06-01

    Cognitive control enables successful goal-directed behavior by resolving a conflict between opposing action tendencies, while emotional control arises as a consequence of emotional conflict processing such as in irony. While negative emotion facilitates both cognitive and emotional conflict processing, it is unclear how emotional conflict processing is affected by positive emotion (e.g., humor). In 2 EEG experiments, we investigated the role of positive audiovisual target stimuli in cognitive and emotional conflict processing. Participants categorized either spoken vowels (cognitive task) or their emotional valence (emotional task) and ignored the visual stimulus dimension. Behaviorally, a positive target showed no influence on cognitive conflict processing, but impeded emotional conflict processing. In the emotional task, response time conflict costs were higher for positive than for neutral targets. In the EEG, we observed an interaction of emotion by congruence in the P200 and N200 ERP components in emotional but not in cognitive conflict processing. In the emotional conflict task, the P200 and N200 conflict effect was larger for emotional than neutral targets. Thus, our results show that emotion affects conflict processing differently as a function of conflict type and emotional valence. This suggests that there are conflict- and valence-specific mechanisms modulating executive control.

  16. Psilocybin with psychological support improves emotional face recognition in treatment-resistant depression.

    Science.gov (United States)

    Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L

    2018-02-01

    Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study was to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later, after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions compared with controls; after psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this improvement correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.

  17. From specificity to sensitivity: affective states modulate visual working memory for emotional expressive faces.

    Science.gov (United States)

    Maran, Thomas; Sachse, Pierre; Furtner, Marco

    2015-01-01

    Previous findings suggest that visual working memory (VWM) preferentially remembers angry looking faces. However, the meaning of facial actions is construed in relation to context. To date, there are no studies investigating the role of perceiver-based context when processing emotional cues in VWM. To explore the influence of affective context on VWM for faces, we conducted two experiments using both a VWM task for emotionally expressive faces and a mood induction procedure. Affective context was manipulated by unpleasant (Experiment 1) and pleasant (Experiment 2) IAPS pictures in order to induce an affect high in motivational intensity (defensive or appetitive, respectively) compared to a low arousal control condition. Results indicated specifically increased sensitivity of VWM for angry looking faces in the neutral condition. Enhanced VWM for angry faces was prevented by inducing affects of high motivational intensity. In both experiments, affective states led to a switch from specific enhancement of angry expressions in VWM to an equally sensitive representation of all emotional expressions. Our findings demonstrate that emotional expressions are of different behavioral relevance for the receiver depending on the affective context, supporting a functional organization of VWM along with flexible resource allocation. In VWM, stimulus processing adjusts to situational requirements and transitions from a specifically prioritizing default mode in predictable environments to a sensitive, hypervigilant mode in exposure to emotional events.

  19. Infants' Temperament and Mothers', and Fathers' Depression Predict Infants' Attention to Objects Paired with Emotional Faces.

    Science.gov (United States)

    Aktar, Evin; Mandell, Dorothy J; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E J; Bögels, Susan M

    2016-07-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others' emotional reactions to those stimuli, so called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction effects on infants' attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations of unfamiliar objects alone, before-and-after being paired with emotional (happy, sad, fearful vs. neutral) faces gazing towards (vs. away) from objects. Additionally, the associations of infants' temperament, and parents' negative affect/depression/anxiety with infants' pupil responses were explored. Both mothers and fathers of participating infants completed questionnaires about their negative affect, depression and anxiety symptoms and their infants' negative temperament. Infants allocated more attention (larger pupils) to negative vs. neutral faces when the faces were presented alone, while they allocated less attention to objects paired with emotional vs. neutral faces independent of head/gaze direction. Sad (but not fearful) temperament predicted more attention to emotional faces. Infants' sad temperament moderated the associations of mothers' depression (but not anxiety) with infants' attention to objects. Maternal depression predicted more attention to objects paired with emotional expressions in infants low in sad temperament, while it predicted less attention in infants high in sad temperament. Fathers' depression (but not anxiety) predicted more attention to objects paired with emotional expressions independent of infants' temperament. We conclude that infants' own temperamental dispositions for sadness, and their exposure to mothers' and fathers' depressed moods may influence infants' attention to emotion-object associations in social learning contexts.

  20. Children's understanding of facial expression of emotion: II. Drawing of emotion-faces.

    Science.gov (United States)

    Missaghi-Lakshman, M; Whissell, C

    1991-06-01

    67 children from Grades 2, 4, and 7 drew faces representing the emotional expressions of fear, anger, surprise, disgust, happiness, and sadness. The children themselves and 29 adults later decoded the drawings in an emotion-recognition task. Children were the more accurate decoders, and their accuracy and the accuracy of adults increased significantly for judgments of 7th-grade drawings. The emotions happy and sad were most accurately decoded. There were no significant differences associated with sex. In their drawings, children utilized a symbol system that seems to be based on a highlighting or exaggeration of features of the innately governed facial expression of emotion.

  1. Sex differences in social cognition: The case of face processing.

    Science.gov (United States)

    Proverbio, Alice Mado

    2017-01-02

    Several studies have demonstrated that women show a greater interest for social information and empathic attitude than men. This article reviews studies on sex differences in the brain, with particular reference to how males and females process faces and facial expressions, social interactions, pain of others, infant faces, faces in things (pareidolia phenomenon), opposite-sex faces, humans vs. landscapes, incongruent behavior, motor actions, biological motion, erotic pictures, and emotional information. Sex differences in oxytocin-based attachment response and emotional memory are also mentioned. In addition, we investigated how 400 different human faces were evaluated for arousal and valence dimensions by a group of healthy male and female University students. Stimuli were carefully balanced for sensory and perceptual characteristics, age, facial expression, and sex. As a whole, women judged all human faces as more positive and more arousing than men. Furthermore, they showed a preference for the faces of children and the elderly in the arousal evaluation. Regardless of face aesthetics, age, or facial expression, women rated human faces higher than men. The preference for opposite- vs. same-sex faces strongly interacted with facial age. Overall, both women and men exhibited differences in facial processing that could be interpreted in the light of evolutionary psychobiology. © 2016 Wiley Periodicals, Inc.

  2. Visual attention to emotional face in schizophrenia: an eye tracking study.

    Directory of Open Access Journals (Sweden)

    Mania Asgharpour

    2015-03-01

    Deficits in the processing of facial emotions have been reported extensively in patients with schizophrenia. To explore whether restricted attention is the cause of impaired emotion processing in these patients, we examined visual attention by tracking eye movements in response to emotional and neutral face stimuli in a group of patients with schizophrenia and healthy individuals. We also examined the correlation between visual attention allocation and symptom severity in the patient group. Thirty adult patients with schizophrenia and 30 matched healthy controls participated in this study. Visual attention data were recorded while participants passively viewed emotional-neutral face pairs for 500 ms. The relationship between visual attention and symptom severity was assessed with the Positive and Negative Syndrome Scale (PANSS) in the schizophrenia group. Repeated-measures ANOVAs were used to compare the groups. Comparing the number of fixations made during face-pair presentation, we found that patients with schizophrenia made fewer fixations on faces, regardless of the expression of the face. Analysis of the number of fixations on negative-neutral pairs also revealed that the patients made fewer fixations on both neutral and negative faces. Analysis of the number of fixations on positive-neutral pairs showed more fixations on positive relative to neutral expressions in both groups. We found no correlations between the pattern of visual attention to faces and symptom severity in the patient group. The results of this study suggest that the facial recognition deficit in schizophrenia is related to decreased attention to face stimuli. The finding of no group difference in visual attention for positive-neutral face pairs is in line with studies showing preserved perception of positive emotion in these patients.

  3. Emotion Recognition in Face and Body Motion in Bulimia Nervosa.

    Science.gov (United States)

    Dapelo, Marcela Marin; Surguladze, Simon; Morris, Robin; Tchanturia, Kate

    2017-11-01

    Social cognition has been studied extensively in anorexia nervosa (AN), but there are few studies in bulimia nervosa (BN). This study investigated the ability of people with BN to recognise emotions in ambiguous facial expressions and in body movement. Participants were 26 women with BN, who were compared with 35 with AN, and 42 healthy controls. Participants completed an emotion recognition task by using faces portraying blended emotions, along with a body emotion recognition task by using videos of point-light walkers. The results indicated that BN participants exhibited difficulties recognising disgust in less-ambiguous facial expressions, and a tendency to interpret non-angry faces as anger, compared with healthy controls. These difficulties were similar to those found in AN. There were no significant differences amongst the groups in body motion emotion recognition. The findings suggest that difficulties with disgust and anger recognition in facial expressions may be shared transdiagnostically in people with eating disorders. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  5. Neural markers of opposite-sex bias in face processing.

    Science.gov (United States)

    Proverbio, Alice Mado; Riva, Federica; Martin, Eleonora; Zani, Alberto

    2010-01-01

    Some behavioral and neuroimaging studies suggest that adults prefer to view attractive faces of the opposite sex more than attractive faces of the same sex. However, unlike the other-race face effect (Caldara et al., 2004), little is known regarding the existence of an opposite-/same-sex bias in face processing. In this study, the faces of 130 attractive male and female adults were foveally presented to 40 heterosexual university students (20 men and 20 women) who were engaged in a secondary perceptual task (landscape detection). The automatic processing of face gender was investigated by recording ERPs from 128 scalp sites. Neural markers of opposite- vs. same-sex bias in face processing included larger and earlier centro-parietal N400s in response to faces of the opposite sex and a larger late positivity (LP) to same-sex faces. Analysis of intra-cortical neural generators (swLORETA) showed that facial processing-related (FG, BA37, BA20/21) and emotion-related brain areas (the right parahippocampal gyrus, BA35; uncus, BA36/38; and the cingulate gyrus, BA24) had higher activations in response to opposite- than same-sex faces. The results of this analysis, along with data obtained from ERP recordings, support the hypothesis that both genders process opposite-sex faces differently than same-sex faces. The data also suggest a hemispheric asymmetry in the processing of opposite-/same-sex faces, with the right hemisphere involved in processing same-sex faces and the left hemisphere involved in processing faces of the opposite sex. The data support previous literature suggesting a right lateralization for the representation of self-image and body awareness.

  6. Modulation of the composite face effect by unintended emotion cues

    OpenAIRE

    Gray, Katie L. H.; Murphy, Jennifer; Marsh, Jade E.; Cook, Richard

    2017-01-01

    When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers ‘fuse’ the two halves together, perceptually. The illusory distortion induced by task-irrelevant (‘distractor’) halves hinders participants’ judgments about task-relevant (‘target’) halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, obser...

  7. Psilocybin modulates functional connectivity of the amygdala during emotional face discrimination.

    Science.gov (United States)

    Grimm, O; Kraehenmann, R; Preller, K H; Seifritz, E; Vollenweider, F X

    2018-04-24

    Recent studies suggest that the antidepressant effects of the psychedelic 5-HT2A receptor agonist psilocybin are mediated through its modulatory properties on prefrontal and limbic brain regions including the amygdala. To further investigate the effects of psilocybin on emotion processing networks, we studied, for the first time, psilocybin's acute effects on amygdala seed-to-voxel connectivity in an event-related face discrimination task in 18 healthy volunteers who received psilocybin and placebo in a double-blind, balanced cross-over design. The amygdala has been implicated as a salience detector especially involved in the immediate response to emotional face content. We used beta-series amygdala seed-to-voxel connectivity during an emotional face discrimination task to elucidate the connectivity pattern of the amygdala over the entire brain. When we compared psilocybin to placebo, an increase in reaction time was found for all three categories of affective stimuli. Psilocybin decreased the connectivity between the amygdala and the striatum during angry face discrimination. During happy face discrimination, the connectivity between the amygdala and the frontal pole was decreased. No effect was seen during discrimination of fearful faces. Thus, we show psilocybin's effect as a modulator of major connectivity hubs of the amygdala: psilocybin decreases the connectivity between important nodes linked to emotion processing, such as the frontal pole and the striatum. Future studies are needed to clarify whether connectivity changes predict therapeutic effects in psychiatric patients. Copyright © 2018 Elsevier B.V. and ECNP. All rights reserved.

  8. Holistic Processing of Static and Moving Faces

    Science.gov (United States)

    Zhao, Mintao; Bülthoff, Isabelle

    2017-01-01

    Humans' face ability develops and matures with extensive experience in perceiving, recognizing, and interacting with faces that move most of the time. However, how facial movements affect 1 core aspect of face ability--holistic face processing--remains unclear. Here we investigated the influence of rigid facial motion on holistic and part-based…

  9. Faces in context: A review and systematization of contextual influences on affective face processing

    Directory of Open Access Journals (Sweden)

    Matthias J Wieser

    2012-11-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals' emotions and social intentions. According to the predominant basic emotion approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, decontextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual's face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model of face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in...

  10. Cross-modal perception (face and voice) in emotions. ERPs and behavioural measures

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2007-04-01

    Emotion decoding constitutes a case of multimodal processing of cues from multiple channels. Previous behavioural and neuropsychological studies indicated that, when we have to decode emotions on the basis of multiple perceptive information, cross-modal integration takes place. The present study investigates the simultaneous processing of emotional tone of voice and emotional facial expression by event-related potentials (ERPs), across an ample range of emotions (happiness, sadness, fear, anger, surprise, and disgust). Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual patterns (emotional facial expressions) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Subjects (N=30) were required to process the stimuli and to indicate their comprehension (by stimpad). ERP variations and behavioural data (response times, RTs) were submitted to repeated-measures analysis of variance (ANOVA). We considered two time intervals (150-250 ms and 250-350 ms post-stimulus) in order to explore the ERP variations. The ANOVA showed two different ERP effects: a negative deflection (N2), more anteriorly distributed (Fz), and a positive deflection (P2), more posteriorly distributed, with different cognitive functions. N2 may be considered a marker of emotional content (sensitive to the type of emotion), whereas P2 may represent a cross-modal integration marker, as it varied as a function of the congruous/incongruous condition, showing a higher peak for congruous than for incongruous stimuli. Finally, RTs were reduced in the congruous condition for some emotion types (i.e., sadness), with an inverted effect for other emotions (i.e., fear, anger, and surprise).

  11. Cognitive emotion regulation in children: Reappraisal of emotional faces modulates neural source activity in a frontoparietal network.

    Science.gov (United States)

    Wessing, Ida; Rehbein, Maimu A; Romer, Georg; Achtergarde, Sandra; Dobel, Christian; Zwitserlood, Pienie; Fürniss, Tilman; Junghöfer, Markus

    2015-06-01

    Emotion regulation plays an important role in child development and psychopathology. Reappraisal, as a cognitive regulation technique, can be used effectively by children. Moreover, an ERP component known to reflect emotional processing, the late positive potential (LPP), can be modulated by children using reappraisal, and this modulation is also related to children's emotional adjustment. The present study seeks to elucidate the neural generators of such LPP effects. To this end, children aged 8-14 years reappraised emotional faces while neural activity in an LPP time window was estimated using magnetoencephalography-based source localization. Additionally, neural activity was correlated with two indexes of emotional adjustment and with age. Reappraisal reduced activity in the left dorsolateral prefrontal cortex during down-regulation and enhanced activity in the right parietal cortex during up-regulation. Activity in the visual cortex decreased with increasing age, more adaptive emotion regulation, and less anxiety. Results demonstrate that reappraisal changed activity within a frontoparietal network in children. Decreasing activity in the visual cortex with increasing age is suggested to reflect neural maturation. A similar decrease with adaptive emotion regulation and less anxiety implies that better emotional adjustment may be associated with an advance in neural maturation. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. The Perception of Time While Perceiving Dynamic Emotional Faces

    Directory of Open Access Journals (Sweden)

    Wang On Li

    2015-08-01

    Full Text Available Emotion plays an essential role in the perception of time: time is perceived to fly when events are enjoyable, while unenjoyable moments are perceived to drag. Previous studies have reported a time-drag effect when participants are presented with emotional facial expressions, regardless of the emotion presented. This effect can hardly be explained by induced emotion, given the heterogeneous nature of emotional expressions. We conducted two experiments (n=44 and n=39) to examine the cognitive mechanism underlying this effect by presenting dynamic sequences of emotional expressions to participants. Each sequence started with a particular expression and then morphed into another. Presenting dynamic facial expressions allows the time-drag effect of homogeneous pairs of emotional expressions sharing similar valence and arousal to be compared with that of heterogeneous pairs. Sequences of seven durations (400 ms, 600 ms, 800 ms, 1,000 ms, 1,200 ms, 1,400 ms, and 1,600 ms) were presented to participants, who were asked to judge whether each sequence was closer to 400 ms or 1,600 ms in a two-alternative forced-choice task. The data were then collated by condition and fit with cumulative Gaussian curves to estimate the point of subjective equivalence, indicating the perceived duration of 1,000 ms. Consistent with previous reports, a feeling of time dragging was induced regardless of the sequence presented, such that a 1,000 ms sequence was perceived as lasting longer than 1,000 ms. In addition, dynamic facial expressions exerted a greater effect on perceived time drag than static expressions. The effect was most prominent when the dynamics involved an angry face or a change in valence. The significance of this sensitivity is discussed in terms of emotion perception and its evolutionary significance for our attention mechanism.
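    The psychometric analysis described in this record, fitting a cumulative Gaussian to the proportion of "closer to 1,600 ms" responses across the seven durations and reading off the point of subjective equivalence (PSE), can be sketched as below. The response proportions are invented for illustration; a PSE below 1,000 ms corresponds to the reported time-drag effect (durations feel longer than they are).

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def cumulative_gaussian(duration, mu, sigma):
        """Probability of a 'closer to 1,600 ms' response at a given duration.

        mu is the point of subjective equivalence (PSE); sigma reflects precision.
        """
        return norm.cdf(duration, loc=mu, scale=sigma)

    # The seven sequence durations (ms) and illustrative proportions of 'long'
    # responses (synthetic, not the study's data).
    durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])
    p_long = np.array([0.02, 0.10, 0.35, 0.62, 0.85, 0.95, 0.99])

    (pse, sigma), _ = curve_fit(cumulative_gaussian, durations, p_long,
                                p0=[1000.0, 200.0])
    print(f"PSE ~ {pse:.0f} ms, sigma ~ {sigma:.0f} ms")
    ```

    With these illustrative proportions the fitted PSE falls below 1,000 ms: a physically 1,000 ms sequence is judged "long" more than half the time, i.e. time drags.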

  13. The right place at the right time: priming facial expressions with emotional face components in developmental visual agnosia.

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-04-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Interactions among the effects of head orientation, emotional expression, and physical attractiveness on face preferences.

    Science.gov (United States)

    Main, Julie C; DeBruine, Lisa M; Little, Anthony C; Jones, Benedict C

    2010-01-01

    Previous studies have shown that preferences for direct versus averted gaze are modulated by emotional expressions and physical attractiveness. For example, preferences for direct gaze are stronger when judging happy or physically attractive faces than when judging disgusted or physically unattractive faces. Here we show that preferences for front versus three-quarter views of faces, in which gaze direction was always congruent with head orientation, are also modulated by emotional expressions and physical attractiveness; participants demonstrated preferences for front views of faces over three-quarter views of faces when judging the attractiveness of happy, physically attractive individuals, but not when judging the attractiveness of relatively unattractive individuals or those with disgusted expressions. Moreover, further analyses indicated that these interactions did not simply reflect differential perceptions of the intensity of the emotional expressions shown in each condition. Collectively, these findings present novel evidence that the effect of the direction of the attention of others on attractiveness judgments is modulated by cues to the physical attractiveness and emotional state of the depicted individual, potentially reflecting psychological adaptations for efficient allocation of social effort. These data also present the first behavioural evidence that the effect of the direction of the attention of others on attractiveness judgments reflects viewer-referenced, rather than face-referenced, coding and/or processing of gaze direction.

  15. Emotion Processes in Knowledge Revision

    Science.gov (United States)

    Trevors, Gregory J.; Kendeou, Panayiota; Butterfuss, Reese

    2017-01-01

    In recent years, a number of insights have been gained into the cognitive processes that explain how individuals overcome misconceptions and revise their previously acquired incorrect knowledge. The current study complements this line of research by investigating the moment-by-moment emotion processes that occur during knowledge revision using a…

  16. Pretreatment Differences in BOLD Response to Emotional Faces Correlate with Antidepressant Response to Scopolamine.

    Science.gov (United States)

    Furey, Maura L; Drevets, Wayne C; Szczepanik, Joanna; Khanna, Ashish; Nugent, Allison; Zarate, Carlos A

    2015-03-28

    Faster-acting antidepressants and biomarkers that predict treatment response are needed to facilitate the development of more effective treatments for patients with major depressive disorder. Here, we evaluate implicitly and explicitly processed emotional faces using neuroimaging to identify potential biomarkers of treatment response to the antimuscarinic, scopolamine. Healthy participants (n=15) and unmedicated patients with major depressive disorder (n=16) participated in a double-blind, placebo-controlled crossover infusion study using scopolamine (4 μg/kg). Before and following scopolamine, blood oxygen-level dependent (BOLD) signal was measured using functional MRI during a selective attention task. Two stimuli comprised of superimposed pictures of faces and houses were presented. Participants attended to one stimulus component and performed a matching task. Face emotion was modulated (happy/sad), creating implicit (attend-houses) and explicit (attend-faces) emotion processing conditions. The pretreatment difference in BOLD response to happy and sad faces under implicit and explicit conditions (emotion processing biases) within a priori regions of interest was correlated with subsequent treatment response in major depressive disorder. Correlations were observed exclusively during implicit emotion processing in the regions of interest, which included the subgenual anterior cingulate. Responses to emotional faces prior to treatment thus reflect the potential to respond to scopolamine. These findings replicate earlier results, highlighting the potential for pretreatment neural activity in the middle occipital cortices and subgenual anterior cingulate to inform us about the potential to respond clinically to scopolamine. Published by Oxford University Press on behalf of CINP 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.
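    A biomarker analysis of the kind this record describes, correlating a pretreatment emotion-processing bias (BOLD response to happy minus sad faces within a region of interest) with subsequent clinical improvement, might be sketched as follows. All numbers are synthetic and the slope of the simulated relationship is made up; the study's data are not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    # Illustrative values for 16 patients: pretreatment emotion-processing bias
    # (BOLD response to happy minus sad faces in an ROI such as the subgenual
    # anterior cingulate) and percent symptom improvement after treatment.
    rng = np.random.default_rng(3)
    bias = rng.normal(0.0, 0.5, size=16)
    improvement = 40 + 25 * bias + rng.normal(0, 10, size=16)

    # A significant correlation would make the pretreatment bias a candidate
    # predictor (biomarker) of treatment response.
    r, p = pearsonr(bias, improvement)
    print(f"r = {r:.2f}, p = {p:.3f}")
    ```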

  17. Emotional face recognition deficit in amnestic patients with mild cognitive impairment: behavioral and electrophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yang L

    2015-08-01

    Full Text Available Linlin Yang, Xiaochuan Zhao, Lan Wang, Lulu Yu, Mei Song, Xueyi Wang Department of Mental Health, The First Hospital of Hebei Medical University, Hebei Medical University Institute of Mental Health, Shijiazhuang, People’s Republic of China Abstract: Amnestic mild cognitive impairment (MCI) has been conceptualized as a transitional stage between healthy aging and Alzheimer’s disease. Thus, understanding the emotional face recognition deficit in patients with amnestic MCI could be useful in determining the progression of amnestic MCI. The purpose of this study was to investigate the features of emotional face processing in amnestic MCI by using event-related potentials (ERPs). Patients with amnestic MCI and healthy controls performed a face recognition task, giving old/new responses to previously studied and novel faces with different emotional messages as the stimulus material. Using the learning-recognition paradigm, the experiments were divided into two steps, ie, a learning phase and a test phase. ERPs were analyzed on electroencephalographic recordings. The behavioral data indicated high emotion classification accuracy for patients with amnestic MCI and for healthy controls. The mean percentage of correct classifications was 81.19% for patients with amnestic MCI and 96.46% for controls. Our ERP data suggest that patients with amnestic MCI were still able to undertake personalized processing for negative faces, but not for neutral or positive faces, in the early frontal processing stage. In the early time window, no differences in the frontal old/new effect were found between patients with amnestic MCI and normal controls. However, in the late time window, the three types of stimuli did not elicit any parietal old/new effects in patients with amnestic MCI, suggesting that their recollection was impaired. This impairment may be closely associated with amnestic MCI disease. We conclude from our data that face recognition processing and emotional memory is

  18. Reading emotions from faces in two indigenous societies.

    Science.gov (United States)

    Crivelli, Carlos; Jarillo, Sergio; Russell, James A; Fernández-Dols, José-Miguel

    2016-07-01

    That all humans recognize certain specific emotions from their facial expression-the Universality Thesis-is a pillar of research, theory, and application in the psychology of emotion. Its most rigorous test occurs in indigenous societies with limited contact with external cultural influences, but such tests are scarce. Here we report 2 such tests. Study 1 was of children and adolescents (N = 68; aged 6-16 years) of the Trobriand Islands (Papua New Guinea, South Pacific) with a Western control group from Spain (N = 113, of similar ages). Study 2 was of children and adolescents (N = 36; same age range) of Matemo Island (Mozambique, Africa). In both studies, participants were shown an array of prototypical facial expressions and asked to point to the person feeling a specific emotion: happiness, fear, anger, disgust, or sadness. The Spanish control group matched faces to emotions as predicted by the Universality Thesis: matching was seen on 83% to 100% of trials. For the indigenous societies, in both studies, the Universality Thesis was moderately supported for happiness: smiles were matched to happiness on 58% and 56% of trials, respectively. For other emotions, however, results were even more modest: 7% to 46% in the Trobriand Islands and 22% to 53% in Matemo Island. These results were robust across age, gender, static versus dynamic display of the facial expressions, and between- versus within-subjects design. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  19. The Right Place at the Right Time: Priming Facial Expressions with Emotional Face Components in Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-01-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG’s impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face’s emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG’s performance was strongly influenced by the diagnosticity of the components: His emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. PMID:22349446

  20. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Glerup, L; Vestbo, C

    2015-01-01

    BACKGROUND: Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression. METHOD: Thirty...... while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping strategies. RESULTS: High-risk twins showed increased neural response to happy and fearful faces...... processing. These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attention vigilance for fearful faces and were slower at recognizing facial expressions relative to low......

  1. Human versus Non-Human Face Processing: Evidence from Williams Syndrome

    Science.gov (United States)

    Santos, Andreia; Rosset, Delphine; Deruelle, Christine

    2009-01-01

    Increased motivation towards social stimuli in Williams syndrome (WS) led us to hypothesize that a face's human status would have greater impact than face's orientation on WS' face processing abilities. Twenty-nine individuals with WS were asked to categorize facial emotion expressions in real, human cartoon and non-human cartoon faces presented…

  2. Are reading and face processing related?

    DEFF Research Database (Denmark)

    Starrfelt, Randi; Klargaard, Solja; Petersen, Anders

    2015-01-01

    Traditionally, perceptual processing of faces and words is considered highly specialized, strongly lateralized, and largely independent. This has, however, recently been challenged by studies showing that learning to read may affect the perceptual and neural processes involved in face recognition......, a lower perceptual threshold, and higher processing speed for words compared to letters. In sum, we find no evidence that reading skills are abnormal in developmental prosopagnosia, a finding that may challenge the recently proposed hypothesis that reading development and face processing abilities...

  3. Functional Brain Activation to Emotional and non-Emotional Faces in Healthy Children: Evidence for Developmentally Undifferentiated Amygdala Function During the School Age Period

    Science.gov (United States)

    Pagliaccio, David; Luby, Joan L.; Gaffrey, Michael S.; Belden, Andrew C.; Botteron, Kelly N.; Harms, Michael P.; Barch, Deanna M.

    2013-01-01

    The amygdala is a key region in emotion processing. In particular, fMRI studies have demonstrated that the amygdala is active during the viewing of emotional faces. Previous research has consistently found greater amygdala responses to fearful faces as compared to neutral faces in adults, convergent with a focus in the animal literature on the amygdala's role in fear processing. Studies have found that the amygdala also responds differentially to other facial emotion types in adults. Yet, the literature regarding when this differential amygdala responsivity develops is limited and mixed. Thus, the goal of the current study was to examine amygdala responses to emotional and neutral faces in a relatively large sample of healthy school-age children (N = 52). While the amygdala was active in response to emotional and neutral faces, the results do not support the hypothesis that the amygdala responds differentially to emotional faces in 7- to 12-year-old children. Nonetheless, amygdala activity was correlated with the severity of subclinical depression symptoms and with emotion regulation skills. Additionally, sex differences were observed in frontal, temporal, and visual regions, as well as effects of pubertal development in visual regions. These findings suggest important differences in amygdala reactivity in childhood. PMID:23636982

  4. Emotional language processing in autism spectrum disorders: a systematic review.

    Science.gov (United States)

    Lartseva, Alina; Dijkstra, Ton; Buitelaar, Jan K

    2014-01-01

    In his first description of Autism Spectrum Disorders (ASD), Kanner emphasized emotional impairments by characterizing children with ASD as indifferent to other people, self-absorbed, emotionally cold, distanced, and retracted. Thereafter, emotional impairments became regarded as part of the social impairments of ASD, and research mostly focused on understanding how individuals with ASD recognize visual expressions of emotions from faces and body postures. However, it still remains unclear how emotions are processed outside of the visual domain. This systematic review aims to fill this gap by focusing on impairments of emotional language processing in ASD. We systematically searched PubMed for papers published between 1990 and 2013 using standardized search terms. Studies show that people with ASD are able to correctly classify emotional language stimuli as emotionally positive or negative. However, processing of emotional language stimuli in ASD is associated with atypical patterns of attention and memory performance, as well as abnormal physiological and neural activity. Particularly, younger children with ASD have difficulties in acquiring and developing emotional concepts, and avoid using these in discourse. These emotional language impairments were not consistently associated with age, IQ, or level of development of language skills. We discuss how emotional language impairments fit with existing cognitive theories of ASD, such as central coherence, executive dysfunction, and weak Theory of Mind. We conclude that emotional impairments in ASD may be broader than just a mere consequence of social impairments, and should receive more attention in future research.

  5. Emotional language processing in Autism Spectrum Disorders: A systematic review

    Directory of Open Access Journals (Sweden)

    Alina eLartseva

    2015-01-01

    Full Text Available In his first description of Autism Spectrum Disorders (ASD), Kanner emphasized emotional impairments by characterizing children with ASD as indifferent to other people, self-absorbed, emotionally cold, distanced, and retracted. Thereafter, emotional impairments became regarded as part of the social impairments of ASD, and research mostly focused on understanding how individuals with ASD recognize visual expressions of emotions from faces and body postures. However, it still remains unclear how emotions are processed outside of the visual domain. This systematic review aims to fill this gap by focusing on impairments of emotional language processing in ASD. We systematically searched PubMed for papers published between 1990 and 2013 using standardized search terms. Studies show that people with ASD are able to correctly classify emotional language stimuli as emotionally positive or negative. However, processing of emotional language stimuli in ASD is associated with atypical patterns of attention and memory performance, as well as abnormal physiological and neural activity. Particularly, younger children with ASD have difficulties in acquiring and developing emotional concepts, and avoid using these in discourse. These emotional language impairments were not consistently associated with age, IQ, or level of development of language skills. We discuss how emotional language impairments fit with existing cognitive theories of ASD, such as central coherence, executive dysfunction, and weak Theory of Mind. We conclude that emotional impairments in ASD may be broader than just a mere consequence of social impairments, and should receive more attention in future research.

  6. Emotional language processing in autism spectrum disorders: a systematic review

    Science.gov (United States)

    Lartseva, Alina; Dijkstra, Ton; Buitelaar, Jan K.

    2015-01-01

    In his first description of Autism Spectrum Disorders (ASD), Kanner emphasized emotional impairments by characterizing children with ASD as indifferent to other people, self-absorbed, emotionally cold, distanced, and retracted. Thereafter, emotional impairments became regarded as part of the social impairments of ASD, and research mostly focused on understanding how individuals with ASD recognize visual expressions of emotions from faces and body postures. However, it still remains unclear how emotions are processed outside of the visual domain. This systematic review aims to fill this gap by focusing on impairments of emotional language processing in ASD. We systematically searched PubMed for papers published between 1990 and 2013 using standardized search terms. Studies show that people with ASD are able to correctly classify emotional language stimuli as emotionally positive or negative. However, processing of emotional language stimuli in ASD is associated with atypical patterns of attention and memory performance, as well as abnormal physiological and neural activity. Particularly, younger children with ASD have difficulties in acquiring and developing emotional concepts, and avoid using these in discourse. These emotional language impairments were not consistently associated with age, IQ, or level of development of language skills. We discuss how emotional language impairments fit with existing cognitive theories of ASD, such as central coherence, executive dysfunction, and weak Theory of Mind. We conclude that emotional impairments in ASD may be broader than just a mere consequence of social impairments, and should receive more attention in future research. PMID:25610383

  7. Emotion Processing for Arousal and Neutral Content in Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Corina Satler

    2009-01-01

    Full Text Available Objective. To assess the ability of Alzheimer's disease (AD) patients to perceive emotional information and to assign subjective emotional rating scores to audiovisual presentations. Materials and Methods. 24 subjects (14 with AD, matched to controls for age and educational levels) were studied. After neuropsychological assessment, they watched a Neutral story and then a story with Emotional content. Results. Recall scores for both stories were significantly lower in AD (Neutral and Emotional: P=.001). CG assigned different emotional scores for each version of the test, P=.001, while ratings of AD did not differ, P=.32. Linear regression analyses determined the best predictors of emotional rating and recognition memory for each group among the neuropsychological test battery. Conclusions. AD patients show changes in emotional processing on declarative memory and a preserved ability to express emotions in the face of arousal content. The present findings suggest that these impairments are due to general cognitive decline.

  8. Emotion processing for arousal and neutral content in Alzheimer's disease.

    Science.gov (United States)

    Satler, Corina; Uribe, Carlos; Conde, Carlos; Da-Silva, Sergio Leme; Tomaz, Carlos

    2010-02-01

    Objective. To assess the ability of Alzheimer's disease (AD) patients to perceive emotional information and to assign subjective emotional rating scores to audiovisual presentations. Materials and Methods. 24 subjects (14 with AD, matched to controls for age and educational levels) were studied. After neuropsychological assessment, they watched a Neutral story and then a story with Emotional content. Results. Recall scores for both stories were significantly lower in AD (Neutral and Emotional: P = .001). CG assigned different emotional scores for each version of the test, P = .001, while ratings of AD did not differ, P = .32. Linear regression analyses determined the best predictors of emotional rating and recognition memory for each group among the neuropsychological test battery. Conclusions. AD patients show changes in emotional processing on declarative memory and a preserved ability to express emotions in the face of arousal content. The present findings suggest that these impairments are due to general cognitive decline.

  9. Abnormal early gamma responses to emotional faces differentiate unipolar from bipolar disorder patients.

    Science.gov (United States)

    Liu, T Y; Chen, Y S; Su, T P; Hsieh, J C; Chen, L F

    2014-01-01

    This study investigates the cortical abnormalities of early emotion perception in patients with major depressive disorder (MDD) and bipolar disorder (BD) using gamma oscillations. Twenty-three MDD patients, twenty-five BD patients, and twenty-four normal controls were enrolled, and their event-related magnetoencephalographic responses were recorded during implicit emotional tasks. Our results demonstrated abnormal gamma activity within 100 ms in emotion-related regions (amygdala, orbitofrontal cortex (OFC), anterior insula (AI), and superior temporal pole) in the MDD patients, suggesting that these patients may have dysfunctions or negativity biases in perceptual binding of emotional features at a very early stage. Decreased left superior medial frontal cortex (smFC) responses to happy faces in the MDD patients were correlated with the severity of their depressive symptoms, indicating that decreased smFC activity perhaps underlies irregular positive emotion processing in depressed patients. In the BD patients, we showed abnormal activation in visual regions (inferior/middle occipital and middle temporal cortices) in response to emotional faces within 100 ms, supporting the idea that BD patients may respond hyperactively to emotional features in perceptual binding. The discriminant function of gamma activation in the left smFC, right medial OFC, right AI/inferior OFC, and right precentral cortex accurately classified 89.6% of patients as having unipolar or bipolar disorder.
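    A discriminant classification of the kind reported in this record, separating unipolar from bipolar patients using regional gamma activation as features, can be sketched with linear discriminant analysis. The feature values below are synthetic and the group separations are made up; only the group sizes and region names follow the abstract.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Synthetic gamma activation in the four regions named in the study
    # (left smFC, right medial OFC, right AI/inferior OFC, right precentral
    # cortex) for 23 MDD and 25 BD patients; all values are illustrative.
    rng = np.random.default_rng(1)
    mdd = rng.normal(loc=[0.2, 0.1, 0.3, 0.0], scale=0.3, size=(23, 4))
    bd = rng.normal(loc=[-0.2, -0.1, -0.3, 0.4], scale=0.3, size=(25, 4))
    X = np.vstack([mdd, bd])
    y = np.array([0] * 23 + [1] * 25)  # 0 = unipolar (MDD), 1 = bipolar (BD)

    # Cross-validated accuracy of the linear discriminant function.
    lda = LinearDiscriminantAnalysis()
    scores = cross_val_score(lda, X, y, cv=5)
    print(f"mean CV accuracy: {scores.mean():.2f}")
    ```

    Cross-validation is used here rather than in-sample accuracy, since the latter (as in many clinical classification reports) tends to overstate how well the discriminant function would generalize.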

  10. Veiled emotions: the effect of covered faces on emotion perception and attitudes

    NARCIS (Netherlands)

    Fischer, A.H.; Gillebaart, M.; Rotteveel, M.; Becker, D.; Vliek, M.

    2012-01-01

    The present study explores the relative absence of expressive cues and the effect of contextual cues on the perception of emotions and its effect on attitudes. The visibility of expressive cues was manipulated by showing films displaying female targets whose faces were either fully visible, covered

  11. Brain and behavioral inhibitory control of kindergartners facing negative emotions.

    Science.gov (United States)

    Farbiash, Tali; Berger, Andrea

    2016-09-01

    Inhibitory control (IC) - one of the most critical functions underlying a child's ability to self-regulate - develops significantly throughout the kindergarten years. Experiencing negative emotions imposes challenges on executive functioning and may specifically affect IC. In this study, we examined kindergartners' IC and its related brain activity during a negative emotional situation: 58 children (aged 5.5-6.5 years) performed an emotion-induction Go/NoGo task. During this task, we recorded children's performance and brain activity, focusing on the fronto-central N2 component in the event-related potential (ERP) and the power of its underlying theta frequency. Compared to Go trials, inhibition of NoGo trials was associated with larger N2 amplitudes and theta power. The negative emotional experience resulted in better IC performance and, at the brain level, in larger theta power. Source localization of this effect showed that the brain activity related to IC during the negative emotional experience was principally generated in the posterior frontal regions. Furthermore, the band power measure was found to be a more sensitive index for children's inhibitory processes than N2 amplitudes. This is the first study to focus on kindergartners' IC while manipulating their emotional experience to induce negative emotions. Our findings suggest that a kindergartner's experience of negative emotion can result in improved IC and increases in associated aspects of brain activity. Our results also suggest the utility of time-frequency analyses in the study of brain processes associated with response inhibition in young children. © 2015 John Wiley & Sons Ltd.
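    The theta-power measure this record relies on, the power of an EEG epoch within the 4-8 Hz band, can be sketched with a Welch power spectral density estimate. The sampling rate and test signal below are illustrative assumptions, not the study's recording parameters.

    ```python
    import numpy as np
    from scipy.signal import welch

    def theta_band_power(epoch, fs, band=(4.0, 8.0)):
        """Average power spectral density of `epoch` within the theta band.

        epoch: 1-D EEG signal; fs: sampling rate in Hz.
        """
        freqs, psd = welch(epoch, fs=fs, nperseg=min(len(epoch), fs))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    # Illustrative check: a 6 Hz sinusoid embedded in noise carries more theta
    # power than the noise alone (signal parameters are made up for the demo).
    fs = 250
    t = np.arange(0, 2.0, 1 / fs)
    rng = np.random.default_rng(2)
    noise = rng.normal(0, 1, t.size)
    theta_signal = np.sin(2 * np.pi * 6 * t) + noise
    print(theta_band_power(theta_signal, fs) > theta_band_power(noise, fs))  # True
    ```

    Comparing such band-power values between Go and NoGo trials (and between emotional conditions) is one way to obtain the time-frequency sensitivity the authors report for theta relative to N2 amplitudes.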

  12. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression.

    Science.gov (United States)

    Miskowiak, K W; Glerup, L; Vestbo, C; Harmer, C J; Reinecke, A; Macoveanu, J; Siebner, H R; Kessing, L V; Vinberg, M

    2015-05-01

    Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression. Thirty healthy, never-depressed monozygotic (MZ) twins with a co-twin history of depression (high risk group: n = 13) or without co-twin history of depression (low-risk group: n = 17) were enrolled in a functional magnetic resonance imaging (fMRI) study. During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping strategies. High-risk twins showed increased neural response to happy and fearful faces in dorsal anterior cingulate cortex (ACC), dorsomedial prefrontal cortex (dmPFC), pre-supplementary motor area and occipito-parietal regions compared to low-risk twins. They also displayed stronger negative coupling between amygdala and pregenual ACC, dmPFC and temporo-parietal regions during emotional face processing. These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attention vigilance for fearful faces and were slower at recognizing facial expressions relative to low-risk controls. These effects occurred in the absence of differences between groups in mood, subjective state or coping. Different neural response and functional connectivity within fronto-limbic and occipito-parietal regions during emotional face processing and enhanced fear vigilance may be key endophenotypes for depression.

  13. Neural activation to emotional faces in adolescents with autism spectrum disorders.

    Science.gov (United States)

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S

    2011-03-01

    Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and striatum, three structures involved in socio-emotional processing in adolescents with ASD. Twenty-two adolescents with ASD and 20 healthy adolescents viewed facial expressions (happy, fearful, sad and neutral) that were briefly presented (250 ms) during functional MRI acquisition. To monitor attention, subjects pressed a button to identify the gender of each face. The ASD group showed greater activation to the faces relative to the control group in the amygdala, vPFC and striatum. Follow-up analyses indicated that the ASD relative to control group showed greater activation in the amygdala, vPFC and striatum during the gender identification task. When group differences in attention to facial expressions were limited, adolescents with ASD showed greater activation in structures involved in socio-emotional processing. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  14. Linking children's neuropsychological processing of emotion with their knowledge of emotion expression regulation.

    Science.gov (United States)

    Watling, Dawn; Bourne, Victoria J

    2007-09-01

    Understanding of emotions has been shown to develop between the ages of 4 and 10 years; however, individual differences exist in this development. While previous research has typically examined these differences in terms of developmental and/or social factors, little research has considered the possible impact of neuropsychological development on the behavioural understanding of emotions. Emotion processing tends to be lateralised to the right hemisphere of the brain in adults, yet this pattern is not as evident in children until around the age of 10 years. In this study, 136 children aged between 5 and 10 years were given both behavioural and neuropsychological tests of emotion processing. The behavioural task examined expression regulation knowledge (ERK) for prosocial and self-presentational hypothetical interactions. The chimeric faces test was given as a measure of lateralisation for processing positive facial emotion. An interaction between age and lateralisation for emotion processing was predictive of children's ERK for only the self-presentational interactions. The relationship between children's ERK and lateralisation for emotion processing changed across the three age groups, emerging as a positive relationship in the 10-year-olds. The 10-year-olds who were more lateralised to the right hemisphere for emotion processing tended to show greater understanding of the need for regulating negative emotions during interactions that would have a self-presentational motivation. This finding suggests an association between the behavioural and neuropsychological development of emotion processing.

  15. Implicit conditioning of faces via the social regulation of emotion: ERP evidence of early attentional biases for security conditioned faces.

    Science.gov (United States)

    Beckes, Lane; Coan, James A; Morris, James P

    2013-08-01

    Not much is known about the neural and psychological processes that promote the initial conditions necessary for positive social bonding. This study explores one method of conditioned bonding utilizing dynamics related to the social regulation of emotion and attachment theory. This form of conditioning involves repeated presentations of negative stimuli followed by images of warm, smiling faces. L. Beckes, J. Simpson, and A. Erickson (2010) found that this conditioning procedure results in positive associations with the faces measured via a lexical decision task, suggesting they are perceived as comforting. This study found that the P1 ERP was similarly modified by this conditioning procedure and the P1 amplitude predicted lexical decision times to insecure words primed by the faces. The findings have implications for understanding how the brain detects supportive people, the flexibility and modifiability of early ERP components, and social bonding more broadly. Copyright © 2013 Society for Psychophysiological Research.

  16. Dysregulation in cortical reactivity to emotional faces in PTSD patients with high dissociation symptoms

    Directory of Open Access Journals (Sweden)

    Aleksandra Klimova

    2013-09-01

    Full Text Available Background: Predominant dissociation in posttraumatic stress disorder (PTSD) is characterized by restricted affective responses to positive stimuli. To date, no studies have examined neural responses to a range of emotional expressions in PTSD with high dissociative symptoms. Objective: This study tested the hypothesis that PTSD patients with high dissociative symptoms would display increased event-related potential (ERP) amplitudes in early components (N1, P1) to threatening faces (angry, fearful), and reduced later ERP amplitudes (Vertex Positive Potential (VPP), P3) to happy faces, compared to PTSD patients with low dissociative symptoms. Methods: Thirty-nine civilians with PTSD were classified as high dissociative (n=16) or low dissociative (n=23) according to their responses on the Clinician Administered Dissociative States Scale. ERPs were recorded whilst participants viewed emotional (happy, angry, fear) and neutral facial expressions in a passive viewing task. Results: High dissociative PTSD patients displayed significantly increased N120 amplitude to the majority of facial expressions (neutral, happy, and angry) compared to low dissociative PTSD patients under conscious and preconscious conditions. The high dissociative PTSD group had significantly reduced VPP amplitude to happy faces in the conscious condition. Conclusion: High dissociative PTSD patients displayed increased early (preconscious) cortical responses to emotional stimuli, and specific reductions to happy facial expressions in later (conscious), face-specific components, compared to low dissociative PTSD patients. Dissociation in PTSD may act to increase initial pre-attentive processing of affective stimuli, and specifically reduce cortical reactivity to happy faces when consciously processing these stimuli.

  17. Functional connectivity of emotional processing in depression.

    LENUS (Irish Health Repository)

    Carballedo, Angela

    2012-02-01

    OBJECTIVES: The aim of the study is to map a neural network of emotion processing and to identify differences in major depression compared to healthy controls. It is hypothesized that intentional perception of emotional faces activates connections between the amygdala (AMY), orbitofrontal cortex (OFC), anterior cingulate cortex (ACC) and prefrontal cortex (PFC), and that frontal-amygdala connections are altered in major depressive disorder (MDD). METHODS: Fifteen medication-free patients with MDD and fifteen healthy controls were enrolled. All subjects were assessed using the same face-matching functional Magnetic Resonance Imaging (fMRI) task, known to involve those areas. Brain activations were obtained using Statistical Parametric Mapping version 5 (SPM5) for data analysis and MARSBAR for extraction of fMRI time series. The data were then analyzed using structural equation modeling (SEM). RESULTS: A valid model was established for the left and the right hemispheres, showing a circuit involving ACC, OFC, PFC and AMY. The left hemisphere shows significantly lower connectivity strengths in patients than controls for the pathway that goes from AMY to OF11, and a trend towards higher connectivity in patients for the path that goes from PF9 to OF11. In the right hemisphere, patients show lower connectivity coefficients in the paths from AMY to OF11, from AMY to ACC, and from ACC to PF9. On the contrary, controls show lower connectivity strengths for the path that goes from ACC to AMY. CONCLUSIONS: Functional disconnection between limbic and frontal brain regions could be demonstrated using structural equation modeling.
The interpretation of these findings could be that there is an emotional processing bias with disconnection bilaterally between amygdala to orbitofrontal cortices and in addition a right disconnection between amygdala and ACC as well as between ACC and prefrontal cortex possibly in line with a more prominent role for the right hemisphere

  18. Attention Modulates the Neural Processes Underlying Multisensory Integration of Emotion

    Directory of Open Access Journals (Sweden)

    Hao Tam Ho

    2011-10-01

    Full Text Available Integrating emotional information from multiple sensory modalities is generally assumed to be a pre-attentive process (de Gelder et al., 1999). This assumption, however, presupposes that the integrative process occurs independent of attention. Using event-related potentials (ERPs), the present study investigated whether the neural processes underlying the integration of dynamic facial expression and emotional prosody are indeed unaffected by attentional manipulations. To this end, participants were presented with congruent and incongruent face-voice combinations (e.g., an angry face combined with a neutral voice) and performed different two-choice tasks in four consecutive blocks. Three of the tasks directed the participants' attention to emotion expressions in the face, the voice or both. The fourth task required participants to attend to the synchronicity between voice and lip movements. The results show divergent modulations of early ERP components by the different attentional manipulations. For example, when attention was directed to the face (or the voice), incongruent stimuli elicited a reduced N1 as compared to congruent stimuli. This effect was absent when attention was diverted away from the emotionality in both face and voice, suggesting that the detection of emotional incongruence already requires attention. Based on these findings, we question whether multisensory integration of emotion does indeed occur pre-attentively.

  19. Image-based Analysis of Emotional Facial Expressions in Full Face Transplants.

    Science.gov (United States)

    Bedeloglu, Merve; Topcu, Çagdas; Akgul, Arzu; Döger, Ela Naz; Sever, Refik; Ozkan, Ozlenen; Ozkan, Omer; Uysal, Hilmi; Polat, Ovunc; Çolak, Omer Halil

    2018-01-20

    In this study, the aim was to determine, from photographs, the degree of development in the emotional expression of full face transplant patients, so that a rehabilitation process can later be planned according to these degrees. As envisaged, in full face transplant cases the determination of expressions can be confused or may not be achieved as in the healthy control group. For the image-based analysis, a control group of 9 healthy males and 2 full-face transplant patients participated in the study. Appearance-based Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP) methods were adopted for recognizing the neutral expression and 6 emotional expressions: angry, scared, happy, hate, confused and sad. Feature extraction was carried out using each method separately and a serial combination of the two. The extracted features of the most distinctive zones of the facial area, the eye and mouth regions, were used to classify the emotions, and the combination of these region features was used to improve classifier performance. Control subjects' and transplant patients' ability to perform emotional expressions was determined with a K-nearest neighbor (KNN) classifier with region-specific and method-specific decision stages, and the results were compared with the healthy group. It was observed that transplant patients do not reflect some emotional expressions, and that there were confusions among expressions.
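
    The LBP-and-KNN portion of a pipeline like this can be sketched as follows. This is a simplified illustration, not the authors' implementation: a basic radius-1, 8-neighbour LBP code map, a normalized code histogram as the feature vector, and a plain nearest-neighbour vote. The Gabor wavelet features and the region- and method-specific decision stages are omitted.

```python
import numpy as np

def lbp_3x3(img):
    """Minimal 8-neighbour Local Binary Pattern codes for the interior
    pixels of a 2-D grayscale image (radius 1, no interpolation)."""
    c = img[1:-1, 1:-1]
    # Neighbour offsets in clockwise order starting from top-left.
    shifts = [(0, 0), (0, 1), (0, 2), (1, 2),
              (2, 2), (2, 1), (2, 0), (1, 0)]
    codes = np.zeros(c.shape, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        nb = img[dy:dy + c.shape[0], dx:dx + c.shape[1]]
        codes |= (nb >= c).astype(np.uint8) << bit
    return codes

def lbp_histogram(img):
    """Normalized 256-bin histogram of LBP codes: the feature vector."""
    h, _ = np.histogram(lbp_3x3(img), bins=256, range=(0, 256))
    return h / h.sum()

def knn_predict(train_feats, train_labels, feat, k=1):
    """Majority vote among the k nearest training histograms."""
    d = np.linalg.norm(np.asarray(train_feats) - feat, axis=1)
    nearest = [train_labels[i] for i in np.argsort(d)[:k]]
    return max(set(nearest), key=nearest.count)
```

    A face image, or an eye/mouth crop, would be passed through `lbp_histogram` and classified against labeled training histograms with `knn_predict`.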

  20. Emotional memory and perception of emotional faces in patients suffering from depersonalization disorder.

    NARCIS (Netherlands)

    Montagne, B.; Sierra, M.; Medford, N.; Hunter, E.; Baker, D.J.; Kessels, R.P.C.; Haan, E.H.F. de; David, A.S.

    2007-01-01

    Previous work has shown that patients with depersonalization disorder (DPD) have reduced physiological responses to emotional stimuli, which may be related to subjective emotional numbing. This study investigated two aspects of affective processing in 13 patients with DPD according to the DSM-IV

  1. Gender differences in the recognition of emotional faces: are men less efficient?

    Directory of Open Access Journals (Sweden)

    Ana Ruiz-Ibáñez

    2017-06-01

    Full Text Available As research on the recollection of stimuli with emotional valence indicates, emotions influence memory. Many studies of face and emotional facial expression recognition have focused on differences associated with age (young and old people) and gender (men and women). Nevertheless, such studies have produced contradictory results, so gender involvement needs to be studied in greater depth. The main objective of our research was to analyze differences in the recognition of images of faces with emotional facial expressions between two groups of university students aged 18-30, the first composed of men and the second of women. The results showed statistically significant differences in corrected face recognition (hit rate - false alarm rate): the women demonstrated better recognition than the men. However, other analyzed variables, such as time or efficiency, did not provide conclusive results. Furthermore, a significant negative correlation between the time used and efficiency on the task was found in the male group. This information reinforces not only the hypothesis of a gender difference in face recognition, in favor of women, but also those hypotheses that suggest different cognitive processing of facial stimuli in the two sexes. Finally, we argue the need for further research on variables such as age or sociocultural level.
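
    The corrected recognition score used above is simply the hit rate minus the false-alarm rate; the counts in the example below are invented for illustration.

```python
def corrected_recognition(hits, n_old, false_alarms, n_new):
    """Hit rate minus false-alarm rate for an old/new recognition test."""
    return hits / n_old - false_alarms / n_new

# e.g. 42 hits on 50 old faces and 8 false alarms on 40 new faces:
# 42/50 - 8/40 = 0.84 - 0.20 = 0.64
score = corrected_recognition(42, 50, 8, 40)
```

    Subtracting the false-alarm rate matters because a participant who simply says "old" to everything would otherwise score a perfect hit rate.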

  2. Recognition memory for low- and high-frequency-filtered emotional faces: Low spatial frequencies drive emotional memory enhancement, whereas high spatial frequencies drive the emotion-induced recognition bias.

    Science.gov (United States)

    Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk

    2017-07-01

    This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement-that is, better long-term memory for emotional than for neutral stimuli-and the emotion-induced recognition bias-that is, a more liberal response criterion for emotional than for neutral stimuli. Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role for the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus features account. The double dissociation in the results favors the latter account-that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.
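
    The HSF/LSF manipulation can be sketched with a Fourier-domain split. Stimulus-preparation pipelines in this literature typically use smooth (e.g. Gaussian or Butterworth) filters with cutoffs expressed in cycles per face; the hard circular cutoff below is an illustrative simplification.

```python
import numpy as np

def sf_filter(img, cutoff, keep="low"):
    """Keep only spatial frequencies below (keep="low") or above
    (keep="high") `cutoff`, measured in cycles per image, using a
    hard circular mask in the shifted Fourier domain."""
    spec = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
    radius = np.hypot(yy, xx)  # radial frequency in cycles/image
    mask = radius <= cutoff if keep == "low" else radius > cutoff
    return np.real(np.fft.ifft2(np.fft.ifftshift(spec * mask)))
```

    Because the two masks partition the spectrum exactly, the LSF and HSF versions of an image sum back to the original.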

  3. Processing of unattended facial emotions: a visual mismatch negativity study.

    Science.gov (United States)

    Stefanics, Gábor; Csukly, Gábor; Komlósi, Sarolta; Czobor, Pál; Czigler, István

    2012-02-01

    Facial emotions express our internal states and are fundamental in social interactions. Here we explore whether the repetition of unattended facial emotions builds up a predictive representation of frequently encountered emotions in the visual system. Participants (n=24) were presented peripherally with facial stimuli expressing emotions while they performed a visual detection task presented in the center of the visual field. Facial stimuli consisted of four faces of different identity, but expressed the same emotion (happy or fearful). Facial stimuli were presented in blocks of oddball sequence (standard emotion: p=0.9, deviant emotion: p=0.1). Event-related potentials (ERPs) to the same emotions were compared when the emotions were deviant and standard, respectively. We found visual mismatch negativity (vMMN) responses to unattended deviant emotions in the 170-360 ms post-stimulus range over bilateral occipito-temporal sites. Our results demonstrate that information about the emotional content of unattended faces presented at the periphery of the visual field is rapidly processed and stored in a predictive memory representation by the visual system. We also found evidence that differential processing of deviant fearful faces starts already at 70-120 ms after stimulus onset. This finding shows a 'negativity bias' under unattended conditions. Differential processing of fearful deviants was more pronounced in the right hemisphere in the 195-275 ms and 360-390 ms intervals, whereas processing of happy deviants evoked a larger differential response in the left hemisphere in the 360-390 ms range, indicating differential hemispheric specialization for automatic processing of positive and negative affect. Copyright © 2011 Elsevier Inc. All rights reserved.
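
    The vMMN itself is obtained as a difference wave: the ERP averaged over deviant trials minus the ERP averaged over the same emotion presented as a standard. A minimal sketch, with epochs stored as (trials x samples) arrays; the numbers below are synthetic, not the study's data.

```python
import numpy as np

def difference_wave(deviant_epochs, standard_epochs):
    """Deviant-minus-standard ERP difference wave.
    Each input is an array of shape (n_trials, n_samples)."""
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

# Synthetic demo: standards are flat; deviants carry a negative
# deflection between samples 100 and 150 (the mismatch window).
standards = np.zeros((90, 300))
deviants = np.zeros((10, 300))
deviants[:, 100:150] = -2.0
dw = difference_wave(deviants, standards)
```

    In a real analysis the epochs would be baseline-corrected and artifact-rejected first, and the vMMN quantified as the mean amplitude of `dw` within the component's latency window.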

  4. Gender differences in hemispheric asymmetry for face processing

    Directory of Open Access Journals (Sweden)

    Matarazzo Silvia

    2006-06-01

    Full Text Available Abstract Background Current cognitive neuroscience models predict a right-hemispheric dominance for face processing in humans. However, neuroimaging and electromagnetic data in the literature provide conflicting evidence of a right-sided brain asymmetry for decoding the structural properties of faces. The purpose of this study was to investigate whether this inconsistency might be due to gender differences in hemispheric asymmetry. Results In this study, event-related brain potentials (ERPs) were recorded in 40 healthy, strictly right-handed individuals (20 women and 20 men) while they observed infants' faces expressing a variety of emotions. Early face-sensitive P1 and N1 responses to neutral vs. affective expressions were measured over the occipital/temporal cortices, and the responses were analyzed according to viewer gender. Along with a strong right hemispheric dominance for men, the results showed a lack of asymmetry for face processing in the amplitude of the occipito-temporal N1 response in women to both neutral and affective faces. Conclusion Men showed an asymmetric functioning of visual cortex while decoding faces and expressions, whereas women showed a more bilateral functioning. These results indicate the importance of gender effects in the lateralization of the occipito-temporal response during the processing of face identity, structure, familiarity, or affective content.

  5. Holistic processing of face configurations and components.

    Science.gov (United States)

    Hayward, William G; Crookes, Kate; Chu, Ming Hon; Favelle, Simone K; Rhodes, Gillian

    2016-10-01

    Although many researchers agree that faces are processed holistically, we know relatively little about what information holistic processing captures from a face. Most studies that assess the nature of holistic processing do so with changes to the face affecting many different aspects of face information (e.g., different identities). Does holistic processing affect every aspect of a face? We used the composite task, a common means of examining the strength of holistic processing, with participants making same-different judgments about configuration changes or component changes to 1 portion of a face. Configuration changes involved changes in spatial position of the eyes, whereas component changes involved lightening or darkening the eyebrows. Composites were either aligned or misaligned, and were presented either upright or inverted. Both configuration judgments and component judgments showed evidence of holistic processing, and in both cases it was strongest for upright face composites. These results suggest that holistic processing captures a broad range of information about the face, including both configuration-based and component-based information. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces

    Directory of Open Access Journals (Sweden)

    Arash eJavanbakht

    2015-06-01

    Full Text Available Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. 52 subjects from a longitudinal prospective study, recruited as children, participated in a brain imaging study at 23-25 years of age using the Emotional Faces Assessment Task (EFAT). Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and mPFC responses to threat vs. happy faces. Also, childhood poverty was associated with decreased functional connectivity between left amygdala and mPFC. This study is unique because it prospectively links childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues.

  7. A face to remember: emotional expression modulates prefrontal activity during memory formation.

    Science.gov (United States)

    Sergerie, Karine; Lepage, Martin; Armony, Jorge L

    2005-01-15

    Emotion can exert a modulatory role on episodic memory. Several studies have shown that negative stimuli (e.g., words, pictures) are better remembered than neutral ones. Although facial expressions are powerful emotional stimuli and have been shown to influence perception and attention processes, little is known about their effect on memory. We used functional magnetic resonance imaging (fMRI) in humans to investigate the effects of expression (happy, neutral, and fearful) on prefrontal cortex (PFC) activity during the encoding of faces, using a subsequent memory effect paradigm. Our results show that activity in right PFC predicted memory for faces, regardless of expression, while a homotopic region in the left hemisphere was associated with successful encoding only for faces with an emotional expression. These findings are consistent with the proposed role of right dorsolateral PFC in successful encoding of nonverbal material, but also suggest that left DLPFC may be a site where integration of memory and emotional processes occurs. This study sheds new light on the current controversy regarding the hemispheric lateralization of PFC in memory encoding.

  8. Are Max-Specified Infant Facial Expressions during Face-to-Face Interaction Consistent with Differential Emotions Theory?

    Science.gov (United States)

    Matias, Reinaldo; Cohn, Jeffrey F.

    1993-01-01

    Examined infant facial expressions at two, four, and six months of age during face-to-face play and a still-face interaction with their mothers. Contrary to differential emotions theory, at no age did proportions or durations of discrete and blended negative expressions differ; they also showed different patterns of developmental change. (MM)

  9. Age-related emotional bias in processing two emotionally valenced tasks.

    Science.gov (United States)

    Allen, Philip A; Lien, Mei-Ching; Jardin, Elliott

    2017-01-01

    Previous studies suggest that older adults process positive emotions more efficiently than negative emotions, whereas younger adults show the reverse effect. We examined whether this age-related difference in emotional bias still occurs when attention is engaged in two emotional tasks. We used a psychological refractory period paradigm and varied the emotional valence of Task 1 and Task 2. In both experiments, Task 1 was emotional face discrimination (happy vs. angry faces) and Task 2 was sound discrimination (laugh, punch, vs. cork pop in Experiment 1 and laugh vs. scream in Experiment 2). The backward emotional correspondence effect for positively and negatively valenced Task 2 on Task 1 was measured. In both experiments, younger adults showed a backward correspondence effect from a negatively valenced Task 2, suggesting parallel processing of negatively valenced stimuli. Older adults showed similar negativity bias in Experiment 2 with a more salient negative sound ("scream" relative to "punch"). These results are consistent with an arousal-bias competition model [Mather and Sutherland (Perspectives in Psychological Sciences 6:114-133, 2011)], suggesting that emotional arousal modulates top-down attentional control settings (emotional regulation) with age.

  10. Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?

    Science.gov (United States)

    Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K

    2017-12-01

    Disgust is a core emotion evolved to detect and avoid the ingestion of poisonous food as well as contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or whether individual differences in trait body odor disgust may influence the processing of disgust-related information. In this preregistered study, we tested whether a classification of dynamic facial expressions as happy or disgusted, and an emotional evaluation of these facial expressions, would be affected by individual differences in body odor disgust sensitivity, and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect recognition of emotion in dynamic faces even when body odor disgust sensitivity was used as a moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs for faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect and suggest that further studies should focus on reproducibility, specifying experimental circumstances where odor effects on facial expressions may be present versus absent.

  11. Facial Expression Aftereffect Revealed by Adaptation to Emotion-Invisible Dynamic Bubbled Faces

    Science.gov (United States)

    Luo, Chengwen; Wang, Qingyun; Schyns, Philippe G.; Kingdom, Frederick A. A.; Xu, Hong

    2015-01-01

    Visual adaptation is a powerful tool to probe the short-term plasticity of the visual system. Adapting to local features such as oriented lines can distort our judgment of subsequently presented lines, the tilt aftereffect. The tilt aftereffect is believed to be processed at low levels of the visual cortex, such as V1. Adaptation to faces, on the other hand, can produce significant aftereffects in high-level traits such as identity, expression, and ethnicity. However, whether face adaptation necessitates awareness of face features is debatable. In the current study, we investigated whether facial expression aftereffects (FEAE) can be generated by partially visible faces. We first generated partially visible faces using the bubbles technique, in which the face was seen through randomly positioned circular apertures, and selected the bubbled faces for which the subjects were unable to identify happy or sad expressions. When the subjects adapted to static displays of these partial faces, no significant FEAE was found. However, when the subjects adapted to a dynamic video display of a series of different partial faces, a significant FEAE was observed. In both conditions, subjects could not identify facial expression in the individual adapting faces. These results suggest that our visual system is able to integrate unrecognizable partial faces over a short period of time and that the integrated percept affects our judgment on subsequently presented faces. We conclude that FEAE can be generated by partial faces with few facial expression cues, implying that our cognitive system fills in the missing parts during adaptation, or that subcortical structures are activated by the bubbled faces without conscious recognition of emotion during adaptation. PMID:26717572

  12. Word wins over Face: Emotional Stroop effect activates the frontal cortical network

    Directory of Open Access Journals (Sweden)

    Shima Ovaysikia

    2011-01-01

    Full Text Available The prefrontal cortex (PFC) has been implicated in higher-order cognitive control of behaviour. Sometimes such control is executed through suppression of an unwanted response in order to avoid conflict. Conflict occurs when two simultaneously competing processes lead to different behavioral outcomes, as seen in tasks such as the anti-saccade, go/no-go and Stroop tasks. We set out to examine whether different types of stimuli in a modified emotional Stroop task would cause interference effects similar to those of the original colour/word Stroop, and whether the required suppression mechanism(s) would recruit similar regions of the medial PFC (mPFC). By using emotional words and emotional faces in this Stroop experiment, we examined the two well-learned automatic behaviours of word reading and recognition of face expressions. In our emotional Stroop paradigm, words were processed faster than face expressions, with incongruent trials yielding longer reaction times (RTs) and a larger number of errors compared to congruent trials. This novel Stroop effect activated the anterior and inferior regions of the mPFC, namely the anterior cingulate cortex (ACC) and inferior frontal gyrus (IFG), as well as the superior frontal gyrus. Our results suggest that prepotent behaviours such as reading and recognition of face expressions are stimulus-dependent and perhaps hierarchical, hence recruiting distinct regions of the mPFC. Moreover, the faster processing of word reading compared to reporting face expressions is indicative of the formation of stronger stimulus-response (SR) associations for an over-learned behaviour compared to an instinctive one, which could alternatively be explained through the distinction between awareness and selective attention.

  13. Facing emotions in narcolepsy with cataplexy: haemodynamic and behavioural responses during emotional stimulation.

    Science.gov (United States)

    de Zambotti, Massimiliano; Pizza, Fabio; Covassin, Naima; Vandi, Stefano; Cellini, Nicola; Stegagno, Luciano; Plazzi, Giuseppe

    2014-08-01

    Narcolepsy with cataplexy is a complex sleep disorder that affects the modulation of emotions: cataplexy, the key symptom of narcolepsy, is indeed strongly linked with emotions that usually trigger the episodes. Our study aimed to investigate haemodynamic and behavioural responses during emotional stimulation in narco-cataplexy. Twelve adult drug-naive narcoleptic patients (five males; age: 33.3 ± 9.4 years) and 12 healthy controls (five males; age: 30.9 ± 9.5 years) were exposed to emotional stimuli (pleasant, unpleasant and neutral pictures). Heart rate, arterial blood pressure and mean cerebral blood flow velocity of the middle cerebral arteries were continuously recorded using photoplethysmography and Doppler ultrasound. Ratings of valence and arousal and coping strategies were scored by the Self-Assessment Manikin and by questionnaires, respectively. Narcoleptic patients' haemodynamic responses to pictures overlapped with the data obtained from controls: decrease of heart rate and increase of mean cerebral blood flow velocity regardless of pictures' content, increase of systolic blood pressure during the pleasant condition, and relative reduction of heart rate during pleasant and unpleasant conditions. However, when compared with controls, narcoleptic patients reported lower arousal scores during the pleasant and neutral stimulation, and lower valence scores during the pleasant condition, respectively, and also a lower score at the 'focus on and venting of emotions' dimensions of coping. Our results suggested that adult narcoleptic patients, compared with healthy controls, inhibited their emotion-expressive behaviour to emotional stimulation, and that may be related to the development of adaptive cognitive strategies to face emotions avoiding cataplexy. © 2014 European Sleep Research Society.

  14. Gender differences in human single neuron responses to male emotional faces.

    Science.gov (United States)

    Newhoff, Morgan; Treiman, David M; Smith, Kris A; Steinmetz, Peter N

    2015-01-01

    Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single-neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 (6 male) epileptic patients in four brain areas: amygdala (236 neurons), hippocampus (n = 270), anterior cingulate cortex (n = 256), and ventromedial prefrontal cortex (n = 174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n = 15/66) of neurons in men were significantly affected by facial emotion, vs. 8% (n = 6/76) of neurons in women. A Fisher's exact test found the difference between these two proportions to be highly significant. These results demonstrate differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.
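
    The reported proportions (15 of 66 emotion-responsive neurons in men vs. 6 of 76 in women) can be checked with a two-sided Fisher's exact test computed directly from the hypergeometric distribution. A minimal stdlib-only sketch, using the standard "sum all tables no more likely than the observed one" rule; this is an illustration, not the authors' analysis code.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]."""
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def pmf(x):
        # Hypergeometric probability of x in the top-left cell,
        # with all margins held fixed.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = pmf(a)
    lo = max(0, col1 - row2)
    hi = min(row1, col1)
    # Sum probabilities of every table no more likely than the observed one.
    return sum(pmf(x) for x in range(lo, hi + 1) if pmf(x) <= p_obs + 1e-12)

# Men: 15 of 66 neurons emotion-affected; women: 6 of 76.
p = fisher_exact_two_sided(15, 51, 6, 70)
```

    With these counts the resulting p-value falls below the conventional 0.05 threshold, consistent with the abstract's description of a highly significant difference.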

  15. Influence of Emotional Facial Expressions on 3-5-Year-Olds' Face Recognition

    Science.gov (United States)

    Freitag, Claudia; Schwarzer, Gudrun

    2011-01-01

    Three experiments examined 3- and 5-year-olds' recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression…

  16. The impact of emotional faces on social motivation in schizophrenia.

    Science.gov (United States)

    Radke, Sina; Pfersmann, Vera; Derntl, Birgit

    2015-10-01

    Impairments in emotion recognition and psychosocial functioning are a robust phenomenon in schizophrenia and may affect motivational behavior, particularly during socio-emotional interactions. To characterize potential deficits and their interplay, we assessed social motivation covering various facets, such as implicit and explicit approach-avoidance tendencies to facial expressions, in 27 patients with schizophrenia (SZP) and 27 matched healthy controls (HC). Moreover, emotion recognition abilities as well as self-reported behavioral activation and inhibition were evaluated. Compared to HC, SZP exhibited less pronounced approach-avoidance ratings to happy and angry expressions along with prolonged reactions during automatic approach-avoidance. Although deficits in emotion recognition were replicated, these were not associated with alterations in social motivation. Together with additional connections between psychopathology and several approach-avoidance processes, these results identify motivational impairments in SZP and suggest a complex relationship between different aspects of social motivation. In the context of specialized interventions aimed at improving social cognitive abilities in SZP, the link between such dynamic measures, motivational profiles and functional outcomes warrants further investigations, which can provide important leverage points for treatment. Crucially, our findings present first insights into the assessment and identification of target features of social motivation.

  17. Detection of emotional faces: salient physical features guide effective visual search.

    Science.gov (United States)

    Calvo, Manuel G; Nummenmaa, Lauri

    2008-08-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features--especially the smiling mouth--is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2008 APA, all rights reserved).
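
    The saliency modelling the authors describe can be illustrated with a toy center-surround operator: a pixel is salient to the extent that its intensity differs from the mean of its local surround. This is a deliberately simplified stand-in for the actual computational saliency model used in the study.

```python
def saliency_map(img, window=1):
    """Toy center-surround saliency for a 2D intensity grid: each pixel's
    saliency is |pixel - mean of its (2*window+1)-square neighbourhood|.
    Illustrative sketch only, not the study's saliency model."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ys = range(max(0, y - window), min(h, y + window + 1))
            xs = range(max(0, x - window), min(w, x + window + 1))
            vals = [img[j][i] for j in ys for i in xs]
            out[y][x] = abs(img[y][x] - sum(vals) / len(vals))
    return out
```

    On a face image, a high-contrast region such as a smiling mouth would dominate such a map, which is the intuition behind the reported detection advantage.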

  18. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion

    Directory of Open Access Journals (Sweden)

    Daiming eXiu

    2015-04-01

    Full Text Available This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data were acquired during incidental learning of positive (‘happy’), neutral and negative (‘angry’ or ‘fearful’) faces. Dynamic Causal Modeling (DCM) was applied to the fMRI data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and areas related to successful memory formation (hippocampus, superior parietal lobule, amygdala and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feedforward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex, the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  19. A facial expression of pax: Assessing children's "recognition" of emotion from faces.

    Science.gov (United States)

    Nelson, Nicole L; Russell, James A

    2016-01-01

    In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks--choice from array and free labeling--to support the conclusion that children recognize basic emotions from facial expressions. Here five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition; a novel nonsense facial expression was included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination; they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Neural processing of emotional-intensity predicts emotion regulation choice.

    Science.gov (United States)

    Shafir, Roni; Thiruchselvam, Ravi; Suri, Gaurav; Gross, James J; Sheppes, Gal

    2016-12-01

    Emotional-intensity is a core characteristic of affective events that strongly determines how individuals choose to regulate their emotions. Our conceptual framework suggests that in high emotional-intensity situations, individuals prefer to disengage attention using distraction, which can more effectively block highly potent emotional information, as compared with engagement reappraisal, which is preferred in low emotional-intensity. However, existing supporting evidence remains indirect because prior intensity categorization of emotional stimuli was based on subjective measures that are potentially biased and only represent the endpoint of emotional-intensity processing. Accordingly, this study provides the first direct evidence for the role of online emotional-intensity processing in predicting behavioral regulatory-choices. Utilizing the high temporal resolution of event-related potentials, we evaluated online neural processing of stimuli's emotional-intensity (late positive potential, LPP) prior to regulatory-choices between distraction and reappraisal. Results showed that enhanced neural processing of intensity (enhanced LPP amplitudes) uniquely predicted (above subjective measures of intensity) increased tendency to subsequently choose distraction over reappraisal. Additionally, regulatory-choices led to adaptive consequences, demonstrated in finding that actual implementation of distraction relative to reappraisal-choice resulted in stronger attenuation of LPPs and self-reported arousal. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  1. Processing emotions in children with Moebius syndrome. A behavioral and thermal imaging study

    OpenAIRE

    Nicolini, Ylenia

    2018-01-01

    According to the “facial feedback hypothesis”, the proprioceptive feedback from facial muscles while mimicking face movements is crucial in regulating emotional experience. Facial mimicry, intended as a spontaneous reaction of individuals when observing emotional faces, plays a key role in understanding others’ facial expression of emotions. Studies on facial expression processing and emotion understanding have revealed that the neuronal bases of facial mimicry are underpinned by a mirror mec...

  2. Emotion and Object Processing in Parkinson's Disease

    Science.gov (United States)

    Cohen, Henri; Gagne, Marie-Helene; Hess, Ursula; Pourcher, Emmanuelle

    2010-01-01

    The neuropsychological literature on the processing of emotions in Parkinson's disease (PD) reveals conflicting evidence about the role of the basal ganglia in the recognition of facial emotions. Hence, the present study had two objectives. One was to determine the extent to which the visual processing of emotions and objects differs in PD. The…

  3. From face processing to face recognition: Comparing three different processing levels.

    Science.gov (United States)

    Besson, G; Barragan-Jason, G; Thorpe, S J; Fabre-Thorpe, M; Puma, S; Ceccaldi, M; Barbeau, E J

    2017-01-01

    Verifying that a face is from a target person (e.g. finding someone in the crowd) is a critical ability of the human face processing system. Yet how fast this can be performed is unknown. The 'entry-level shift due to expertise' hypothesis suggests that - since humans are face experts - processing faces should be as fast - or even faster - at the individual than at superordinate levels. In contrast, the 'superordinate advantage' hypothesis suggests that faces are processed from coarse to fine, so that the opposite pattern should be observed. To clarify this debate, three different face processing levels were compared: (1) a superordinate face categorization level (i.e. detecting human faces among animal faces), (2) a face familiarity level (i.e. recognizing famous faces among unfamiliar ones) and (3) verifying that a face is from a target person, our condition of interest. The minimal speed at which faces can be categorized (∼260ms) or recognized as familiar (∼360ms) has largely been documented in previous studies, and thus provides boundaries to compare our condition of interest to. Twenty-seven participants were included. The recent Speed and Accuracy Boosting procedure paradigm (SAB) was used since it constrains participants to use their fastest strategy. Stimuli were presented either upright or inverted. Results revealed that verifying that a face is from a target person (minimal RT at ∼260ms) was remarkably fast but longer than the face categorization level (∼240ms) and was more sensitive to face inversion. In contrast, it was much faster than recognizing a face as familiar (∼380ms), a level severely affected by face inversion. Face recognition corresponding to finding a specific person in a crowd thus appears achievable in only a quarter of a second. In favor of the 'superordinate advantage' hypothesis or coarse-to-fine account of the face visual hierarchy, these results suggest a graded engagement of the face processing system across processing

  4. Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?

    Science.gov (United States)

    Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-01-01

    The current study investigated if deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ and age matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity or by facial identity…

  5. Processing emotional body expressions: state-of-the-art.

    Science.gov (United States)

    Enea, Violeta; Iancu, Sorina

    2016-10-01

    Processing emotional body expressions has recently become an important topic in affective and social neuroscience, alongside the investigation of facial expressions. The objective of this study is to review the literature on emotional body expressions in order to discuss the current state of knowledge on the topic and identify directions for future research. The following electronic databases were searched: PsychINFO, Ebsco, ERIC, ProQuest, Sagepub, and SCOPUS, using terms such as "body," "bodily expression," "body perception," "emotions," "posture," "body recognition" and combinations of them. The synthesis revealed several research questions that were addressed in neuroimaging, electrophysiological and behavioral studies. Among them, one important question targeted the neural mechanisms of emotional processing of body expressions, with specific subsections regarding the time course of the integration of emotional signals from face and body, as well as the role of context in the perception of emotional signals. Processing bodily expressions of emotion is similar to processing facial expressions, and holistic processing extends to the whole person. The current state of the art in processing emotional body expressions may lead to a better understanding of the underlying neural mechanisms of social behavior. At the end of the review, suggestions for future research directions are presented.

  6. Adult age-differences in subjective impression of emotional faces are reflected in emotion-related attention and memory tasks

    Directory of Open Access Journals (Sweden)

    Joakim eSvard

    2014-05-01

    Full Text Available Although younger and older adults appear to attend to and remember emotional faces differently, less is known about age-related differences in the subjective emotional impression (arousal, potency, and valence) of emotional faces and how these differences, in turn, are reflected in age differences in various emotional tasks. In the current study, we used the same facial emotional stimuli (angry and happy faces) in four tasks: emotional rating, attention, categorical perception, and visual short-term memory (VSTM). The aim of this study was to investigate effects of age on the subjective emotional impression of angry and happy faces and to examine whether any age differences were mirrored in measures of emotional behavior (attention, categorical perception, and memory). In addition, regression analyses were used to further study impression-behavior associations. Forty younger adults (range 20-30 years) and thirty-nine older adults (range 65-75 years) participated in the experiment. The emotional rating task showed that older adults perceived less arousal, potency, and valence than younger adults and that the difference was more pronounced for angry than happy faces. Similarly, the results of the attention and memory tasks demonstrated interaction effects between emotion and age, and age differences on these measures were larger for angry than for happy faces. Regression analyses confirmed that in both age groups, higher potency ratings predicted both visual search and visual short-term memory efficiency. Future studies should consider the possibility that age differences in the subjective emotional impression of facial emotional stimuli may explain age differences in attention to and memory of such stimuli.

  7. Neutral face classification using personalized appearance models for fast and robust emotion detection.

    Science.gov (United States)

    Chiranjeevi, Pojala; Gopalakrishnan, Viswanath; Moogi, Pratibha

    2015-09-01

    Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, and so on, in the limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in usual applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage and thereby excluding those frames from emotion classification would save computational power. In this paper, we propose a lightweight neutral-versus-emotion classification engine, which acts as a pre-processor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at key emotion (KE) points using a statistical texture model, constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the statistical texture model. Robustness to dynamic shift of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
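
    The gating idea, learn a per-user neutral appearance and skip emotion classification while the current frame stays close to it, can be sketched as follows. This is a deliberately simplified stand-in (scalar descriptors and a z-score threshold) for the paper's statistical texture model; the class name, descriptor choice, and threshold are assumptions for illustration.

```python
import statistics

class NeutralGate:
    """Per-user neutral-appearance gate (illustrative sketch): learns the
    mean/stdev of a scalar texture descriptor at each key emotion (KE)
    point from reference neutral frames, then flags a new frame as neutral
    when every KE descriptor stays within k standard deviations."""

    def __init__(self, k=3.0):
        self.k = k
        self.mean = []
        self.std = []

    def fit(self, neutral_frames):
        # neutral_frames: list of frames; each frame is a list of
        # per-KE-point scalar descriptors (e.g. mean patch intensity).
        n_points = len(neutral_frames[0])
        for i in range(n_points):
            values = [frame[i] for frame in neutral_frames]
            self.mean.append(statistics.fmean(values))
            self.std.append(statistics.stdev(values) or 1e-6)

    def is_neutral(self, frame):
        # Neutral only if no KE descriptor deviates beyond k sigma.
        return all(abs(x - m) <= self.k * s
                   for x, m, s in zip(frame, self.mean, self.std))
```

    In a pipeline, frames for which `is_neutral` returns True would bypass the (more expensive) supervised emotion classifier, which is the computational saving the abstract describes.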

  8. Judging emotional congruency: Explicit attention to situational context modulates processing of facial expressions of emotion.

    Science.gov (United States)

    Diéguez-Risco, Teresa; Aguado, Luis; Albert, Jacobo; Hinojosa, José Antonio

    2015-12-01

    The influence of explicit evaluative processes on the contextual integration of facial expressions of emotion was studied in a procedure that required the participants to judge the congruency of happy and angry faces with preceding sentences describing emotion-inducing situations. Judgments were faster on congruent trials in the case of happy faces and on incongruent trials in the case of angry faces. At the electrophysiological level, a congruency effect was observed in the face-sensitive N170 component that showed larger amplitudes on incongruent trials. An interactive effect of congruency and emotion appeared on the LPP (late positive potential), with larger amplitudes in response to happy faces that followed anger-inducing situations. These results show that the deliberate intention to judge the contextual congruency of facial expressions influences not only processes involved in affective evaluation such as those indexed by the LPP but also earlier processing stages that are involved in face perception. Copyright © 2015. Published by Elsevier B.V.

  9. Are reading and face processing related?

    DEFF Research Database (Denmark)

    Starrfelt, Randi; Klargaard, Solja K.; Petersen, Anders

    Traditionally, perceptual processing of faces and words is considered highly specialized, strongly lateralized, and largely independent. This has, however, recently been challenged by studies showing that learning to read may affect the perceptual and neural processes involved in face recognition ..., reflected in better overall accuracy, a lower perceptual threshold, and higher processing speed for words compared to letters. In sum, we find no evidence that reading skills are abnormal in developmental prosopagnosia, a finding that may challenge the recently proposed hypothesis that reading development ...

  10. Emotion and Cognition Processes in Preschool Children

    Science.gov (United States)

    Leerkes, Esther M.; Paradise, Matthew; O'Brien, Marion; Calkins, Susan D.; Lange, Garrett

    2008-01-01

    The core processes of emotion understanding, emotion control, cognitive understanding, and cognitive control and their association with early indicators of social and academic success were examined in a sample of 141 3-year-old children. Confirmatory factor analysis supported the hypothesized four-factor model of emotion and cognition in early…

  11. Poignancy: Mixed Emotional Experience in the Face of Meaningful Endings

    Science.gov (United States)

    Ersner-Hershfield, Hal; Mikels, Joseph A.; Sullivan, Sarah J.; Carstensen, Laura L.

    2009-01-01

    The experience of mixed emotions increases with age. Socioemotional selectivity theory suggests that mixed emotions are associated with shifting time horizons. Theoretically, perceived constraints on future time increase appreciation for life, which, in turn, elicits positive emotions such as happiness. Yet, the very same temporal constraints heighten awareness that these positive experiences come to an end, thus yielding mixed emotional states. In 2 studies, the authors examined the link between the awareness of anticipated endings and mixed emotional experience. In Study 1, participants repeatedly imagined being in a meaningful location. Participants in the experimental condition imagined being in the meaningful location for the final time. Only participants who imagined “last times” at meaningful locations experienced more mixed emotions. In Study 2, college seniors reported their emotions on graduation day. Mixed emotions were higher when participants were reminded of the ending that they were experiencing. Findings suggest that poignancy is an emotional experience associated with meaningful endings. PMID:18179325

  12. The NMDA antagonist ketamine and the 5-HT agonist psilocybin produce dissociable effects on structural encoding of emotional face expressions.

    Science.gov (United States)

    Schmidt, André; Kometer, Michael; Bachmann, Rosilla; Seifritz, Erich; Vollenweider, Franz

    2013-01-01

    Both glutamate and serotonin (5-HT) play a key role in the pathophysiology of emotional biases. Recent studies indicate that the glutamate N-methyl-D-aspartate (NMDA) receptor antagonist ketamine and the 5-HT receptor agonist psilocybin are implicated in emotion processing. However, as yet, no study has systematically compared their contributions to emotional biases. This study used event-related potentials (ERPs) and signal detection theory to compare the effects of the NMDA (via S-ketamine) and 5-HT (via psilocybin) receptor systems on non-conscious and conscious emotional face processing biases. S-ketamine or psilocybin was administered to two groups of healthy subjects in a double-blind, within-subject, placebo-controlled design. We behaviorally assessed objective thresholds for non-conscious discrimination in all drug conditions. Electrophysiological responses to fearful, happy, and neutral faces were subsequently recorded with the face-specific P100 and N170 ERPs. Both S-ketamine and psilocybin impaired the encoding of fearful faces, as expressed by a reduced N170 over parieto-occipital brain regions. In contrast, while S-ketamine also impaired the encoding of happy facial expressions, psilocybin had no effect on the N170 in response to happy faces. This study demonstrates that the NMDA and 5-HT receptor systems differentially contribute to the structural encoding of emotional face expressions as expressed by the N170. These findings suggest that the assessment of early visual evoked responses might allow detecting pharmacologically induced changes in emotional processing biases and thus provides a framework to study the pathophysiology of dysfunctional emotional biases.
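
    The signal-detection analysis mentioned above boils down to estimating sensitivity (d') from hit and false-alarm rates; an "objective threshold" for non-conscious discrimination is a stimulus level at which d' does not differ from zero. A minimal stdlib sketch; the log-linear correction used here is a common convention, not necessarily the study's exact procedure.

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity (d') from a yes/no discrimination task.
    Uses the log-linear correction (add 0.5 to each cell) so that perfect
    or empty cells do not produce infinite z-scores. Illustrative sketch."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)
```

    Chance performance (equal hit and false-alarm rates) gives d' = 0, the criterion for calling discrimination non-conscious in this kind of design.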

  13. Detection of Emotional Faces: Salient Physical Features Guide Effective Visual Search

    Science.gov (United States)

    Calvo, Manuel G.; Nummenmaa, Lauri

    2008-01-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent,…

  14. Visual and associated affective processing of face information in schizophrenia: A selective review.

    Science.gov (United States)

    Chen, Yue; Ekstrom, Tor

    Perception of facial features is crucial in social life. In past decades, extensive research showed that the ability to perceive facial emotion expression was compromised in schizophrenia patients. Given that face perception involves visual/cognitive and affective processing, the roles of these two processing domains in the compromised face perception in schizophrenia were studied and discussed, but not clearly defined. One particular issue was whether face-specific processing is implicated in this psychiatric disorder. Recent investigations have probed into the components of face perception processes such as visual detection, identity recognition, emotion expression discrimination and working memory conveyed from faces. Recent investigations have further assessed the associations between face processing and basic visual processing and between face processing and social cognitive processing such as Theory of Mind. In this selective review, we discuss the investigative findings relevant to the issues of cognitive and affective association and face-specific processing. We highlight the implications of multiple processing domains and face-specific processes as potential mechanisms underlying compromised face perception in schizophrenia. These findings suggest a need for a domain-specific therapeutic approach to the improvement of face perception in schizophrenia.

  15. Neurophysiological Markers of Emotion Processing in Burnout Syndrome.

    Science.gov (United States)

    Golonka, Krystyna; Mojsa-Kaja, Justyna; Popiel, Katarzyna; Marek, Tadeusz; Gawlowska, Magda

    2017-01-01

    The substantial body of research employing subjective measures indicates that burnout syndrome is associated with cognitive and emotional dysfunctions. The growing amount of neurophysiological and neuroimaging research helps in broadening existing knowledge of the neural mechanisms underlying core burnout components (emotional exhaustion and depersonalization/cynicism) that are inextricably associated with emotional processing. In the presented EEG study, a group of 93 participants (55 women; mean age = 35.8) were selected for the burnout group or the demographically matched control group on the basis of the results of the Maslach Burnout Inventory - General Survey (MBI-GS) and the Areas of Worklife Survey (AWS). Subjects then participated in an EEG experiment using two experimental procedures: a facial recognition task and viewing of passive pictures. The study focuses on analyzing event-related potentials (ERPs): N170, VPP, EPN, and LPP, as indicators of emotional information processing. Our results show that burnout subjects, as compared to the control group, demonstrate significantly weaker response to affect-evoking stimuli, indexed by a decline in VPP amplitude to emotional faces and decreased EPN amplitude in processing emotional scenes. The analysis of N170 and LPP showed no significant between-group difference. The correlation analyses revealed that VPP and EPN, which are ERP components related to emotional processing, are associated with two core burnout symptoms: emotional exhaustion and cynicism. To our knowledge, we are one of the first research groups to use ERPs to demonstrate such a relationship between neurophysiological activity and burnout syndrome in the context of emotional processing. 
Thus, we conclude that the decreased amplitude of the VPP and EPN components in the burnout group may be a neurophysiological manifestation of emotional blunting, and that these components may serve as neurophysiological markers of emotional exhaustion and cynicism.

  16. The construction of emotional experience requires the integration of implicit and explicit emotional processes.

    Science.gov (United States)

    Quirin, Markus; Lane, Richard D

    2012-06-01

    Although we agree that a constructivist approach to emotional experience makes sense, we propose that implicit (visceromotor and somatomotor) emotional processes are dissociable from explicit (attention and reflection) emotional processes, and that the conscious experience of emotion requires an integration of the two. Assessments of implicit emotion and emotional awareness can be helpful in the neuroscientific investigation of emotion.

  17. Time for a Change: College Students' Preference for Technology-Mediated Versus Face-to-Face Help for Emotional Distress.

    Science.gov (United States)

    Lungu, Anita; Sun, Michael

    2016-12-01

Even with recent advances in psychological treatments and mobile technology, online computerized therapy is not yet popular. College students, who have ubiquitous access to technology, experience high distress, and are often nontreatment seekers, could be an important target for online treatment dissemination. Finding ways to reach out to college students by offering psychological interventions through technology, devices, and applications they often use might increase their engagement in treatment. This study evaluates college students' reported willingness to seek help for emotional distress through novel delivery mediums, to play computer games for learning emotional coping skills, and to disclose personal information online. We also evaluated the role of ethnicity and level of emotional distress in help-seeking patterns. A survey exploring our domains of interest and the Mental Health Inventory (MHI; used as a mental health index) were completed by 572 students (mean age 18.7 years, predominantly Asian American, female, and freshmen in college). More participants expressed preference for online versus face-to-face professional help. We found no relationship between MHI and help-seeking preference. A third of participants were likely to disclose at least as much information online as face-to-face. Ownership of mobile technology was pervasive. Asian Americans were more likely to be nontreatment seekers than Caucasians. Most participants were interested in serious games for emotional distress. Our results suggest that college students are very open to creative ways of receiving emotional help, such as playing games and seeking help online, suggesting a need for online evidence-based treatments.

  18. Interaction between behavioral inhibition and emotional processing in borderline personality disorder using a pictorial emotional go/no-go paradigm.

    Science.gov (United States)

    Sinke, Christopher; Wollmer, M Axel; Kneer, Jonas; Kahl, Kai G; Kruger, Tillmann H C

    2017-10-01

    Borderline personality disorder (BPD) is characterized by difficulties in emotional regulation and impulse control. In this study, we presented a novel picture-based emotional go/no-go task with distracting emotional faces in the background, which was administered to 16 patients with BPD and 16 age-matched healthy controls. The faces displayed different emotional content (angry, neutral, or happy). Results showed differences in sensitivity between patients and the control group, with patients exhibiting less sensitivity in the task, and also showed influences of emotional content represented in the distracting faces in both groups. Specifically, happy faces decreased sensitivity compared to angry faces. It seemed as though processing of a positive emotional stimulus led to a more relaxed state and thereby to decreased sensitivity, while a negative emotional stimulus induced more alertness and tension, leading to higher sensitivity. Thus, this paradigm is suitable to investigate the interplay between emotion processing and impulse control in patients with BPD. Copyright © 2017 Elsevier B.V. All rights reserved.
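The "sensitivity" measure in go/no-go studies such as this one is typically the signal-detection index d′, the z-transformed hit rate minus the z-transformed false-alarm rate. A minimal illustrative sketch (the rates below are made up, not the study's data):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Illustrative values only: 80% hits on go trials, 20% false alarms on no-go trials
print(round(d_prime(0.80, 0.20), 3))  # higher d' = better go/no-go discrimination
```

On this measure, a drop in d′ (as reported for the BPD group) reflects poorer discrimination of go from no-go trials, independent of response bias.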

  19. Gender Differences in Human Single Neuron Responses to Male Emotional Faces

    Directory of Open Access Journals (Sweden)

    Morgan eNewhoff

    2015-09-01

Full Text Available Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 epileptic patients (6 male) in four brain areas: amygdala (n=236 neurons), hippocampus (n=270), anterior cingulate cortex (n=256), and ventromedial prefrontal cortex (n=174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n=15/66) of neurons in men were significantly affected by facial emotion, versus 8% (n=6/76) of neurons in women. A Fisher's exact test comparing the two ratios found a highly significant difference between the two (p<0.01). These results show specific differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.
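The group comparison above (15/66 emotion-responsive amygdala neurons in men vs. 6/76 in women, Fisher's exact test) can be checked from the reported counts alone. A stdlib-only sketch of the two-sided test by hypergeometric enumeration (the abstract does not state whether a one- or two-sided test was used, so this is an assumption):

```python
from math import comb

def fisher_exact_two_sided(a: int, b: int, c: int, d: int) -> float:
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]."""
    row1, row2, col1, total = a + b, c + d, a + c, a + b + c + d
    denom = comb(total, col1)

    def p_table(x: int) -> float:
        # Hypergeometric probability of x "successes" falling in row 1
        return comb(row1, x) * comb(row2, col1 - x) / denom

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # Sum the probabilities of all tables at most as likely as the observed one
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Emotion-responsive vs. non-responsive neurons: men 15 of 66, women 6 of 76
p = fisher_exact_two_sided(15, 66 - 15, 6, 76 - 6)
print(f"p = {p:.4f}")
```

For real analyses, `scipy.stats.fisher_exact` implements the same test; the enumeration above is just to make the computation transparent.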

  20. School Principals' Emotional Coping Process

    Science.gov (United States)

    Poirel, Emmanuel; Yvon, Frédéric

    2014-01-01

    The present study examines the emotional coping of school principals in Quebec. Emotional coping was measured by stimulated recall; six principals were filmed during a working day and presented a week later with their video showing stressful encounters. The results show that school principals experience anger because of reproaches from staff…

  1. Emotional face recognition in adolescent suicide attempters and adolescents engaging in non-suicidal self-injury.

    Science.gov (United States)

    Seymour, Karen E; Jones, Richard N; Cushman, Grace K; Galvan, Thania; Puzia, Megan E; Kim, Kerri L; Spirito, Anthony; Dickstein, Daniel P

    2016-03-01

    Little is known about the bio-behavioral mechanisms underlying and differentiating suicide attempts from non-suicidal self-injury (NSSI) in adolescents. Adolescents who attempt suicide or engage in NSSI often report significant interpersonal and social difficulties. Emotional face recognition ability is a fundamental skill required for successful social interactions, and deficits in this ability may provide insight into the unique brain-behavior interactions underlying suicide attempts versus NSSI in adolescents. Therefore, we examined emotional face recognition ability among three mutually exclusive groups: (1) inpatient adolescents who attempted suicide (SA, n = 30); (2) inpatient adolescents engaged in NSSI (NSSI, n = 30); and (3) typically developing controls (TDC, n = 30) without psychiatric illness. Participants included adolescents aged 13-17 years, matched on age, gender and full-scale IQ. Emotional face recognition was evaluated using the diagnostic assessment of nonverbal accuracy (DANVA-2). Compared to TDC youth, adolescents with NSSI made more errors on child fearful and adult sad face recognition while controlling for psychopathology and medication status (ps face recognition between NSSI and SA groups. Secondary analyses showed that compared to inpatients without major depression, those with major depression made fewer errors on adult sad face recognition even when controlling for group status (p recognition errors on adult happy faces even when controlling for group status (p face recognition than TDC, but not inpatient adolescents who attempted suicide. Further results suggest the importance of psychopathology in emotional face recognition. Replication of these preliminary results and examination of the role of context-dependent emotional processing are needed moving forward.

  2. Neuroelectric Correlates of Pragmatic Emotional Incongruence Processing: Empathy Matters.

    Directory of Open Access Journals (Sweden)

    Dorian Dozolme

Full Text Available The emotions people feel can be simulated internally based on emotional situational contexts. In the present study, we assessed the behavioral and neuroelectric effects of seeing an unexpected emotional facial expression. We investigated the correct answer rate, response times and Event-Related Potential (ERP) effects during an incongruence paradigm between emotional faces and sentential contexts allowing emotional inferences. Most of the 36 healthy participants were recruited from a larger population (1,463 subjects), based on their scores on the Empathy Questionnaire (EQ). Regression analyses were conducted on these ratings using EQ factors as predictors (cognitive empathy, emotional reactivity and social skills). Recognition of pragmatic emotional incongruence was less accurate (P < .05) and slower (P < .05) than recognition of congruence. The incongruence effect on response times was inversely predicted by social skills. A significant N400 incongruence effect was found at the centro-parietal (P < .001) and centro-posterior midline (P < .01) electrodes. Cognitive empathy predicted the incongruence effect in the left occipital region, in the N400 time window. Finally, incongruence effects were also found on the LPP wave, in frontal midline and dorso-frontal regions (P < .05), with no modulation by empathy. Processing pragmatic emotional incongruence is more cognitively demanding than congruence (as reflected by both behavioral and ERP data). This processing shows modulation by personality factors at the behavioral (through self-reported social skills) and neuroelectric (through self-reported cognitive empathy) levels.

  3. An emotional Stroop task with faces and words. A comparison of young and older adults.

    Science.gov (United States)

    Agustí, Ana I; Satorres, Encarnación; Pitarque, Alfonso; Meléndez, Juan C

    2017-08-01

Given the contradictory findings of previous studies on age-related changes in attentional responses, an emotional Stroop task was used to compare young and older adults' responses to words or faces with emotional valence. The words happy or sad were superimposed on faces expressing happiness or sadness. The emotion expressed by the word and the face could agree or not (cued and uncued trials, respectively). 85 young and 66 healthy older adults had to identify both faces and words separately, and the interference between the two types of stimuli was examined. An interference effect was observed for both types of stimuli in both groups. There was more interference on positive faces and words than on negative stimuli. Older adults had more difficulty than younger adults in focusing on positive uncued trials, whereas there was no difference across samples on negative uncued trials. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Differential Interactions between Identity and Emotional Expression in Own and Other-Race Faces: Effects of Familiarity Revealed through Redundancy Gains

    Science.gov (United States)

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

    We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and…

  5. Dispositional fear, negative affectivity, and neuroimaging response to visually suppressed emotional faces.

    Science.gov (United States)

    Vizueta, Nathalie; Patrick, Christopher J; Jiang, Yi; Thomas, Kathleen M; He, Sheng

    2012-01-02

    "Invisible" stimulus paradigms provide a method for investigating basic affective processing in clinical and non-clinical populations. Neuroimaging studies utilizing continuous flash suppression (CFS) have shown increased amygdala response to invisible fearful versus neutral faces. The current study used CFS in conjunction with functional MRI to test for differences in brain reactivity to visible and invisible emotional faces in relation to two distinct trait dimensions relevant to psychopathology: negative affectivity (NA) and fearfulness. Subjects consisted of college students (N=31) assessed for fear/fearlessness along with dispositional NA. The main brain regions of interest included the fusiform face area (FFA), superior temporal sulcus (STS), and amygdala. Higher NA, but not trait fear, was associated with enhanced response to fearful versus neutral faces in STS and right amygdala (but not FFA), within the invisible condition specifically. The finding that NA rather than fearfulness predicted degree of amygdala reactivity to suppressed faces implicates the input subdivision of the amygdala in the observed effects. Given the central role of NA in anxiety and mood disorders, the current data also support use of the CFS methodology for investigating the neurobiology of these disorders. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. How Context Influences Our Perception of Emotional Faces

    DEFF Research Database (Denmark)

    Calbi, Marta; Heimann, Katrin; Barratt, Daniel

    2017-01-01

Facial expressions are of major importance in understanding the mental and emotional states of others. So far, most studies on the perception and comprehension of emotions have used isolated facial expressions as stimuli; for example, photographs of actors displaying facial expressions corresponding to one of the so-called ‘basic emotions.’ However, our real experience during social interactions is different: facial expressions of emotion are mostly perceived in a wider context, constituted by body language, the surrounding environment, and our beliefs and expectations. Already in the early twentieth century, the Russian filmmaker Lev Kuleshov argued that such context, established by intermediate shots of strong emotional content, could significantly change our interpretation of facial expressions in film. Prior experiments have shown behavioral effects pointing in this direction, but have…

  7. Differences in neural and cognitive response to emotional faces in middle-aged dizygotic twins at familial risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Svendsen, A M B; Harmer, C J

    2017-01-01

BACKGROUND: Negative bias and aberrant neural processing of emotional faces are trait-marks of depression but findings in healthy high-risk groups are conflicting. METHODS: Healthy middle-aged dizygotic twins (N = 42) underwent functional magnetic resonance imaging (fMRI): 22 twins had a co-twin history of depression (high-risk) and 20 were without co-twin history of depression (low-risk). During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task… the amygdala and ventral prefrontal cortex and pregenual anterior cingulate. This was accompanied by greater fear-specific fronto-temporal response and reduced fronto-occipital response to all emotional faces relative to baseline. The risk groups showed no differences in mood, subjective state or coping…

  8. Face and emotion recognition deficits in Turner syndrome: a possible role for X-linked genes in amygdala development.

    Science.gov (United States)

    Lawrence, Kate; Kuntsi, Jonna; Coleman, Michael; Campbell, Ruth; Skuse, David

    2003-01-01

    Face recognition is thought to rely on configural visual processing. Where face recognition impairments have been identified, qualitatively delayed or anomalous configural processing has also been found. A group of women with Turner syndrome (TS) with monosomy for a single maternal X chromosome (45, Xm) showed an impairment in face recognition skills compared with normally developing women. However, normal configural face-processing abilities were apparent. The ability to recognize facial expressions of emotion, particularly fear, was also impaired in this TS subgroup. Face recognition and fear recognition accuracy were significantly correlated in the female control group but not in women with TS. The authors therefore suggest that anomalies in amygdala function may be a neurological feature of TS of this karyotype.

  9. Processing of Facial Emotion in Bipolar Depression and Euthymia.

    Science.gov (United States)

    Robinson, Lucy J; Gray, John M; Burt, Mike; Ferrier, I Nicol; Gallagher, Peter

    2015-10-01

    Previous studies of facial emotion processing in bipolar disorder (BD) have reported conflicting findings. In independently conducted studies, we investigate facial emotion labeling in euthymic and depressed BD patients using tasks with static and dynamically morphed images of different emotions displayed at different intensities. Study 1 included 38 euthymic BD patients and 28 controls. Participants completed two tasks: labeling of static images of basic facial emotions (anger, disgust, fear, happy, sad) shown at different expression intensities; the Eyes Test (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), which involves recognition of complex emotions using only the eye region of the face. Study 2 included 53 depressed BD patients and 47 controls. Participants completed two tasks: labeling of "dynamic" facial expressions of the same five basic emotions; the Emotional Hexagon test (Young, Perret, Calder, Sprengelmeyer, & Ekman, 2002). There were no significant group differences on any measures of emotion perception/labeling, compared to controls. A significant group by intensity interaction was observed in both emotion labeling tasks (euthymia and depression), although this effect did not survive the addition of measures of executive function/psychomotor speed as covariates. Only 2.6-15.8% of euthymic patients and 7.8-13.7% of depressed patients scored below the 10th percentile of the controls for total emotion recognition accuracy. There was no evidence of specific deficits in facial emotion labeling in euthymic or depressed BD patients. Methodological variations-including mood state, sample size, and the cognitive demands of the tasks-may contribute significantly to the variability in findings between studies.

  11. Cyber Victimization in High School: Measurement, Overlap with Face-to-Face Victimization, and Associations with Social-Emotional Outcomes

    Science.gov (United States)

    Brown, Christina Flynn; Demaray, Michelle Kilpatrick; Tennant, Jaclyn E.; Jenkins, Lyndsay N.

    2017-01-01

    Cyber victimization is a contemporary problem facing youth and adolescents (Diamanduros, Downs, & Jenkins, 2008; Kowalski & Limber, 2007). It is imperative for researchers and school personnel to understand the associations between cyber victimization and student social-emotional outcomes. This article explores (a) gender differences in…

  12. Schizophrenia and sex differences in emotional processing

    NARCIS (Netherlands)

    Scholten, M.R.M.

    2007-01-01

    Patients with schizophrenia are known to be impaired in several domains of emotional processing. These deficits have been associated with impaired social functioning. Since female patients show better social skills than male patients and healthy women outperform men in emotion recognition and

  13. Face puzzle—two new video-based tasks for measuring explicit and implicit aspects of facial emotion recognition

    Science.gov (United States)

    Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel

    2013-01-01

    Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures were further in favor of the tasks' external validity. Between group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive

  14. Memory for faces and voices varies as a function of sex and expressed emotion.

    Science.gov (United States)

    S Cortes, Diana; Laukka, Petri; Lindahl, Christina; Fischer, Håkan

    2017-01-01

    We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.
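The accuracy measure used here, hits minus false alarms (corrected recognition), is simple to sketch; the counts below are illustrative, not the study's data:

```python
def corrected_recognition(hits: int, misses: int,
                          false_alarms: int, correct_rejections: int) -> float:
    """Recognition accuracy as hit rate minus false-alarm rate."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return hit_rate - fa_rate

# Illustrative: 40/50 old items recognized, 10/50 new items falsely called "old"
print(corrected_recognition(40, 10, 10, 40))  # 0.8 - 0.2
```

Subtracting the false-alarm rate corrects for response bias: a participant who simply calls everything "old" scores 0, not 1.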

  16. Emotional sounds modulate early neural processing of emotional pictures

    Directory of Open Access Journals (Sweden)

    Antje B M Gerdes

    2013-10-01

Full Text Available In our natural environment, emotional information is conveyed by converging visual and auditory information; multimodal integration is of utmost importance. In the laboratory, however, emotion researchers have mostly focused on the examination of unimodal stimuli. Few existing studies on multimodal emotion processing have focused on human communication such as the integration of facial and vocal expressions. Extending the concept of multimodality, the current study examines how the neural processing of emotional pictures is influenced by simultaneously presented sounds. Twenty pleasant, unpleasant, and neutral pictures of complex scenes were presented to 22 healthy participants. On the critical trials these pictures were paired with pleasant, unpleasant and neutral sounds. Sound presentation started 500 ms before picture onset and each stimulus presentation lasted for 2 s. EEG was recorded from 64 channels and ERP analyses focused on the picture onset. In addition, valence and arousal ratings were obtained. Previous findings for the neural processing of emotional pictures were replicated. Specifically, unpleasant compared to neutral pictures were associated with an increased parietal P200 and a more pronounced centroparietal late positive potential (LPP), independent of the accompanying sound valence. For audiovisual stimulation, increased parietal P100 and P200 were found in response to all pictures which were accompanied by unpleasant or pleasant sounds compared to pictures with neutral sounds. Most importantly, incongruent audiovisual pairs of unpleasant pictures and pleasant sounds enhanced parietal P100 and P200 compared to pairings with congruent sounds. Taken together, the present findings indicate that emotional sounds modulate early stages of visual processing and, therefore, provide an avenue by which multimodal experience may enhance perception.

  17. The Process Model of Group-Based Emotion: Integrating Intergroup Emotion and Emotion Regulation Perspectives.

    Science.gov (United States)

    Goldenberg, Amit; Halperin, Eran; van Zomeren, Martijn; Gross, James J

    2016-05-01

    Scholars interested in emotion regulation have documented the different goals and strategies individuals have for regulating their emotions. However, little attention has been paid to the regulation of group-based emotions, which are based on individuals' self-categorization as a group member and occur in response to situations perceived as relevant for that group. We propose a model for examining group-based emotion regulation that integrates intergroup emotions theory and the process model of emotion regulation. This synergy expands intergroup emotion theory by facilitating further investigation of different goals (i.e., hedonic or instrumental) and strategies (e.g., situation selection and modification strategies) used to regulate group-based emotions. It also expands emotion regulation research by emphasizing the role of self-categorization (e.g., as an individual or a group member) in the emotional process. Finally, we discuss the promise of this theoretical synergy and suggest several directions for future research on group-based emotion regulation. © 2015 by the Society for Personality and Social Psychology, Inc.

  18. Looking at My Own Face: Visual Processing Strategies in Self–Other Face Recognition

    Directory of Open Access Journals (Sweden)

    Anya Chakraborty

    2018-02-01

    Full Text Available We live in an age of ‘selfies.’ Yet, how we look at our own faces has seldom been systematically investigated. In this study we test whether the visual processing of the highly familiar self-face differs from that of other faces, using psychophysics and eye-tracking. This paradigm also enabled us to test the association between the psychophysical properties of self-face representation and the visual processing strategies involved in self-face recognition. Thirty-three adults performed a self-face recognition task on a series of self-other face morphs with simultaneous eye-tracking. Participants were found to look longer at the lower part of the face for self-face compared to other-face. Participants with a more distinct self-face representation, as indexed by a steeper slope of the psychometric response curve for self-face recognition, were found to look longer at the upper part of faces identified as ‘self’ vs. those identified as ‘other’. This result indicates that self-face representation can influence where we look when we process our own vs. others’ faces. We also investigated the association of autism-related traits with self-face processing metrics, since autism has previously been associated with atypical self-processing. The study did not find any self-face specific association with autistic traits, suggesting that autism-related features may be related to self-processing in a domain-specific manner.
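The "steeper slope of the psychometric response curve" mentioned in this abstract can be made concrete with a logistic psychometric function. The sketch below is illustrative only, not the study's analysis code: it evaluates the probability of a 'self' response across hypothetical morph levels for a shallow and a steep slope parameter, showing that a steeper slope corresponds to a sharper self/other category boundary (i.e., a more distinct self-face representation).

```python
import math

def psychometric(x: float, x0: float, k: float) -> float:
    """Logistic psychometric curve: probability of a 'self' response at
    morph level x (0 = fully other's face, 1 = fully own face).
    x0 is the point of subjective equality; k is the slope."""
    return 1.0 / (1.0 + math.exp(-k * (x - x0)))

# Evaluate two hypothetical observers across 11 morph levels (0.0 ... 1.0).
shallow = [psychometric(i / 10, x0=0.5, k=4.0) for i in range(11)]
steep = [psychometric(i / 10, x0=0.5, k=20.0) for i in range(11)]

# With the steeper slope, responses switch from 'other' to 'self' far more
# abruptly around the 50% morph -- the sense in which a steeper psychometric
# slope indexes a more distinct self-face representation.
```

Fitting `x0` and `k` to observed response proportions (e.g., by maximum likelihood) would recover the per-participant slope the study correlates with gaze behaviour.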

  19. Age-Group Differences in Interference from Young and Older Emotional Faces.

    Science.gov (United States)

    Ebner, Natalie C; Johnson, Marcia K

    2010-11-01

    Human attention is selective, focusing on some aspects of events at the expense of others. In particular, angry faces engage attention. Most studies have used pictures of young faces, even when comparing young and older age groups. Two experiments asked (1) whether task-irrelevant faces of young and older individuals with happy, angry, and neutral expressions disrupt performance on a face-unrelated task, (2) whether interference varies for faces of different ages and different facial expressions, and (3) whether young and older adults differ in this regard. Participants gave speeded responses on a number task while irrelevant faces appeared in the background. Both age groups were more distracted by own-age than other-age faces. In addition, young participants' responses were slower for angry than happy faces, whereas older participants' responses were slower for happy than angry faces. Factors underlying age-group differences in interference from emotional faces of different ages are discussed.

  20. Effects of facial emotion recognition remediation on visual scanning of novel face stimuli.

    Science.gov (United States)

    Marsh, Pamela J; Luckett, Gemma; Russell, Tamara; Coltheart, Max; Green, Melissa J

    2012-11-01

    Previous research shows that emotion recognition in schizophrenia can be improved with targeted remediation that draws attention to important facial features (eyes, nose, mouth). Moreover, the effects of training have been shown to last for up to one month after training. The aim of this study was to investigate whether improved emotion recognition of novel faces is associated with concomitant changes in visual scanning of these same novel facial expressions. Thirty-nine participants with schizophrenia received emotion recognition training using Ekman's Micro-Expression Training Tool (METT), with emotion recognition and visual scanpath (VSP) recordings to face stimuli collected simultaneously. Baseline ratings of interpersonal and cognitive functioning were also collected from all participants. Post-METT training, participants showed changes in foveal attention to the features of facial expressions of emotion not used in METT training, which were generally consistent with the information about important features from the METT. In particular, there were changes in how participants looked at the features of surprised, disgusted, fearful, happy, and neutral facial expressions, demonstrating that improved emotion recognition is paralleled by changes in the way participants with schizophrenia viewed novel facial expressions of emotion. However, there were overall decreases in foveal attention to sad and neutral faces, indicating that more intensive instruction might be needed for these faces during training. Most importantly, the evidence shows that participant gender may affect training outcomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Emotion processing facilitates working memory performance.

    Science.gov (United States)

    Lindström, Björn R; Bohlin, Gunilla

    2011-11-01

    The effect of emotional stimulus content on working memory performance has been investigated with conflicting results, as both emotion-dependent facilitation and impairments are reported in the literature. To clarify this issue, 52 adult participants performed a modified visual 2-back task with highly arousing positive stimuli (sexual scenes), highly arousing negative stimuli (violent death) and low-arousal neutral stimuli. Emotional stimulus processing was found to facilitate task performance relative to that of neutral stimuli, with regard to both response accuracy and reaction times. No emotion-dependent differences in false-alarm rates were found. These results indicate that emotional information can have a facilitating effect on working memory maintenance and processing of information.

  2. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits.

    Science.gov (United States)

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving bodies presented as patch-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to recognize: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better from static faces and fear from PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, and (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  3. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    Directory of Open Access Journals (Sweden)

    Rossana eActis-Grosso

    2015-10-01

    Full Text Available We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or conveyed by moving bodies (patch-light displays, PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  4. Transcutaneous vagus nerve stimulation (tVNS) enhances recognition of emotions in faces but not bodies.

    Science.gov (United States)

    Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S

    2018-02-01

    The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. A Shape-Based Account for Holistic Face Processing

    Science.gov (United States)

    Zhao, Mintao; Bülthoff, Heinrich H.; Bülthoff, Isabelle

    2016-01-01

    Faces are processed holistically, so selective attention to one face part without any influence of the others often fails. In this study, three experiments investigated what type of facial information (shape or surface) underlies holistic face processing and whether generalization of holistic processing to nonexperienced faces requires extensive…

  6. Holistic Processing of Faces: Perceptual and Decisional Components

    Science.gov (United States)

    Richler, Jennifer J.; Gauthier, Isabel; Wenger, Michael J.; Palmeri, Thomas J.

    2008-01-01

    Researchers have used several composite face paradigms to assess holistic processing of faces. In the selective attention paradigm, participants decide whether one face part (e.g., top) is the same as a previously seen face part. Their judgment is affected by whether the irrelevant part of the test face is the same as or different than the…

  7. Music-Elicited Emotion Identification Using Optical Flow Analysis of Human Face

    Science.gov (United States)

    Kniaz, V. V.; Smirnova, Z. N.

    2015-05-01

    Human emotion identification from image sequences is in high demand nowadays. The range of possible applications varies from the automatic smile shutter function of consumer grade digital cameras to Biofied Building technologies, which enable communication between building space and residents. The highly perceptual nature of human emotions leads to the complexity of their classification and identification. The main question arises from the subjective quality of emotional classification of events that elicit human emotions. A variety of methods for the formal classification of emotions were developed in musical psychology. This work focuses on the identification of human emotions evoked by musical pieces using human face tracking and optical flow analysis. A facial feature tracking algorithm used for facial feature speed and position estimation is presented. Facial features were extracted from each image sequence using human face tracking with local binary pattern (LBP) features. Accurate relative speeds of facial features were estimated using optical flow analysis. The obtained relative positions and speeds were used as the output facial emotion vector. The algorithm was tested using original software and recorded image sequences. The proposed technique provides robust identification of human emotions elicited by musical pieces. The estimated models could be used for human emotion identification from image sequences in such fields as emotion-based musical backgrounds or mood-dependent radio.
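As a rough illustration of the pipeline this abstract describes (track facial features, estimate their relative speeds, assemble a facial emotion vector), here is a minimal sketch. It is not the authors' implementation: in the actual method, positions come from LBP-based face tracking and speeds from optical flow analysis, whereas here the landmark names and per-frame coordinates are hypothetical inputs.

```python
import math
from typing import Dict, List, Tuple

def feature_vector(
    tracks: Dict[str, List[Tuple[float, float]]],  # feature -> (x, y) per frame
    fps: float,
) -> List[float]:
    """Concatenate each feature's final position and mean speed (px/s)."""
    vec: List[float] = []
    for name in sorted(tracks):  # fixed feature ordering
        pts = tracks[name]
        x, y = pts[-1]
        # per-frame displacement converted to pixels per second
        speeds = [
            math.hypot(x1 - x0, y1 - y0) * fps
            for (x0, y0), (x1, y1) in zip(pts, pts[1:])
        ]
        mean_speed = sum(speeds) / len(speeds) if speeds else 0.0
        vec.extend([x, y, mean_speed])
    return vec

# Hypothetical tracked landmarks over three frames at 25 fps.
tracks = {
    "mouth_left": [(10.0, 20.0), (11.0, 20.0), (13.0, 20.0)],
    "brow_mid": [(15.0, 5.0), (15.0, 5.0), (15.0, 5.0)],
}
vec = feature_vector(tracks, fps=25.0)
# vec = [15.0, 5.0, 0.0, 13.0, 20.0, 37.5]  (brow_mid first: sorted order)
```

A classifier trained on such vectors would then map them to emotion labels; the abstract does not specify the classifier, so that stage is omitted here.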

  8. Enhanced embodied response following ambiguous emotional processing.

    Science.gov (United States)

    Beffara, Brice; Ouellet, Marc; Vermeulen, Nicolas; Basu, Anamitra; Morisseau, Tiffany; Mermillod, Martial

    2012-08-01

    It has generally been assumed that high-level cognitive and emotional processes are based on amodal conceptual information. In contrast, however, "embodied simulation" theory states that the perception of an emotional signal can trigger a simulation of the related state in the motor, somatosensory, and affective systems. To study the effect of social context on the mimicry effect predicted by the "embodied simulation" theory, we recorded the electromyographic (EMG) activity of participants when looking at emotional facial expressions. We observed an increase in embodied responses when the participants were exposed to a context involving social valence before seeing the emotional facial expressions. An examination of the dynamic EMG activity induced by two socially relevant emotional expressions (namely joy and anger) revealed enhanced EMG responses of the facial muscles associated with the related social prime (either positive or negative). These results are discussed within the general framework of embodiment theory.

  9. Face-body integration of intense emotional expressions of victory and defeat.

    Directory of Open Access Journals (Sweden)

    Lili Wang

    Full Text Available Human facial expressions can be recognized rapidly and effortlessly. However, for intense emotions from real life, positive and negative facial expressions are difficult to discriminate and the judgment of facial expressions is biased towards simultaneously perceived body expressions. This study employed event-related potentials (ERPs) to investigate the neural dynamics involved in the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds, with either matched or mismatched emotional expressions in faces and bodies. Behavioral results showed that congruent emotional information of face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent compared to congruent stimuli. Also, a main effect of body valence on the P1 was observed, with enhanced amplitudes for the stimuli with losing compared to winning bodies. The main effect of body expression was also observed in N170 and N2, with winning bodies producing larger N170/N2 amplitudes. In the later stage, a significant interaction of congruence by body valence was found on the P3 component. Winning bodies elicited larger P3 amplitudes than losing bodies did when face and body conveyed congruent emotional signals. Beyond the knowledge based on prototypical facial and body expressions, the results of this study help us understand the complexity of emotion evaluation and categorization outside the laboratory.

  10. Amygdala hypersensitivity in response to emotional faces in Tourette's patients.

    Science.gov (United States)

    Neuner, Irene; Kellermann, Thilo; Stöcker, Tony; Kircher, Tilo; Habel, Ute; Shah, Jon N; Schneider, Frank

    2010-10-01

    Tourette's syndrome is characterised by motor and vocal tics as well as a high level of impulsivity and emotional dysregulation. Neuroimaging studies point to structural changes of the basal ganglia, prefrontal cortex and parts of the limbic system. However, there is no link between behavioural symptoms and the structural changes in the amygdala. One aspect of daily social interaction is the perception of emotional facial expressions, closely linked to amygdala function. We therefore investigated via fMRI the implicit discrimination of six emotional facial expressions in 19 adult Tourette's patients. In comparison to the healthy control group, Tourette's patients showed significantly higher amygdala activation, especially pronounced for fearful, angry and neutral expressions. The BOLD-activity of the left amygdala correlated negatively with the personality trait extraversion. We discuss these findings as a result of either deficient frontal inhibition due to structural changes or a desynchronization in the interaction of the cortico-striato-thalamo-cortical network within structures of the limbic system. Our data show an altered pattern of implicit emotion discrimination and emphasize the need to consider motor and non-motor symptoms in Tourette's syndrome in the choice of both behavioural and pharmacological treatment.

  11. Reduced amygdala and ventral striatal activity to happy faces in PTSD is associated with emotional numbing.

    Directory of Open Access Journals (Sweden)

    Kim L Felmingham

    Full Text Available There has been a growing recognition of the importance of reward processing in PTSD, yet little is known of the underlying neural networks. This study tested the predictions that (1) individuals with PTSD would display reduced responses to happy facial expressions in ventral striatal reward networks, and (2) this reduction would be associated with emotional numbing symptoms. 23 treatment-seeking patients with Posttraumatic Stress Disorder were recruited from the treatment clinic at the Centre for Traumatic Stress Studies, Westmead Hospital, and 20 trauma-exposed controls were recruited from a community sample. We examined functional magnetic resonance imaging responses during the presentation of happy and neutral facial expressions in a passive viewing task. PTSD participants rated happy facial expressions as less intense than trauma-exposed controls. Relative to controls, PTSD participants revealed lower activation to happy (minus neutral) faces in the ventral striatum and a trend for reduced activation in the left amygdala. A significant negative correlation was found between emotional numbing symptoms in PTSD and right ventral striatal regions after controlling for depression, anxiety and PTSD severity. This study provides initial evidence that individuals with PTSD have lower reactivity to happy facial expressions, and that lower activation in ventral striatal-limbic reward networks may be associated with symptoms of emotional numbing.

  12. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents

    Directory of Open Access Journals (Sweden)

    Bianca G. van den Bulk

    2016-10-01

    Full Text Available Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N = 25), adolescents with CSA-related PTSD (N = 19) and healthy controls (N = 26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD group reported more anxiety to fearful and neutral faces than adolescents from the control group and adolescents from the CSA-related PTSD group reacted slower compared to the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala.

  13. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents.

    Science.gov (United States)

    van den Bulk, Bianca G; Somerville, Leah H; van Hoof, Marie-José; van Lang, Natasja D J; van der Wee, Nic J A; Crone, Eveline A; Vermeiren, Robert R J M

    2016-10-01

    Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N=25), adolescents with CSA-related PTSD (N=19) and healthy controls (N=26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD group reported more anxiety to fearful and neutral faces than adolescents from the control group and adolescents from the CSA-related PTSD group reacted slower compared to the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Do bodily expressions compete with facial expressions? Time course of integration of emotional signals from the face and the body.

    Science.gov (United States)

    Gu, Yuanyuan; Mai, Xiaoqin; Luo, Yue-jia

    2013-01-01

    The decoding of social signals from nonverbal cues plays a vital role in the social interactions of socially gregarious animals such as humans. Because nonverbal emotional signals from the face and body are normally seen together, it is important to investigate the mechanism underlying the integration of emotional signals from these two sources. We conducted a study in which the time course of the integration of facial and bodily expressions was examined via analysis of event-related potentials (ERPs) while the focus of attention was manipulated. Distinctive integrating features were found during multiple stages of processing. In the first stage, threatening information from the body was extracted automatically and rapidly, as evidenced by enhanced P1 amplitudes when the subjects viewed compound face-body images with fearful bodies compared with happy bodies. In the second stage, incongruency between emotional information from the face and the body was detected and captured by N2. Incongruent compound images elicited larger N2s than did congruent compound images. The focus of attention modulated the third stage of integration. When the subjects' attention was focused on the face, images with congruent emotional signals elicited larger P3s than did images with incongruent signals, suggesting more sustained attention and elaboration of congruent emotional information extracted from the face and body. On the other hand, when the subjects' attention was focused on the body, images with fearful bodies elicited larger P3s than did images with happy bodies, indicating more sustained attention and elaboration of threatening information from the body during evaluative processes.

  15. Emotional facial expressions differentially influence predictions and performance for face recognition.

    Science.gov (United States)

    Nomi, Jason S; Rhodes, Matthew G; Cleary, Anne M

    2013-01-01

    This study examined how participants' predictions of future memory performance are influenced by emotional facial expressions. Participants made judgements of learning (JOLs) predicting the likelihood that they would correctly identify a face displaying a happy, angry, or neutral emotional expression in a future two-alternative forced-choice recognition test of identity (i.e., recognition that a person's face was seen before). JOLs were higher for studied faces with happy and angry emotional expressions than for neutral faces. However, neutral test faces with studied neutral expressions had significantly higher identity recognition rates than neutral test faces studied with happy or angry expressions. Thus, these data are the first to demonstrate that people believe happy and angry emotional expressions will lead to better identity recognition in the future relative to neutral expressions. This occurred despite the fact that neutral expressions elicited better identity recognition than happy and angry expressions. These findings contribute to the growing literature examining the interaction of cognition and emotion.

  16. Oxytocin Reduces Face Processing Time but Leaves Recognition Accuracy and Eye-Gaze Unaffected.

    Science.gov (United States)

    Hubble, Kelly; Daughters, Katie; Manstead, Antony S R; Rees, Aled; Thapar, Anita; van Goozen, Stephanie H M

    2017-01-01

    Previous studies have found that oxytocin (OXT) can improve the recognition of emotional facial expressions; it has been proposed that this effect is mediated by an increase in attention to the eye-region of faces. Nevertheless, evidence in support of this claim is inconsistent, and few studies have directly tested the effect of oxytocin on emotion recognition via altered eye-gaze. In a double-blind, within-subjects, randomized control experiment, 40 healthy male participants received 24 IU intranasal OXT and placebo in two identical experimental sessions separated by a 2-week interval. Visual attention to the eye-region was assessed on both occasions while participants completed a static facial emotion recognition task using medium intensity facial expressions. Although OXT had no effect on emotion recognition accuracy, recognition performance was improved because face processing was faster across emotions under the influence of OXT; this effect was marginally significant. OXT did not increase attention to the eye-region of faces, and attention was not related to recognition accuracy or face processing time. These findings suggest that OXT-induced enhanced facial emotion recognition is not necessarily mediated by an increase in attention to the eye-region of faces, as previously assumed. We discuss several methodological issues which may explain discrepant findings and suggest the effect of OXT on visual attention may differ depending on task requirements. (JINS, 2017, 23, 23-33).

  17. Neurofunctional Underpinnings of Audiovisual Emotion Processing in Teens with Autism Spectrum Disorders

    Science.gov (United States)

    Doyle-Thomas, Krissy A.R.; Goldberg, Jeremy; Szatmari, Peter; Hall, Geoffrey B.C.

    2013-01-01

    Despite successful performance on some audiovisual emotion tasks, hypoactivity has been observed in frontal and temporal integration cortices in individuals with autism spectrum disorders (ASD). Little is understood about the neurofunctional network underlying this ability in individuals with ASD. Research suggests that there may be processing biases in individuals with ASD, based on their ability to obtain meaningful information from the face and/or the voice. This functional magnetic resonance imaging study examined brain activity in teens with ASD (n = 18) and typically developing controls (n = 16) during audiovisual and unimodal emotion processing. Teens with ASD had a significantly lower accuracy when matching an emotional face to an emotion label. However, no differences in accuracy were observed between groups when matching an emotional voice or face-voice pair to an emotion label. In both groups brain activity during audiovisual emotion matching differed significantly from activity during unimodal emotion matching. Between-group analyses of audiovisual processing revealed significantly greater activation in teens with ASD in a parietofrontal network believed to be implicated in attention, goal-directed behaviors, and semantic processing. In contrast, controls showed greater activity in frontal and temporal association cortices during this task. These results suggest that in the absence of engaging integrative emotional networks during audiovisual emotion matching, teens with ASD may have recruited the parietofrontal network as an alternate compensatory system. PMID:23750139

  18. Face processing regions are sensitive to distinct aspects of temporal sequence in facial dynamics.

    Science.gov (United States)

    Reinl, Maren; Bartels, Andreas

    2014-11-15

    Facial movement conveys important information for social interactions, yet its neural processing is poorly understood. Computational models propose that shape- and temporal-sequence-sensitive mechanisms interact in processing dynamic faces. While face processing regions are known to respond to facial movement, their sensitivity to particular temporal sequences has barely been studied. Here we used fMRI to examine the sensitivity of human face-processing regions to two aspects of directionality in facial movement trajectories. We presented genuine movie recordings of increasing and decreasing fear expressions, each of which were played in natural or reversed frame order. This two-by-two factorial design matched low-level visual properties, static content and motion energy within each factor: emotion-direction (increasing or decreasing emotion) and timeline (natural versus artificial). The results showed sensitivity for emotion-direction in FFA, which was timeline-dependent as it only occurred within the natural frame order, and sensitivity to timeline in the STS, which was emotion-direction-dependent as it only occurred for decreased fear. The occipital face area (OFA) was sensitive to the factor timeline. These findings reveal interacting temporal sequence sensitive mechanisms that are responsive to both ecological meaning and to prototypical unfolding of facial dynamics. These mechanisms are temporally directional, provide socially relevant information regarding emotional state or naturalness of behavior, and agree with predictions from modeling and predictive coding theory. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  19. ERP Correlates of Target-Distracter Differentiation in Repeated Runs of a Continuous Recognition Task with Emotional and Neutral Faces

    Science.gov (United States)

    Treese, Anne-Cecile; Johansson, Mikael; Lindgren, Magnus

    2010-01-01

    The emotional salience of faces has previously been shown to induce memory distortions in recognition memory tasks. This event-related potential (ERP) study used repeated runs of a continuous recognition task with emotional and neutral faces to investigate emotion-induced memory distortions. In the second and third runs, participants made more…

  20. Is emotion recognition the only problem in ADHD? Effects of pharmacotherapy on face and emotion recognition in children with ADHD.

    Science.gov (United States)

    Demirci, Esra; Erdogan, Ayten

    2016-12-01

    The objectives of this study were to evaluate both face and emotion recognition, to detect differences among attention deficit and hyperactivity disorder (ADHD) subgroups, to identify effects of gender, and to assess the effects of methylphenidate and atomoxetine treatment on both face and emotion recognition in patients with ADHD. The study sample consisted of 41 male and 29 female patients, 8-15 years of age, who were diagnosed as having combined-type ADHD (N = 26), hyperactive/impulsive-type ADHD (N = 21), or inattentive-type ADHD (N = 23) but had not previously used any medication for ADHD, and 35 male and 25 female healthy individuals. Long-acting methylphenidate (OROS-MPH) was prescribed to 38 patients, whereas atomoxetine was prescribed to 32 patients. The Reading the Mind in the Eyes Test (RMET) and the Benton Face Recognition Test (BFRT) were administered to all participants before and after treatment. The patients with ADHD had a significantly lower number of correct answers on the child and adolescent RMET and on the BFRT than the healthy controls. Among the ADHD subtypes, the hyperactive/impulsive subtype had a lower number of correct answers on the RMET than the inattentive subtype, and a lower number of correct answers on the short and long forms of the BFRT than the combined and inattentive subtypes. Male and female patients with ADHD did not differ significantly with respect to the number of correct answers on the RMET and BFRT. The patients showed significant improvement on the RMET and BFRT after treatment with OROS-MPH or atomoxetine. Patients with ADHD have difficulties in face recognition as well as emotion recognition. Both OROS-MPH and atomoxetine affect emotion recognition. However, further studies on face and emotion recognition in ADHD are needed.

  1. Dissociating Face Identity and Facial Expression Processing Via Visual Adaptation

    Directory of Open Access Journals (Sweden)

    Hong Xu

    2012-10-01

    Full Text Available Face identity and facial expression are processed in two distinct neural pathways. However, most of the existing face adaptation literature studies them separately, despite the fact that they are two aspects of the same face. The current study conducted a systematic comparison between these two aspects by face adaptation, investigating how top- and bottom-half face parts contribute to the processing of face identity and facial expression. A real face (sad, “Adam”) and its two size-equivalent face parts (top- and bottom-half) were used as the adaptor in separate conditions. For face identity adaptation, the test stimuli were generated by morphing Adam's sad face with another person's sad face (“Sam”). For facial expression adaptation, the test stimuli were created by morphing Adam's sad face with his neutral face and morphing the neutral face with his happy face. In each trial, after exposure to the adaptor, observers indicated the perceived face identity or facial expression of the following test face via a key press. They were also tested in a baseline condition without adaptation. Results show that the top- and bottom-half faces each generated a significant face identity aftereffect. However, the aftereffect from top-half face adaptation was much larger than that from the bottom-half face. In contrast, only the bottom-half face generated a significant facial expression aftereffect. This dissociation of top- and bottom-half face adaptation suggests that face parts play different roles in face identity and facial expression. It thus provides further evidence for the distributed systems of face perception.

  2. Infants’ Temperament and Mothers’, and Fathers’ Depression Predict Infants’ Attention to Objects Paired with Emotional Faces

    NARCIS (Netherlands)

    Aktar, E.; Mandell, D.J.; de Vente, W.; Majdandžić, M.; Raijmakers, M.E.J.; Bögels, S.M.

    2016-01-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others’ emotional reactions to those stimuli, so called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze

  3. Identification of emotions in mixed disgusted-happy faces as a function of depressive symptom severity.

    Science.gov (United States)

    Sanchez, Alvaro; Romero, Nuria; Maurage, Pierre; De Raedt, Rudi

    2017-12-01

    Interpersonal difficulties are common in depression, but their underlying mechanisms are not yet fully understood. The role of depression in the identification of mixed emotional signals with a direct interpersonal value remains unclear. The present study aimed to clarify this question. A sample of 39 individuals reporting a broad range of depression levels completed an emotion identification task where they viewed faces expressing three emotional categories (100% disgusted and 100% happy faces, as well as their morphed 50% disgusted - 50% happy exemplars). Participants were asked to identify the corresponding depicted emotion as "clearly disgusted", "mixed", or "clearly happy". Higher depression levels were associated with lower identification of positive emotions in 50% disgusted - 50% happy faces. The study was conducted with an analogue sample reporting individual differences in subclinical depression levels. Further research must replicate these findings in a clinical sample and clarify whether differential emotional identification patterns emerge in depression for different mixed negative-positive emotions (sad-happy vs. disgusted-happy). Depression may account for a lower bias to perceive positive states when ambiguous states from others include subtle signals of social threat (i.e., disgust), leading to an under-perception of positive social signals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Reaction times and face discrimination with emotional content

    Directory of Open Access Journals (Sweden)

    ANA MARÍA MARTÍNEZ

    2002-07-01

    Full Text Available Sixty-two university students, divided into two groups with a mean age of 21.6 years for the women and 22 years for the men, participated in a study of visual reaction times (VRT) in face discrimination with emotional content, taking into account stimulus position (start, middle, end), emotional content (neutral, friendly, threatening), and the combinations of the stimuli. The women showed longer reaction times than the men in all experimental conditions. Reaction times were also longer, for both men and women, when the stimulus to be discriminated was located in the middle position.

  5. Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions.

    Science.gov (United States)

    Oberman, Lindsay M; Winkielman, Piotr; Ramachandran, Vilayanur S

    2007-01-01

    People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated assessed muscles, whereas the chew manipulation activated muscles only intermittently. Further, expressing happiness generated most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.

  6. Mixed emotions: Sensitivity to facial variance in a crowd of faces.

    Science.gov (United States)

    Haberman, Jason; Lee, Pegan; Whitney, David

    2015-01-01

    The visual system automatically represents summary information from crowds of faces, such as the average expression. This is a useful heuristic insofar as it provides critical information about the state of the world, not simply information about the state of one individual. However, the average alone is not sufficient for making decisions about how to respond to a crowd. The variance or heterogeneity of the crowd (the mixture of emotions) conveys information about the reliability of the average, essential for determining whether the average can be trusted. Despite its importance, the representation of variance within a crowd of faces has yet to be examined. This is addressed here in three experiments. In the first experiment, observers viewed a sample set of faces that varied in emotion, and then adjusted a subsequent set to match the variance of the sample set. To isolate variance as the summary statistic of interest, the average emotion of both sets was random. Results suggested that observers had information regarding crowd variance. The second experiment verified that this was indeed a uniquely high-level phenomenon, as observers were unable to derive the variance of an inverted set of faces as precisely as that of an upright set. The third experiment replicated and extended the first two experiments using the method of constant stimuli. Together, these results show that the visual system is sensitive to emergent information about the emotional heterogeneity, or ambivalence, in crowds of faces.
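    The variance-matching logic of the first experiment can be sketched numerically: if each face in a crowd is coded by its position on an emotion morph continuum (in arbitrary "emotional units"), the crowd's mean and variance are ordinary summary statistics of those positions. The sketch below is a minimal illustration under that assumed coding, not the authors' stimulus-generation code.

```python
import statistics

def crowd_stats(crowd):
    """Summary statistics for a crowd of faces, each coded as a
    position on an emotion morph continuum (arbitrary units)."""
    mean = statistics.fmean(crowd)
    variance = statistics.pvariance(crowd, mu=mean)
    return mean, variance

# Two hypothetical crowds with the same average expression
# but very different heterogeneity:
uniform_crowd = [10, 10, 10, 10]   # every face looks the same
mixed_crowd = [0, 5, 15, 20]       # same mean, mixed emotions

m1, v1 = crowd_stats(uniform_crowd)
m2, v2 = crowd_stats(mixed_crowd)
# Equal means, but the mixed crowd has far greater variance,
# signalling that its average is a less reliable summary.
```

    Two crowds can thus share an identical average expression while differing sharply in heterogeneity, which is exactly the information the average alone fails to convey.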

  7. Elevated responses to constant facial emotions in different faces in the human amygdala: an fMRI study of facial identity and expression

    Directory of Open Access Journals (Sweden)

    Weiller Cornelius

    2004-11-01

    Full Text Available Abstract Background Human faces provide important signals in social interactions by conveying two main types of information: individual identity and emotional expression. The ability to readily assess both the variability and consistency of emotional expressions in different individuals is central to one's interpretation of the immediate environment. A factorial design was used to systematically test the interaction of either constant or variable emotional expressions with constant or variable facial identities in areas involved in face processing, using functional magnetic resonance imaging. Results Previous studies suggest a predominant role of the amygdala in the assessment of emotional variability. Here we extend this view by showing that this structure activated to faces with changing identities that display constant emotional expressions. Within this condition, amygdala activation was dependent on the type and intensity of displayed emotion, with significant responses to fearful expressions and, to a lesser extent, to neutral and happy expressions. In contrast, the lateral fusiform gyrus showed a binary pattern of increased activation to changing stimulus features, while it was also differentially responsive to the intensity of displayed emotion when processing different facial identities. Conclusions These results suggest that the amygdala might serve to detect constant facial emotions in different individuals, complementing its established role in detecting emotional variability.

  8. The not face: A grammaticalization of facial expressions of emotion.

    Science.gov (United States)

    Benitez-Quiroz, C Fabian; Wilbur, Ronnie B; Martinez, Aleix M

    2016-05-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3-8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Emotion Words, Regardless of Polarity, Have a Processing Advantage over Neutral Words

    Science.gov (United States)

    Kousta, Stavroula-Thaleia; Vinson, David P.; Vigliocco, Gabriella

    2009-01-01

    Despite increasing interest in the interface between emotion and cognition, the role of emotion in cognitive tasks is unclear. According to one hypothesis, negative valence is more relevant for survival and is associated with a general slowdown of the processing of stimuli, due to a defense mechanism that freezes activity in the face of threat.…

  10. Cultural in-group advantage: emotion recognition in African American and European American faces and voices.

    Science.gov (United States)

    Wickline, Virginia B; Bailey, Wendy; Nowicki, Stephen

    2009-03-01

    The authors explored whether there were in-group advantages in emotion recognition of faces and voices by culture or geographic region. Participants were 72 African American students (33 men, 39 women), 102 European American students (30 men, 72 women), 30 African international students (16 men, 14 women), and 30 European international students (15 men, 15 women). The participants determined emotions in African American and European American faces and voices. Results showed an in-group advantage, sometimes by culture and less often by race, in recognizing facial and vocal emotional expressions. African international students were generally less accurate at interpreting American nonverbal stimuli than were European American, African American, and European international peers. Results suggest that, although partly universal, emotional expressions have subtle differences across cultures that persons must learn.

  11. Avoidant decision making in social anxiety: the interaction of angry faces and emotional responses

    Science.gov (United States)

    Pittig, Andre; Pawlikowski, Mirko; Craske, Michelle G.; Alpers, Georg W.

    2014-01-01

    Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis)advantageous to maximize overall gain. To create a decision conflict between approach of reward and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety. PMID:25324792

  12. Avoidant decision making in social anxiety: the interaction of angry faces and emotional responses.

    Science.gov (United States)

    Pittig, Andre; Pawlikowski, Mirko; Craske, Michelle G; Alpers, Georg W

    2014-01-01

    Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis)advantageous to maximize overall gain. To create a decision conflict between approach of reward and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety.

  13. Avoidant decision making in social anxiety: The interaction of angry faces and emotional responses

    Directory of Open Access Journals (Sweden)

    Andre ePittig

    2014-09-01

    Full Text Available Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis)advantageous to maximize overall gain. To create a decision conflict between approach of rewards and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety.

  14. How clients "change emotion with emotion": A programme of research on emotional processing.

    Science.gov (United States)

    Pascual-Leone, Antonio

    2018-03-01

    This paper reviews a body of research that has examined Pascual-Leone and Greenberg's sequential model of emotional processing or used its accompanying measure (the Classification of Affective Meaning States). Research from 24 studies using a plurality of methods examined process-outcome relationships from micro to macro levels of observation and builds support for emotional transformation as a possible causal mechanism of change in psychotherapy. A pooled sample of 310 clinical and 130 sub-clinical cases has been studied, reflecting the process of 7 different treatment approaches in addressing over 5 different presenting clinical problems (including depression, anxiety, relational trauma, and personality disorders). The initial findings on this model support the hypothesis that emotional transformation occurs in specific canonical sequences, and these show large effects in the prediction of positive treatment outcomes. This model is the first in the field of psychotherapy to show how non-linear temporal patterns of moment-by-moment process relate to the unfolding of increasingly larger changes to create good psychotherapy treatment outcomes. Finally, clinical application of the model is also considered as a template for case formulations focused on emotion. Clinical or methodological significance of this article: This review article examines research on a specific model of emotional processing. (i) Experiencing certain key emotions during psychotherapy seems to predict good treatment outcomes, at both the session and treatment levels. (ii) There is also evidence to suggest that these productive emotional experiences unfold in an ordered pattern. Moreover, (iii) support for this way of understanding emotional processing comes from a number of very different treatment approaches and for several kinds of major disorders.

  15. Neurocognitive mechanisms of gaze-expression interactions in face processing and social attention.

    Science.gov (United States)

    Graham, Reiko; Labar, Kevin S

    2012-04-01

    The face conveys a rich source of non-verbal information used during social communication. While research has revealed how specific facial channels such as emotional expression are processed, little is known about the prioritization and integration of multiple cues in the face during dyadic exchanges. Classic models of face perception have emphasized the segregation of dynamic vs. static facial features along independent information processing pathways. Here we review recent behavioral and neuroscientific evidence suggesting that within the dynamic stream, concurrent changes in eye gaze and emotional expression can yield early independent effects on face judgments and covert shifts of visuospatial attention. These effects are partially segregated within initial visual afferent processing volleys, but are subsequently integrated in limbic regions such as the amygdala or via reentrant visual processing volleys. This spatiotemporal pattern may help to resolve otherwise perplexing discrepancies across behavioral studies of emotional influences on gaze-directed attentional cueing. Theoretical explanations of gaze-expression interactions are discussed, with special consideration of speed-of-processing (discriminability) and contextual (ambiguity) accounts. Future research in this area promises to reveal the mental chronometry of face processing and interpersonal attention, with implications for understanding how social referencing develops in infancy and is impaired in autism and other disorders of social cognition. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Sad benefit in face working memory: an emotional bias of melancholic depression.

    Science.gov (United States)

    Linden, Stefanie C; Jackson, Margaret C; Subramanian, Leena; Healy, David; Linden, David E J

    2011-12-01

    Emotion biases feature prominently in cognitive theories of depression and are a focus of psychological interventions. However, there is presently no stable neurocognitive marker of altered emotion-cognition interactions in depression. One reason may be the heterogeneity of major depressive disorder. Our aim in the present study was to find an emotional bias that differentiates patients with melancholic depression from controls, and patients with melancholic from those with non-melancholic depression. We used a working memory paradigm for emotional faces, where two faces with angry, happy, neutral, sad or fearful expression had to be retained over one second. Twenty patients with melancholic depression, 20 age-, education- and gender-matched control participants and 20 patients with non-melancholic depression participated in the study. We analysed performance on the working memory task using signal detection measures. We found an interaction between group and emotion on working memory performance that was driven by the higher performance for sad faces compared to other categories in the melancholic group. We computed a measure of "sad benefit", which distinguished melancholic and non-melancholic patients with good sensitivity and specificity. However, replication studies and formal discriminant analysis will be needed in order to assess whether emotion bias in working memory may become a useful diagnostic tool to distinguish these two syndromes. Copyright © 2011 Elsevier B.V. All rights reserved.
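    The signal detection measures mentioned above are conventionally summarized as d′ (d-prime), the difference between the z-transformed hit and false-alarm rates. The sketch below uses only Python's standard library, and the rates are made-up illustrative numbers, not the study's data:

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """d' = z(hit rate) - z(false-alarm rate); higher = better discrimination."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical pattern mirroring a "sad benefit": better working-memory
# discrimination for sad faces than for another expression category.
sad_dprime = d_prime(hit_rate=0.85, fa_rate=0.15)
other_dprime = d_prime(hit_rate=0.70, fa_rate=0.30)
# sad_dprime exceeds other_dprime; chance performance
# (hit rate == false-alarm rate) gives d' = 0.
```

    In practice, extreme rates of 0 or 1 are first adjusted (e.g., by a log-linear correction), since the inverse normal is undefined at those values.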

  17. Virtual faces expressing emotions: an initial concomitant and construct validity study.

    Science.gov (United States)

    Joyal, Christian C; Jacob, Laurence; Cigna, Marie-Hélène; Guay, Jean-Pierre; Renaud, Patrice

    2014-01-01

    Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities for both fundamental and clinical research. For instance, virtual faces allow real-time human-computer feedback loops between physiological measures and the virtual agent. The goal of this study was to provide an initial assessment of the concomitant and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomaticus major and corrugator supercilii muscles), and regional gaze fixation latencies (eye and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Emotions expressed by each set of stimuli were similarly recognized by both men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times in the eye regions among male and female participants. Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-computer interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

  18. Attentional capture by emotional faces is contingent on attentional control settings

    DEFF Research Database (Denmark)

    Barratt, D.; Bundesen, Claus

    2012-01-01

    faster and attract more processing resources), responses to positive faces were slower when these were flanked by (response incompatible) negative faces as compared with positive or neutral faces, whereas responses to negative faces were unaffected by the identity of the flankers. Experiment 2...

  19. Emotion processing in preschoolers with autism spectrum disorders

    NARCIS (Netherlands)

    Zantinge, G.M.

    2018-01-01

    Children spend most of their days interacting with their social environment. Emotions form a large part of these interactions and vice versa social emotions become meaningful when interacting with others. Understanding the emotion processes that underlie successful social functioning is

  20. Dissociable neural effects of stimulus valence and preceding context during the inhibition of responses to emotional faces.

    Science.gov (United States)

    Schulz, Kurt P; Clerkin, Suzanne M; Halperin, Jeffrey M; Newcorn, Jeffrey H; Tang, Cheuk Y; Fan, Jin

    2009-09-01

    Socially appropriate behavior requires the concurrent inhibition of actions that are inappropriate in the context. This self-regulatory function requires an interaction of inhibitory and emotional processes that recruits brain regions beyond those engaged by either process alone. In this study, we isolated brain activity associated with response inhibition and emotional processing in 24 healthy adults using event-related functional magnetic resonance imaging (fMRI) and a go/no-go task that independently manipulated the context preceding no-go trials (i.e., the number of go trials) and the valence (i.e., happy, sad, or neutral) of the face stimuli used as trial cues. Parallel quadratic trends were seen in correct inhibitions on no-go trials preceded by increasing numbers of go trials and in associated activation for correct no-go trials in the inferior frontal gyrus pars opercularis, pars triangularis, and pars orbitalis, the temporoparietal junction, the superior parietal lobule, and temporal sensory association cortices. Conversely, the comparison of happy versus neutral faces and sad versus neutral faces revealed valence-dependent activation in the amygdala, anterior insula cortex, and posterior midcingulate cortex. Further, an interaction between inhibition and emotion was seen in valence-dependent variations in the quadratic trend in no-go activation in the right inferior frontal gyrus and left posterior insula cortex. These results suggest that the inhibition of responses to emotional cues involves the interaction of partly dissociable limbic and frontoparietal networks that encode emotional cues and use these cues to exert inhibitory control over the motor, attention, and sensory functions needed to perform the task, respectively. 2008 Wiley-Liss, Inc.

  1. The ties to unbind: Age-related differences in feature (un)binding in working memory for emotional faces

    Directory of Open Access Journals (Sweden)

    Didem ePehlivanoglu

    2014-04-01

    Full Text Available In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyperbinding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face's emotional expression (expression condition), or both the identity and expression of the face (binding condition). Both age groups performed more slowly and with lower accuracy in the expression condition than in the binding condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory, over and beyond age-related differences observed in perceptual processing (0-Back) and attention/short-term memory (1-Back). Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task and were further magnified in the 2-Back task, indicating independent age-related differences in attention/short-term memory and working memory. Pupil dilation data confirmed that the attention/short-term memory version of the task (1-Back) is more effortful for older adults than for younger adults.
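
    The three match/mismatch conditions of the N-Back task above can be sketched in a few lines. The `Face` fields and the stimulus stream are hypothetical; this only illustrates the comparison logic, not the original experiment code.

```python
from collections import namedtuple

# Hedged sketch of the N-Back match logic for the three conditions.
# The Face fields and stimulus stream are hypothetical illustrations.
Face = namedtuple("Face", ["identity", "expression"])

def is_match(stream, i, n, condition):
    """Does trial i match trial i-n on the condition-relevant features?"""
    if i < n:                                # no trial n steps back yet
        return False
    cur, prev = stream[i], stream[i - n]
    if condition == "identity":
        return cur.identity == prev.identity
    if condition == "expression":
        return cur.expression == prev.expression
    if condition == "binding":               # both features must repeat
        return cur == prev
    raise ValueError(f"unknown condition: {condition}")

stream = [Face("A", "happy"), Face("B", "angry"), Face("A", "angry")]
print(is_match(stream, 2, 2, "identity"))    # True: same identity 2 back
print(is_match(stream, 2, 2, "binding"))     # False: expression changed
```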

  2. Love withdrawal predicts electrocortical responses to emotional faces with performance feedback: a follow-up and extension.

    Science.gov (United States)

    Huffmeijer, Renske; Bakermans-Kranenburg, Marian J; Alink, Lenneke R A; van IJzendoorn, Marinus H

    2014-06-02

    Parental use of love withdrawal is thought to affect children's later psychological functioning because it creates a link between children's performance and relational consequences. In addition, recent studies have begun to show that experiences of love withdrawal also relate to the neural processing of socio-emotional information relevant to a performance-relational consequence link, and can moderate effects of oxytocin on social information processing and behavior. The current study follows up on our previous results by attempting to confirm and extend previous findings indicating that experiences of maternal love withdrawal are related to electrocortical responses to emotional faces presented with performance feedback. More maternal love withdrawal was related to enhanced early processing of facial feedback stimuli (reflected in more positive VPP amplitudes, confirming previous findings). However, attentional engagement with and processing of the stimuli at a later stage were diminished in those reporting higher maternal love withdrawal (reflected in less positive LPP amplitudes, diverging from previous findings). Maternal love withdrawal affects the processing of emotional faces presented with performance feedback differently in different stages of neural processing.

  3. Effects on automatic attention due to exposure to pictures of emotional faces while performing Chinese word judgment tasks.

    Science.gov (United States)

    Junhong, Huang; Renlai, Zhou; Senqi, Hu

    2013-01-01

    Two experiments were conducted to investigate the automatic processing of emotional facial expressions while performing low- or high-demand cognitive tasks under unattended conditions. In Experiment 1, 35 subjects performed low (judging the structure of Chinese words) and high (judging the tone of Chinese words) cognitive load tasks while exposed to unattended pictures of fearful, neutral, or happy faces. The results revealed that reaction time was slower and performance accuracy was higher in the low cognitive load task than in the high cognitive load task. Exposure to fearful faces resulted in significantly longer reaction times and lower accuracy than exposure to neutral faces on the low cognitive load task. In Experiment 2, 26 subjects performed the same word judgment tasks and their brain event-related potentials (ERPs) were measured for a period of 800 ms after the onset of the task stimulus. The amplitudes of the early ERP component around 176 ms (P2) elicited by unattended fearful faces over frontal-central-parietal recording sites were significantly larger than those elicited by unattended neutral faces during the word structure judgment task. Together, the findings of the two experiments indicated that unattended fearful faces captured significantly more attention resources than unattended neutral faces on a low cognitive load task, but not on a high cognitive load task. It was concluded that fearful faces could automatically capture attention if residual attention resources were available under the unattended condition.
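
    The ERP measure described (component amplitude around 176 ms) is conventionally computed as a mean amplitude over a fixed latency window. A minimal sketch, with an assumed 500 Hz sampling rate and random stand-in waveforms rather than real data:

```python
import numpy as np

# Hedged sketch of a mean-amplitude ERP measure around the P2 (~176 ms).
# Sampling rate, window bounds, and waveforms are assumptions for
# illustration; they are not taken from the study.
srate = 500                                  # Hz (assumed)
times_ms = np.arange(0, 800, 1000 // srate)  # 0-800 ms epoch, 2 ms steps
rng = np.random.default_rng(0)
erp_fearful = rng.standard_normal(len(times_ms))  # stand-in averaged ERP
erp_neutral = rng.standard_normal(len(times_ms))

def mean_amplitude(erp, t0=150, t1=200):
    """Average voltage in the [t0, t1) ms window spanning the P2."""
    win = (times_ms >= t0) & (times_ms < t1)
    return erp[win].mean()

# Positive values indicate a larger P2 to fearful than to neutral faces.
p2_effect = mean_amplitude(erp_fearful) - mean_amplitude(erp_neutral)
```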

  4. Investigating vulnerability to eating disorders: biases in emotional processing.

    Science.gov (United States)

    Pringle, A; Harmer, C J; Cooper, M J

    2010-04-01

    Biases in emotional processing and cognitions about the self are thought to play a role in the maintenance of eating disorders (EDs). However, little is known about whether these difficulties exist pre-morbidly and how they might contribute to risk. Female dieters (n=82) completed a battery of tasks designed to assess the processing of social cues (facial emotion recognition), cognitions about the self [Self-Schema Processing Task (SSPT)] and ED-specific cognitions about eating, weight and shape (emotional Stroop). The 26-item Eating Attitudes Test (EAT-26; Garner et al. 1982) was used to assess subclinical ED symptoms; this was used as an index of vulnerability within this at-risk group. Regression analyses showed that biases in the processing of both neutral and angry faces were predictive of our measure of vulnerability (EAT-26). In the self-schema task, biases in the processing of negative self descriptors previously found to be common in EDs predicted vulnerability. Biases in the processing of shape-related words on the Stroop task were also predictive; however, these biases were more important in dieters who also displayed biases in the self-schema task. We were also able to demonstrate that these biases are specific and separable from more general negative biases that could be attributed to depressive symptoms. These results suggest that specific biases in the processing of social cues, cognitions about the self, and also about eating, weight and shape information, may be important in understanding risk and preventing relapse in EDs.

  5. A Rapid Subcortical Amygdala Route for Faces Irrespective of Spatial Frequency and Emotion.

    Science.gov (United States)

    McFadyen, Jessica; Mermillod, Martial; Mattingley, Jason B; Halász, Veronika; Garrido, Marta I

    2017-04-05

    There is significant controversy over the existence and function of a direct subcortical visual pathway to the amygdala. It is thought that this pathway rapidly transmits low spatial frequency information to the amygdala independently of the cortex, and yet the directionality of this function has never been determined. We used magnetoencephalography to measure neural activity while human participants discriminated the gender of neutral and fearful faces filtered for low or high spatial frequencies. We applied dynamic causal modeling to demonstrate that the most likely underlying neural network consisted of a pulvinar-amygdala connection that was uninfluenced by spatial frequency or emotion, and a cortical-amygdala connection that conveyed high spatial frequencies. Crucially, data-driven neural simulations revealed a clear temporal advantage of the subcortical connection over the cortical connection in influencing amygdala activity. Thus, our findings support the existence of a rapid subcortical pathway that is nonselective in terms of the spatial frequency or emotional content of faces. We propose that the "coarseness" of the subcortical route may be better reframed as "generalized." SIGNIFICANCE STATEMENT The human amygdala coordinates how we respond to biologically relevant stimuli, such as threat or reward. It has been postulated that the amygdala first receives visual input via a rapid subcortical route that conveys "coarse" information, namely, low spatial frequencies. For the first time, the present paper provides direction-specific evidence from computational modeling that the subcortical route plays a generalized role in visual processing by rapidly transmitting raw, unfiltered information directly to the amygdala. This calls into question a widely held assumption across human and animal research that fear responses are produced faster by low spatial frequencies.
Our proposed mechanism suggests organisms quickly generate fear responses to a wide range

  6. Neural Temporal Dynamics of Facial Emotion Processing: Age Effects and Relationship to Cognitive Function

    Directory of Open Access Journals (Sweden)

    Xiaoyan Liao

    2017-06-01

    Full Text Available This study used event-related potentials (ERPs) to investigate the effects of age on the neural temporal dynamics of processing task-relevant facial expressions and their relationship to cognitive functions. Negative (sad, afraid, angry, and disgusted), positive (happy), and neutral faces were presented to 30 older and 31 young participants who performed a facial emotion categorization task. Behavioral and ERP indices of facial emotion processing were analyzed. An enhanced N170 for negative faces, in addition to an intact right-hemispheric N170 for positive faces, was observed in older adults relative to their younger counterparts. Moreover, older adults demonstrated an attenuated within-group N170 laterality effect for neutral faces, while younger adults showed the opposite pattern. Furthermore, older adults exhibited a sustained temporo-occipital negativity deflection over the time range of 200–500 ms post-stimulus, while young adults showed posterior positivity and subsequent emotion-specific frontal negativity deflections. In older adults, decreased accuracy for labeling negative faces was positively correlated with Montreal Cognitive Assessment scores, and accuracy for labeling neutral faces was negatively correlated with age. These findings suggest that older people may exert more effort in structural encoding for negative faces and that there are different response patterns for the categorization of different facial emotions. Cognitive functioning may be related to the facial emotion categorization deficits observed in older adults. This may not be attributable to positivity effects: it may represent a selective deficit in the processing of negative facial expressions in older adults.

  7. Origin of Emotion Effects on ERP Correlates of Emotional Word Processing: The Emotion Duality Approach.

    Science.gov (United States)

    Imbir, Kamil Konrad; Jarymowicz, Maria Teresa; Spustek, Tomasz; Kuś, Rafał; Żygierewicz, Jarosław

    2015-01-01

    We distinguish two evaluative systems which evoke automatic and reflective emotions. Automatic emotions are direct reactions to stimuli, whereas reflective emotions are always based on verbalized (and often abstract) criteria of evaluation. We conducted an electroencephalography (EEG) study in which 25 women were required to read and respond to emotional words which engaged either the automatic or the reflective system. Stimulus words were emotional (positive or negative) or neutral. We found an effect of valence on an early response with dipolar fronto-occipital topography; positive words evoked a higher-amplitude response than negative words. We also found that topographically specific differences in the amplitude of the late positive complex were related to the system involved in processing. Emotional stimuli engaging the automatic system were associated with significantly higher amplitudes in the left-parietal region; the response to neutral words was similar regardless of the system engaged. A different pattern of effects was observed in the central region: neutral stimuli engaging the reflective system evoked a higher-amplitude response, whereas there was no system effect for emotional stimuli. These differences could not be reduced to effects of differences between the arousing properties and concreteness of the words used as stimuli.

  8. The match-mismatch model of emotion processing styles and emotion regulation strategies in fibromyalgia.

    NARCIS (Netherlands)

    Geenen, R.; Ooijen-van der Linden, L. van; Lumley, M.A.; Bijlsma, J.W.J.; Middendorp, H. van

    2012-01-01

    OBJECTIVE: Individuals differ in their style of processing emotions (e.g., experiencing affects intensely or being alexithymic) and their strategy of regulating emotions (e.g., expressing or reappraising). A match-mismatch model of emotion processing styles and emotion regulation strategies is

  9. Sex differences in functional activation patterns revealed by increased emotion processing demands.

    Science.gov (United States)

    Hall, Geoffrey B C; Witelson, Sandra F; Szechtman, Henry; Nahmias, Claude

    2004-02-09

    Two [O(15)] PET studies assessed sex differences in regional brain activation during the recognition of emotional stimuli. Study I revealed that the recognition of emotion in visual faces resulted in bilateral frontal activation in women and unilateral right-sided activation in men. In Study II, the complexity of the emotional face task was increased through the addition of associated auditory emotional stimuli. Men again showed unilateral frontal activation, in this case on the left, whereas women did not show bilateral frontal activation but showed greater limbic activity. These results suggest that when processing broader cross-modal emotional stimuli, men engage more in associative cognitive strategies while women draw more on primary emotional references.

  10. The Impact of Early Bilingualism on Face Recognition Processes.

    Science.gov (United States)

    Kandel, Sonia; Burfin, Sabine; Méary, David; Ruiz-Tada, Elisa; Costa, Albert; Pascalis, Olivier

    2016-01-01

    Early linguistic experience has an impact on the way we decode audiovisual speech in face-to-face communication. The present study examined whether differences in visual speech decoding could be linked to a broader difference in face processing. To identify a phoneme we have to do an analysis of the speaker's face to focus on the relevant cues for speech decoding (e.g., locating the mouth with respect to the eyes). Face recognition processes were investigated through two classic effects in face recognition studies: the Other-Race Effect (ORE) and the Inversion Effect. Bilingual and monolingual participants did a face recognition task with Caucasian faces (own race), Chinese faces (other race), and cars that were presented in an Upright or Inverted position. The results revealed that monolinguals exhibited the classic ORE. Bilinguals did not. Overall, bilinguals were slower than monolinguals. These results suggest that bilinguals' face processing abilities differ from monolinguals'. Early exposure to more than one language may lead to a perceptual organization that goes beyond language processing and could extend to face analysis. We hypothesize that these differences could be due to the fact that bilinguals focus on different parts of the face than monolinguals, making them more efficient in other race face processing but slower. However, more studies using eye-tracking techniques are necessary to confirm this explanation.

  11. The impact of early bilingualism on face recognition processes

    Directory of Open Access Journals (Sweden)

    Sonia Kandel

    2016-07-01

    Full Text Available Early linguistic experience has an impact on the way we decode audiovisual speech in face-to-face communication. The present study examined whether differences in visual speech decoding could be linked to a broader difference in face processing. To identify a phoneme we have to do an analysis of the speaker’s face to focus on the relevant cues for speech decoding (e.g., locating the mouth with respect to the eyes). Face recognition processes were investigated through two classic effects in face recognition studies: the Other-Race Effect (ORE) and the Inversion Effect. Bilingual and monolingual participants did a face recognition task with Caucasian faces (own race), Chinese faces (other race), and cars that were presented in an Upright or Inverted position. The results revealed that monolinguals exhibited the classic ORE. Bilinguals did not. Overall, bilinguals were slower than monolinguals. These results suggest that bilinguals’ face processing abilities differ from monolinguals’. Early exposure to more than one language may lead to a perceptual organization that goes beyond language processing and could extend to face analysis. We hypothesize that these differences could be due to the fact that bilinguals focus on different parts of the face than monolinguals, making them more efficient in other-race face processing but slower. However, more studies using eye-tracking techniques are necessary to confirm this explanation.

  12. Age-related perspectives and emotion processing.

    Science.gov (United States)

    Lynchard, Nicholas A; Radvansky, Gabriel A

    2012-12-01

    Emotion is processed differently in younger and older adults. Older adults show a positivity effect, whereas younger adults show a negativity effect. Socioemotional selectivity theory suggests that these effects can be elicited in any age group when age-related perspectives are manipulated. To examine this, younger and older adults were oriented to actual and age-contrasting possible selves. Emotion activations were assessed using lexical decision. In line with socioemotional selectivity theory, shifts in emotion orientation varied according to perspective, with both younger and older adults showing a negativity effect when a younger adult perspective was taken and a positivity effect when an older adult perspective was taken. 2013 APA, all rights reserved

  13. Emotional Faces in Context: Age Differences in Recognition Accuracy and Scanning Patterns

    Science.gov (United States)

    Noh, Soo Rim; Isaacowitz, Derek M.

    2014-01-01

    While age-related declines in facial expression recognition are well documented, previous research relied mostly on isolated faces devoid of context. We investigated the effects of context on age differences in recognition of facial emotions and in visual scanning patterns of emotional faces. While their eye movements were monitored, younger and older participants viewed facial expressions (i.e., anger, disgust) in contexts that were emotionally congruent, incongruent, or neutral to the facial expression to be identified. Both age groups had highest recognition rates of facial expressions in the congruent context, followed by the neutral context, and recognition rates in the incongruent context were worst. These context effects were more pronounced for older adults. Compared to younger adults, older adults exhibited a greater benefit from congruent contextual information, regardless of facial expression. Context also influenced the pattern of visual scanning characteristics of emotional faces in a similar manner across age groups. In addition, older adults initially attended more to context overall. Our data highlight the importance of considering the role of context in understanding emotion recognition in adulthood. PMID:23163713

  14. The time course of attentional modulation on emotional conflict processing.

    Science.gov (United States)

    Zhou, Pingyan; Yang, Guochun; Nan, Weizhi; Liu, Xun

    2016-01-01

    Cognitive conflict resolution is critical to human survival in a rapidly changing environment, but emotional conflict processing seems to be particularly important for human interactions. This study examined whether the time course of attentional modulation on emotional conflict processing differed from that of cognitive conflict processing during a flanker task. Results showed that emotional N200 and P300 effects, similar to those in colour conflict processing, appeared only during the relevant task. However, the emotional N200 effect preceded the colour N200 effect, indicating that emotional conflict can be identified earlier than cognitive conflict. Additionally, a significant emotional N100 effect revealed that emotional valence differences could be perceived during early processing based on coarse aspects of the input. The present data suggest that emotional conflict processing is modulated by top-down attention, similar to cognitive conflict processing (reflected by the N200 and P300 effects), but appears to hold a temporal advantage at two different processing stages.

  15. Emotional Face Identification in Youths with Primary Bipolar Disorder or Primary Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Seymour, Karen E.; Pescosolido, Matthew F.; Reidy, Brooke L.; Galvan, Thania; Kim, Kerri L.; Young, Matthew; Dickstein, Daniel P.

    2013-01-01

    Objective: Bipolar disorder (BD) and attention-deficit/hyperactivity disorder (ADHD) are often comorbid or confounded; therefore, we evaluated emotional face identification to better understand brain/behavior interactions in children and adolescents with either primary BD, primary ADHD, or typically developing controls (TDC). Method: Participants…

  16. Ratings of Emotion in Laterally Presented Faces: Sex and handedness effects

    NARCIS (Netherlands)

    van Strien, J.W.; van Beek, S.

    2000-01-01

    Sixteen right-handed participants (8 male and 8 female students) and 16 left-handed participants (8 male and 8 female students) were presented with cartoon faces expressing emotions ranging from extremely positive to extremely negative. A forced-choice paradigm was used in which the participants

  17. Mineralocorticoid receptor haplotype, oral contraceptives and emotional information processing.

    Science.gov (United States)

    Hamstra, D A; de Kloet, E R; van Hemert, A M; de Rijk, R H; Van der Does, A J W

    2015-02-12

    Oral contraceptives (OCs) affect mood in some women and may have more subtle effects on emotional information processing in many more users. Female carriers of mineralocorticoid receptor (MR) haplotype 2 have been shown to be more optimistic and less vulnerable to depression. To investigate the effects of oral contraceptives on emotional information processing and a possible moderating effect of MR haplotype. Cross-sectional study in 85 healthy premenopausal women of West-European descent. We found significant main effects of oral contraceptives on facial expression recognition, emotional memory and decision-making. Furthermore, carriers of MR haplotype 1 or 3 were sensitive to the impact of OCs on the recognition of sad and fearful faces and on emotional memory, whereas MR haplotype 2 carriers were not. Different compounds of OCs were included. No hormonal measures were taken. Most naturally cycling participants were assessed in the luteal phase of their menstrual cycle. Carriers of MR haplotype 2 may be less sensitive to depressogenic side-effects of OCs. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  18. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery.

    Science.gov (United States)

    Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T

    2012-12-01

    Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. 
However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize

  19. Double attention bias for positive and negative emotional faces in clinical depression: evidence from an eye-tracking study.

    Science.gov (United States)

    Duque, Almudena; Vázquez, Carmelo

    2015-03-01

    According to cognitive models, attentional biases in depression play key roles in the onset and subsequent maintenance of the disorder. The present study examines the processing of emotional facial expressions (happy, angry, and sad) in depressed and non-depressed adults. Sixteen unmedicated patients with Major Depressive Disorder (MDD) and 34 never-depressed controls (ND) completed an eye-tracking task to assess different components of visual attention (orienting of attention and maintenance of attention) in the processing of emotional faces. Compared to ND participants, participants with MDD showed a negative attentional bias in attentional maintenance indices (i.e. first fixation duration and total fixation time) for sad faces. This attentional bias was positively associated with the severity of depressive symptoms. Furthermore, the MDD group spent marginally less time viewing happy faces than the ND group. No differences were found between the groups with respect to angry faces or orienting attention indices. The current study is limited by its cross-sectional design. These results support the notion that attentional biases in depression are specific to depression-related information and that they operate in later stages of the deployment of attention. Copyright © 2014 Elsevier Ltd. All rights reserved.
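
    The two attentional maintenance indices named above (first fixation duration and total fixation time) can be computed from an ordered fixation log. The fixation records and area-of-interest labels here are invented for illustration:

```python
# Hedged sketch of the two maintenance-of-attention indices: first
# fixation duration and total fixation time per area of interest (AOI).
# The fixation log and AOI labels are invented for illustration.
fixations = [                     # (aoi, duration_ms), in temporal order
    ("sad", 310), ("happy", 190), ("sad", 420), ("angry", 150),
]

def first_fixation_duration(fixations, aoi):
    """Duration of the first fixation landing on the given AOI (0 if none)."""
    return next((d for a, d in fixations if a == aoi), 0)

def total_fixation_time(fixations, aoi):
    """Summed duration of all fixations on the given AOI."""
    return sum(d for a, d in fixations if a == aoi)

print(first_fixation_duration(fixations, "sad"))  # 310
print(total_fixation_time(fixations, "sad"))      # 730
```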

  20. Infants' Temperament and Mothers' and Fathers' Depression Predict Infants' Attention to Objects Paired with Emotional Faces

    OpenAIRE

    Aktar, Evin; Mandell, Dorothy J.; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E. J.; Bögels, Susan M.

    2015-01-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others’ emotional reactions to those stimuli, so-called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction effects on infants’ attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations ...

  1. Increased amygdala responses to emotional faces after psilocybin for treatment-resistant depression.

    Science.gov (United States)

    Roseman, Leor; Demetriou, Lysia; Wall, Matthew B; Nutt, David J; Carhart-Harris, Robin L

    2017-12-27

    Recent evidence indicates that psilocybin with psychological support may be effective for treating depression. Some studies have found that patients with depression show heightened amygdala responses to fearful faces, and there is reliable evidence that treatment with SSRIs attenuates amygdala responses (Ma, 2015). We hypothesised that amygdala responses to emotional faces would be altered post-treatment with psilocybin. In this open-label study, 20 individuals diagnosed with moderate to severe, treatment-resistant depression underwent two separate dosing sessions with psilocybin. Psychological support was provided before, during and after these sessions, and 19 completed fMRI scans one week prior to the first session and one day after the second and last. Neutral, fearful and happy faces were presented in the scanner and analyses focused on the amygdala. Group results revealed rapid and enduring improvements in depressive symptoms post psilocybin. Increased responses to fearful and happy faces were observed in the right amygdala post-treatment, and right amygdala increases to fearful versus neutral faces were predictive of clinical improvements at 1 week. Psilocybin with psychological support was associated with increased amygdala responses to emotional stimuli, an opposite effect to previous findings with SSRIs. This suggests fundamental differences in these treatments' therapeutic actions, with SSRIs mitigating negative emotions and psilocybin allowing patients to confront and work through them. Based on the present results, we propose that psilocybin with psychological support is a treatment approach that potentially revives emotional responsiveness in depression, enabling patients to reconnect with their emotions. ISRCTN, number ISRCTN14426797. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Chess masters show a hallmark of face processing with chess.

    Science.gov (United States)

    Boggan, Amy L; Bartlett, James C; Krawczyk, Daniel C

    2012-02-01

    Face processing has several distinctive hallmarks that researchers have attributed either to face-specific mechanisms or to extensive experience distinguishing faces. Here, we examined the face-processing hallmark of selective attention failure--as indexed by the congruency effect in the composite paradigm--in a domain of extreme expertise: chess. Among 27 experts, we found that the congruency effect was equally strong with chessboards and faces. Further, comparing these experts with recreational players and novices, we observed a trade-off: Chess expertise was positively related to the congruency effect with chess yet negatively related to the congruency effect with faces. These and other findings reveal a case of expertise-dependent, facelike processing of objects of expertise and suggest that face and expert-chess recognition share common processes.
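
    The congruency effect indexing selective-attention failure in the composite paradigm is typically the difference between mean incongruent and congruent response times. A minimal sketch with invented reaction times:

```python
# Hedged sketch of the composite-paradigm congruency effect: the cost in
# mean response time when the irrelevant half is incongruent. The RT
# values are invented for illustration.
rt_ms = {
    "congruent": [612, 598, 640],
    "incongruent": [655, 671, 689],
}

def mean(xs):
    return sum(xs) / len(xs)

# A positive effect indexes a failure of selective attention.
congruency_effect = mean(rt_ms["incongruent"]) - mean(rt_ms["congruent"])
print(round(congruency_effect, 1))  # 55.0 ms with these invented values
```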

  3. The Way Dogs (Canis familiaris) Look at Human Emotional Faces Is Modulated by Oxytocin. An Eye-Tracking Study

    Directory of Open Access Journals (Sweden)

    Anna Kis

    2017-10-01

    Full Text Available Dogs have been shown to excel in reading human social cues, including facial cues. In the present study we used eye-tracking technology to further study dogs' face processing abilities. It was found that dogs discriminated between human facial regions in their spontaneous viewing pattern and looked most at the eye region, independently of facial expression. Furthermore, dogs paid most attention to the first two images presented; afterwards their attention dramatically decreased, a finding that has methodological implications. Increasing evidence indicates that the oxytocin system is involved in dogs' human-directed social competence, so as a next step we investigated the effects of oxytocin on the processing of human facial emotions. It was found that oxytocin decreased dogs' looking at human faces expressing anger. More interestingly, however, after oxytocin pre-treatment dogs' preferential gaze toward the eye region when processing happy human facial expressions disappeared. These results provide the first evidence that oxytocin is involved in the regulation of human face processing in dogs. The present study is one of the few empirical investigations to explore eye-gaze patterns in naïve and untrained pet dogs using a non-invasive eye-tracking technique, and it thus offers a unique but largely untapped method for studying social cognition in dogs.

  4. The special status of sad infant faces: age and valence differences in adults' cortical face processing.

    Science.gov (United States)

    Colasante, Tyler; Mossad, Sarah I; Dudek, Joanna; Haley, David W

    2017-04-01

    Understanding the relative and joint prioritization of age- and valence-related face characteristics in adults' cortical face processing remains elusive because these two characteristics have not been manipulated in a single study of neural face processing. We used electroencephalography to investigate adults' P1, N170, P2 and LPP responses to infant and adult faces with happy and sad facial expressions. Viewing infant vs adult faces was associated with significantly larger P1, N170, P2 and LPP responses, with hemisphere and/or participant gender moderating this effect in select cases. Sad faces were associated with significantly larger N170 responses than happy faces. Sad infant faces were associated with significantly larger N170 responses in the right hemisphere than all other combinations of face age and face valence characteristics. We discuss the relative and joint neural prioritization of infant face characteristics and negative facial affect, and their biological value as distinct caregiving and social cues. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  5. Enhanced 3D face processing using an active vision system

    DEFF Research Database (Denmark)

    Lidegaard, Morten; Larsen, Rasmus; Kraft, Dirk

    2014-01-01

    We present an active face processing system based on 3D shape information extracted by means of stereo. We use two sets of stereo cameras with different fields of view (FOV): one with a wide FOV is used for face tracking, while the other with a narrow FOV is used for face identification...

  6. An exploration of emotional protection and regulation in nurse-patient interactions: The role of the professional face and the emotional mirror.

    Science.gov (United States)

    Cecil, Penelope; Glass, Nel

    2015-01-01

    While interpersonal styles of nurse-patient communication have become more relaxed in recent years, nurses remain challenged in emotional engagement with patients and other health professionals. In order to preserve a professional distance, however slight, in patient care delivery, nurses need to be able to regulate their emotions. This research aimed to investigate nurses' perceptions of emotional protection and regulation in patient care delivery. A qualitative approach was used for the study, utilising in-depth semi-structured interviews and researcher reflective journaling. Participants were drawn from rural New South Wales. Following institutional ethics approval, 5 nurses were interviewed and reflective journaling commenced. The interviews and the reflective journal were transcribed verbatim. The results revealed that nurses' emotional regulation, demonstrated by a 'professional face', was an important strategy to enable delivery of quality care even though it resulted in emotional containment. Such regulation was a protective mechanism employed to look after self and was critical in situations of emotional dissonance. The results also showed that nurses experience emotional dissonance in situations where they have unresolved personal emotional issues, and the latter was an individual motivator to manage emotions in the workplace. Emotions play a pivotal role within nurse-patient relationships. The professional face can be recognised as contributing to emotional health and therefore maintaining the emotional health of nurses in practice. This study foregrounds the importance of regulating emotions and nurturing nurses' emotional health in contemporary practice.

  7. Behavioral assessment of emotional and motivational appraisal during visual processing of emotional scenes depending on spatial frequencies.

    Science.gov (United States)

    Fradcourt, B; Peyrin, C; Baciu, M; Campagne, A

    2013-10-01

    Previous studies performed on visual processing of emotional stimuli have revealed preference for a specific type of visual spatial frequencies (high spatial frequency, HSF; low spatial frequency, LSF) according to task demands. The majority of studies used a face and focused on the appraisal of the emotional state of others. The present behavioral study investigates the relative role of spatial frequencies on processing emotional natural scenes during two explicit cognitive appraisal tasks, one emotional, based on the self-emotional experience and one motivational, based on the tendency to action. Our results suggest that HSF information was the most relevant to rapidly identify the self-emotional experience (unpleasant, pleasant, and neutral) while LSF was required to rapidly identify the tendency to action (avoidance, approach, and no action). The tendency to action based on LSF analysis showed a priority for unpleasant stimuli whereas the identification of emotional experience based on HSF analysis showed a priority for pleasant stimuli. The present study confirms the interest of considering both emotional and motivational characteristics of visual stimuli. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Facing Complaining Customer and Suppressed Emotion at Worksite Related to Sleep Disturbance in Korea.

    Science.gov (United States)

    Lim, Sung Shil; Lee, Wanhyung; Hong, Kwanyoung; Jeung, Dayee; Chang, Sei Jin; Yoon, Jin Ha

    2016-11-01

    This study aimed to investigate the effect of facing complaining customers and suppressing emotions at the worksite on sleep disturbance among the working population. We enrolled 13,066 paid workers (male = 6,839, female = 6,227) from the Working Conditions Survey (2011). The odds ratios (OR) and 95% confidence intervals (CI) for the occurrence of sleep disturbance were calculated using multiple logistic regression models. Workers who always engaged complaining customers had a significantly higher risk of sleep disturbance than those who rarely did (OR [95% CI]: 5.46 [3.43-8.68] in male and 5.59 [3.30-9.46] in female workers). The ORs (95% CI) for sleep disturbance were 1.78 (1.16-2.73) and 1.63 (1.02-2.63) for male and female workers, respectively, who always suppressed their emotions at the workplace, compared with those who rarely did. Compared with those who both rarely engaged complaining customers and rarely suppressed their emotions at work, the ORs (95% CI) for sleep disturbance were 9.66 (4.34-20.80) and 10.17 (4.46-22.07) for men and women, respectively, always exposed to both factors. Sleep disturbance was affected by the interaction of the two emotional demands (engaging complaining customers and suppressing emotions at the workplace). The level of emotional demand, including engaging complaining customers and suppressing emotions at the workplace, is significantly associated with sleep disturbance among the Korean working population.
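
    The odds ratios and Wald-type 95% confidence intervals reported above can be illustrated with a crude 2×2-table computation. This is a simplification: the study fitted multiple logistic regression models with covariate adjustment, which this sketch does not reproduce, and the counts below are invented.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a/b = cases/non-cases among the exposed group,
    c/d = cases/non-cases among the unexposed group."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper

# Invented counts: 20/80 disturbed/undisturbed sleepers among workers
# "always" facing complaints, 10/160 among those "rarely" facing them.
print(odds_ratio_ci(20, 80, 10, 160))  # OR = 4.0 with its Wald CI
```

    An adjusted model, as used in the paper, would instead exponentiate the fitted regression coefficient and its interval bounds.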

  9. Training Approach-Avoidance of Smiling Faces Affects Emotional Vulnerability in Socially Anxious Individuals

    Directory of Open Access Journals (Sweden)

    Mike eRinck

    2013-08-01

    Full Text Available Previous research revealed an automatic behavioral bias in high socially anxious individuals (HSAs): although their explicit evaluations of smiling faces are positive, they show automatic avoidance of these faces. This is reflected by faster pushing than pulling of smiling faces in an Approach-Avoidance Task (AAT; Heuer, Rinck, & Becker, 2007). The current study addressed the causal role of this avoidance bias for social anxiety. To this end, we used the AAT to train HSAs, either to approach smiling faces or to avoid them. We examined whether such an AAT training could change HSAs' automatic avoidance tendencies, and if yes, whether AAT effects would generalize to a new approach task with new facial stimuli, and to mood and anxiety in a social threat situation (a video-recorded self-presentation). We found that HSAs trained to approach smiling faces did indeed approach female faces faster after the training than HSAs trained to avoid smiling faces. Moreover, approach-faces training reduced emotional vulnerability: it led to more positive mood and lower anxiety after the self-presentation than avoid-faces training. These results suggest that automatic approach-avoidance tendencies have a causal role in social anxiety, and that they can be modified by a simple computerized training. This may open new avenues in the therapy of social phobia.

  11. ERP evidence for own-age effects on late stages of processing sad faces.

    Science.gov (United States)

    Fölster, Mara; Werheid, Katja

    2016-08-01

    Faces convey important information on interaction partners, such as their emotional state and age. Faces of the same age are, according to recent research, preferentially processed. The aim of the present study was to investigate whether the neural processes underlying this own-age effect are influenced by the emotional expression of the face, and to explore possible explanations such as the frequency or quality of contact to own-age versus other-age groups. Event-related potentials were recorded while 19 younger (18-30 years) and 19 older (64-86 years) observers watched younger and older sad and happy faces. Sad but not happy faces elicited higher late positive potential amplitudes for own-age than for other-age faces. This own-age effect was significant for older, but not for younger, observers, and correlated with the quality of contact with the own-age versus the other-age group. This pattern suggests that sad own-age faces are motivationally more relevant.

  12. The relationship of positive and negative expressiveness to the processing of emotion information.

    Science.gov (United States)

    Knyazev, Gennady G; Barchard, Kimberly A; Razumnikova, Olga M; Mitrofanova, Larisa G

    2012-06-01

    The tendency to express emotions non-verbally is positively related to perception of emotions in oneself. This study examined its relationship to perception of emotions in others. In 40 healthy adults, EEG theta synchronization was used to indicate emotion processing following presentation of happy, angry, and neutral faces. Both positive and negative expressiveness were associated with higher emotional sensitivity, as shown by cortical responses to facial expressions during the early, unconscious processing stage. At the late, conscious processing stage, positive expressiveness was associated with higher sensitivity to happy faces but lower sensitivity to angry faces. Thus, positive expressiveness predisposes people to allocate fewer attentional resources for conscious perception of angry faces. In contrast, negative expressiveness was consistently associated with higher sensitivity. The effects of positive expressiveness occurred in cortical areas that deal with emotions, but the effects of negative expressiveness occurred in areas engaged in self-referential processes in the context of social relationships. © 2012 The Authors. Scandinavian Journal of Psychology © 2012 The Scandinavian Psychological Associations.

  13. Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study.

    Science.gov (United States)

    Xu, Qianru; Ruohonen, Elisa M; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia

    2018-01-01

    It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been studied. We used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck's Depression Inventory ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, response amplitudes being larger for sad faces than happy faces. Group differences were found for the M300, and they were indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than happy deviant faces, reflecting negative bias in deviance detection, which was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit in dysphoria.
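
    The vMMN analysed above is conventionally quantified as a deviant-minus-standard difference wave, with amplitude then averaged over a time window. A minimal sketch of that subtraction, not the authors' MEG pipeline; the toy samples and window are invented:

```python
def difference_wave(deviant, standard):
    """vMMN-style difference wave: deviant response minus
    standard response, sample by sample."""
    return [d - s for d, s in zip(deviant, standard)]

def mean_amplitude(wave, start, stop):
    """Mean amplitude over the sample window [start, stop)."""
    window = wave[start:stop]
    return sum(window) / len(window)

# Toy evoked responses (arbitrary units, 4 samples each).
diff = difference_wave([1.0, 2.5, 3.0, 1.5], [1.0, 1.5, 1.0, 0.5])
print(mean_amplitude(diff, 1, 4))  # mean of samples 1-3 of the difference
```

    In practice the same subtraction is applied to baseline-corrected, averaged sensor data rather than toy lists.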

  14. Face-sensitive processes one hundred milliseconds after picture onset

    Directory of Open Access Journals (Sweden)

    Benjamin eDering

    2011-09-01

    Full Text Available The human face is the most studied object category in visual neuroscience. In a quest for markers of face processing, event-related potential (ERP) studies have debated whether two peaks of activity, P1 and N170, are category-selective. Whilst most studies have used photographs of unaltered images of faces, others have used cropped faces in an attempt to reduce the influence of features surrounding the face-object sensu stricto. However, results from studies comparing cropped faces with unaltered objects from other categories are inconsistent with results from studies comparing whole faces and objects. Here, we recorded ERPs elicited by full-front views of faces and cars, either unaltered or cropped. We found that cropping artificially enhanced the N170, whereas it did not significantly modulate P1. In a second experiment, we compared faces and butterflies, either unaltered or cropped, matched for size and luminance across conditions, and within a narrow contrast bracket. Results of experiment 2 replicated the main findings of experiment 1. We then used face-car morphs in a third experiment to manipulate the perceived face-likeness of stimuli (100% face, 70% face and 30% car, 30% face and 70% car, or 100% car), and the N170 failed to differentiate between faces and cars. Critically, in all three experiments, P1 amplitude was modulated in a face-sensitive fashion independent of cropping or morphing. Therefore, P1 is a reliable event sensitive to face processing as early as 100 ms after picture onset.

  15. KDEF-PT: Valence, Emotional Intensity, Familiarity and Attractiveness Ratings of Angry, Neutral, and Happy Faces.

    Science.gov (United States)

    Garrido, Margarida V; Prada, Marília

    2017-01-01

    The Karolinska Directed Emotional Faces (KDEF) is one of the most widely used human facial expression databases. Almost a decade after the original validation study (Goeleven et al., 2008), we present subjective rating norms for a sub-set of 210 pictures which depict 70 models (half female), each displaying angry, happy, and neutral facial expressions. Our main goals were to provide an additional and updated validation of this database, using a sample from a different nationality ( N = 155 Portuguese students, M = 23.73 years old, SD = 7.24), and to extend the number of subjective dimensions used to evaluate each image. Specifically, participants reported emotional labeling (forced-choice task) and evaluated the emotional intensity and valence of the expression, as well as the attractiveness and familiarity of the model (7-point rating scales). Overall, results show that happy faces obtained the highest ratings across evaluative dimensions and emotion labeling accuracy. Female (vs. male) models were perceived as more attractive, familiar and positive. The sex of the model also moderated the accuracy of emotional labeling and ratings of different facial expressions. Each picture of the set was categorized as low, moderate, or high for each dimension. Normative data for each stimulus (hits proportion, means, standard deviations, and confidence intervals per evaluative dimension) is available as supplementary material (available at https://osf.io/fvc4m/).
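
    Per-stimulus norms of the kind reported here (mean, standard deviation, and confidence interval per evaluative dimension) reduce to a routine computation. A sketch using a normal-approximation 95% CI; the ratings below are invented, not taken from the KDEF-PT data.

```python
import math

def stimulus_norms(ratings, z=1.96):
    """Mean, sample SD, and normal-approximation 95% CI for
    one stimulus on one 7-point rating dimension."""
    n = len(ratings)
    mean = sum(ratings) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in ratings) / (n - 1))
    half_width = z * sd / math.sqrt(n)
    return mean, sd, (mean - half_width, mean + half_width)

# Invented valence ratings for a single face picture.
print(stimulus_norms([4, 5, 6, 5, 5]))
```
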

  17. Variations in the serotonin-transporter gene are associated with attention bias patterns to positive and negative emotion faces.

    Science.gov (United States)

    Pérez-Edgar, Koraly; Bar-Haim, Yair; McDermott, Jennifer Martin; Gorodetsky, Elena; Hodgkinson, Colin A; Goldman, David; Ernst, Monique; Pine, Daniel S; Fox, Nathan A

    2010-03-01

    Both attention biases to threat and a serotonin-transporter gene polymorphism (5-HTTLPR) have been linked to heightened neural activation to threat and the emergence of anxiety. The short allele of 5-HTTLPR may act via its effect on neurotransmitter availability, while attention biases shape broad patterns of cognitive processing. We examined individual differences in attention bias to emotion faces as a function of 5-HTTLPR genotype. Adolescents (N=117) were classified for presumed SLC6A4 expression based on 5-HTTLPR genotype: low (SS, SL(G), or L(G)L(G)), intermediate (SL(A) or L(A)L(G)), or high (L(A)L(A)). Participants completed the dot-probe task, measuring attention biases toward or away from angry and happy faces. Biases for angry faces increased with the genotype-predicted neurotransmission levels (low>intermediate>high). The reverse pattern was evident for happy faces. The data indicate a linear relation between 5-HTTLPR allelic status and attention biases to emotion, demonstrating a genetic mechanism for biased attention using ecologically valid stimuli that target socioemotional adaptation. Copyright 2009 Elsevier B.V. All rights reserved.
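
    The dot-probe bias score underlying these results is conventionally the mean reaction time on incongruent trials (probe replaces the neutral face) minus the mean on congruent trials (probe replaces the emotion face): positive scores indicate attention toward the emotional face, negative scores avoidance. A minimal sketch with invented RTs:

```python
def attention_bias(rt_incongruent, rt_congruent):
    """Dot-probe attention bias in ms: mean incongruent RT minus
    mean congruent RT; positive = vigilance, negative = avoidance."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_incongruent) - mean(rt_congruent)

# Invented RTs (ms): slower when the probe replaces the neutral face.
print(attention_bias([520, 540, 530], [495, 505, 500]))  # 30.0 ms bias
```
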

  18. When does subliminal affective image priming influence the ability of schizophrenic patients to perceive face emotions?

    Science.gov (United States)

    Vaina, Lucia Maria; Rana, Kunjan D; Cotos, Ionela; Li-Yang, Chen; Huang, Melissa A; Podea, Delia

    2014-12-24

    Deficits in face emotion perception are among the most pervasive aspects of schizophrenia impairments and strongly affect interpersonal communication and social skills. Schizophrenic patients (PSZ) and healthy control subjects (HCS) performed two psychophysical tasks. One, the SAFFIMAP test, was designed to determine the impact of subliminally presented affective or neutral images on the accuracy of face-expression (angry or neutral) perception. In the second test, FEP, subjects saw pictures of facial expressions and were asked to rate them as angry, happy, or neutral. The following clinical scales were used to determine the acute symptoms in PSZ: Positive and Negative Syndrome Scale (PANSS), Young Mania Rating Scale (YMRS), Hamilton Depression Rating Scale (HAM-D), and Hamilton Anxiety Rating Scale (HAM-A). On the SAFFIMAP test, different from the HCS group, the PSZ group tended to categorize the neutral expression of test faces as angry, and their response to the test-face expression was not influenced by the affective content of the primes. In PSZ, the PANSS-positive score was significantly correlated with correct perception of angry faces for aggressive or pleasant primes. YMRS scores were strongly correlated with PSZ's tendency to recognize angry face expressions when the prime was a pleasant or a neutral image. The HAM-D score was positively correlated with categorizing the test faces as neutral, regardless of the affective content of the prime or of the test-face expression (angry or neutral). Despite its exploratory nature, this study provides the first evidence that conscious perception and categorization of facial emotions (neutral or angry) in PSZ is directly affected by their positive or negative symptoms of the disease as defined by their individual scores on the clinical diagnostic scales.

  19. Who cares? Offering emotion work as a 'gift' in the nursing labour process.

    Science.gov (United States)

    Bolton, S C

    2000-09-01

    Who cares? Offering emotion work as a 'gift' in the nursing labour process The emotional elements of the nursing labour process are being recognized increasingly. Many commentators stress that nurses' 'emotional labour' is hard and productive work and should be valued in the same way as physical or technical labour. However, the term 'emotional labour' fails to conceptualize the many occasions when nurses not only work hard on their emotions in order to present the detached face of a professional carer, but also to offer authentic caring behaviour to patients in their care. Using qualitative data collected from a group of gynaecology nurses in an English National Health Service (NHS) Trust hospital, this paper argues that nursing work is emotionally complex and may be better understood by utilizing a combination of Hochschild's concepts: emotion work as a 'gift' in addition to 'emotional labour'. The gynaecology nurses in this study describe their work as 'emotionful' and therefore it could be said that this particular group of nurses represent a distinct example. Nevertheless, though it is impossible to generalize from limited data, the research presented in this paper does highlight the emotional complexity of the nursing labour process, expands the current conceptual analysis, and offers a path for future research. The examination further emphasizes the need to understand and value the motivations behind nurses' emotion work and their wish to maintain caring as a central value in professional nursing.

  20. Cocaine users manifest impaired prosodic and cross-modal emotion processing

    Directory of Open Access Journals (Sweden)

    Lea M Hulka

    2013-09-01

    Full Text Available Background: A small number of previous studies have provided evidence that cocaine users exhibit impairments in complex social cognition tasks, while the more basic facial emotion recognition is widely unaffected. However, prosodic and cross-modal emotion processing has not been systematically investigated in cocaine users so far. Therefore, the aim of the present study was to assess complex multisensory emotion processing in cocaine users in comparison to controls and to examine a potential association with drug use patterns. Method: The abbreviated version of the Comprehensive Affect Testing System (CATS-A) was used to measure emotion perception across the three channels of facial affect, prosody, and semantic content in 58 cocaine users and 48 healthy control subjects who were matched for age, sex, verbal intelligence, and years of education. Results: Cocaine users had significantly lower scores than controls on the quotient scales of Emotion Recognition and Prosody Recognition and on the subtests Conflicting Prosody/Meaning – Attend to Prosody and Match Emotional Prosody to Emotional Face, which require either attending to prosody or integrating cross-modal information. In contrast, no group difference emerged for the Affect Recognition Quotient. Cumulative cocaine doses and duration of cocaine use correlated negatively with emotion processing. Conclusion: Cocaine users show impaired cross-modal integration of different emotion processing channels, particularly with regard to prosody, whereas more basic aspects of emotion processing, such as facial affect perception, are comparable to the performance of healthy controls.

  1. Human face processing is tuned to sexual age preferences

    DEFF Research Database (Denmark)

    Ponseti, J; Granert, O; van Eimeren, T

    2014-01-01

    Human faces can motivate nurturing behaviour or sexual behaviour when adults see a child or an adult face, respectively. This suggests that face processing is tuned to detecting age cues of sexual maturity to stimulate the appropriate reproductive behaviour: either caretaking or mating. In paedophilia, sexual attraction is directed to sexually immature children. Therefore, we hypothesized that brain networks that normally are tuned to mature faces of the preferred gender show an abnormal tuning to sexually immature faces in paedophilia. Here, we use functional magnetic resonance imaging (fMRI) to test directly for the existence of a network which is tuned to face cues of sexual maturity. During fMRI, participants sexually attracted to either adults or children were exposed to various face images. In individuals attracted to adults, adult faces activated several brain regions significantly more...

  2. Horizontal information drives the behavioural signatures of face processing

    Directory of Open Access Journals (Sweden)

    Valerie Goffaux

    2010-09-01

    Full Text Available Recent psychophysical evidence indicates that the vertical arrangement of horizontal information is particularly important for encoding facial identity. In this paper we extend this notion to examine the role that information at different (particularly cardinal) orientations might play in a number of established phenomena, each a behavioural “signature” of face processing. In particular we consider (a) the face inversion effect (FIE), (b) the facial identity after-effect, (c) face-matching across viewpoint, and (d) interactive, so-called holistic, processing of face parts. We report that filtering faces to remove all but the horizontal information largely preserves these effects but, conversely, retaining only vertical information generally diminishes or abolishes them. We conclude that preferential processing of horizontal information is a central feature of human face processing that supports many of the behavioural signatures of this critical visual operation.

  3. Neural Correlates of Task-Irrelevant First and Second Language Emotion Words — Evidence from the Face-Word Stroop Task

    Directory of Open Access Journals (Sweden)

    Lin Fan

    2016-11-01

    Full Text Available Emotionally valenced words have thus far not been empirically examined in a bilingual population with the emotional face-word Stroop paradigm. Chinese-English bilinguals were asked to identify the facial expressions of emotion with their first (L1) or second (L2) language task-irrelevant emotion words superimposed on the face pictures. We attempted to examine how the emotional content of words modulates behavioral performance and cerebral functioning in the bilinguals’ two languages. The results indicated that there were significant congruency effects for both L1 and L2 emotion words, and that identifiable differences in the magnitude of Stroop effect between the two languages were also observed, suggesting L1 is more capable of activating the emotional response to word stimuli. For event-related potential (ERP) data, an N350-550 effect was observed only in the L1 task, with greater negativity for incongruent than congruent trials. The size of the N350-550 effect differed across languages, whereas no identifiable language distinction was observed in the effect of the conflict slow potential (conflict SP). Finally, more pronounced negative amplitude at 230-330 ms was observed in L1 than in L2, but only for incongruent trials. This negativity, likened to an orthographic decoding N250, may reflect the extent of attention to emotion word processing at the word-form level, while the N350-550 reflects a complicated set of processes in conflict processing. Overall, the face-word congruency effect reflected an identifiable language distinction at 230-330 and 350-550 ms, which provides supporting evidence for theoretical proposals assuming attenuated emotionality of L2 processing.

  4. Developmental changes in analytic and holistic processes in face perception

    Directory of Open Access Journals (Sweden)

    Jane Elizabeth Joseph

    2015-08-01

    Full Text Available Although infants demonstrate sensitivity to some kinds of perceptual information in faces, many face capacities continue to develop throughout childhood. One debate is the degree to which children perceive faces analytically versus holistically and how these processes undergo developmental change. In the present study, school-aged children and adults performed a perceptual matching task with upright and inverted face and house pairs that varied in similarity of featural or 2nd order configural information. Holistic processing was operationalized as the degree of serial processing when discriminating faces and houses (i.e., increased reaction time, RT, as more features or spacing relations were shared between stimuli). Analytical processing was operationalized as the degree of parallel processing (or no change in reaction time as a function of greater similarity of features or spatial relations). Adults showed the most evidence for holistic processing (most strongly for 2nd order faces) and holistic processing was weaker for inverted faces and houses. Younger children (6-8 years), in contrast, showed analytical processing across all experimental manipulations. Older children (9-11 years) showed an intermediate pattern with a trend toward holistic processing of 2nd order faces like adults, but parallel processing in other experimental conditions like younger children. These findings indicate that holistic face representations emerge around 10 years of age. In adults both 2nd order and featural information are incorporated into holistic representations, whereas older children only incorporate 2nd order information. Holistic processing was not evident in younger children. Hence, the development of holistic face representations relies on 2nd order processing initially then incorporates featural information by adulthood.
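
The serial-versus-parallel operationalization above amounts to estimating the slope of RT against the number of shared features or spacing relations: a positive slope indicates serial (holistic) discrimination, a near-zero slope parallel (analytic) discrimination. A sketch using an ordinary least-squares slope over hypothetical data:

```python
def rt_slope(shared_features, rts):
    """OLS slope of RT (ms) against number of shared features/spacing relations.
    A clearly positive slope suggests serial (holistic) processing; a slope
    near zero suggests parallel (analytic) processing."""
    n = len(rts)
    mx = sum(shared_features) / n
    my = sum(rts) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(shared_features, rts))
    var = sum((x - mx) ** 2 for x in shared_features)
    return cov / var

# Hypothetical means: adult RTs rise with similarity (serial/holistic),
# younger children's RTs stay flat (parallel/analytic).
adult = rt_slope([0, 1, 2, 3], [700, 760, 815, 880])
child = rt_slope([0, 1, 2, 3], [820, 815, 825, 818])
print(adult, child)
```

The developmental comparison in the abstract is then a comparison of these slopes across age groups and conditions.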

  5. Developmental changes in analytic and holistic processes in face perception

    Science.gov (United States)

    Joseph, Jane E.; DiBartolo, Michelle D.; Bhatt, Ramesh S.

    2015-01-01

    Although infants demonstrate sensitivity to some kinds of perceptual information in faces, many face capacities continue to develop throughout childhood. One debate is the degree to which children perceive faces analytically versus holistically and how these processes undergo developmental change. In the present study, school-aged children and adults performed a perceptual matching task with upright and inverted face and house pairs that varied in similarity of featural or 2nd order configural information. Holistic processing was operationalized as the degree of serial processing when discriminating faces and houses [i.e., increased reaction time (RT), as more features or spacing relations were shared between stimuli]. Analytical processing was operationalized as the degree of parallel processing (or no change in RT as a function of greater similarity of features or spatial relations). Adults showed the most evidence for holistic processing (most strongly for 2nd order faces) and holistic processing was weaker for inverted faces and houses. Younger children (6–8 years), in contrast, showed analytical processing across all experimental manipulations. Older children (9–11 years) showed an intermediate pattern with a trend toward holistic processing of 2nd order faces like adults, but parallel processing in other experimental conditions like younger children. These findings indicate that holistic face representations emerge around 10 years of age. In adults both 2nd order and featural information are incorporated into holistic representations, whereas older children only incorporate 2nd order information. Holistic processing was not evident in younger children. Hence, the development of holistic face representations relies on 2nd order processing initially then incorporates featural information by adulthood. PMID:26300838

  7. Differential effects of face-realism and emotion on event-related brain potentials and their implications for the uncanny valley theory

    Science.gov (United States)

    Schindler, Sebastian; Zell, Eduard; Botsch, Mario; Kissler, Johanna

    2017-03-01

    Cartoon characters are omnipresent in popular media. While few studies have scientifically investigated their processing, in computer graphics, efforts are made to increase realism. Yet, close approximations of reality have been suggested to sometimes evoke a feeling of eeriness, the “uncanny valley” effect. Here, we used high-density electroencephalography to investigate brain responses to professionally stylized happy, angry, and neutral character faces. We employed six face-stylization levels varying from abstract to realistic and investigated the N170, early posterior negativity (EPN), and late positive potential (LPP) event-related components. The face-specific N170 showed a U-shaped modulation, with stronger reactions towards both the most abstract and the most realistic compared to medium-stylized faces. For abstract faces, the N170 was generated more occipitally than for real faces, implying stronger reliance on structural processing. Although emotional faces elicited the highest amplitudes on both N170 and EPN, realism and expression interacted on the N170. Finally, the LPP increased linearly with face realism, reflecting an activity increase in visual and parietal cortex for more realistic faces. Results reveal differential effects of face stylization on distinct face-processing stages and suggest a perceptual basis to the uncanny valley hypothesis. They are discussed in relation to face perception, media design, and computer graphics.
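
The linear LPP increase and U-shaped N170 modulation across the six stylization levels can be quantified with orthogonal polynomial contrasts (standard linear and quadratic weights for six levels). A sketch over hypothetical mean amplitudes, not the study's values:

```python
# Standard orthogonal polynomial contrast weights for 6 ordered levels.
LINEAR = [-5, -3, -1, 1, 3, 5]
QUADRATIC = [5, -1, -4, -4, -1, 5]

def contrast(means, weights):
    """Weighted sum of condition means; large magnitude -> strong trend."""
    return sum(m * w for m, w in zip(means, weights))

lpp = [2.0, 2.4, 2.9, 3.3, 3.8, 4.1]         # hypothetical LPP, rising with realism
n170 = [-6.0, -5.2, -4.5, -4.4, -5.1, -5.9]  # hypothetical N170, U-shaped (extremes more negative)

print(contrast(lpp, LINEAR))      # large positive value -> linear increase
print(contrast(n170, QUADRATIC))  # negative value here reflects the U shape in a negative component
```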

  9. Preattentive processing of audio-visual emotional signals

    DEFF Research Database (Denmark)

    Föcker, J.; Gondan, Matthias; Röder, B.

    2011-01-01

    Previous research has shown that redundant information in faces and voices leads to faster emotional categorization compared to incongruent emotional information even when attending to only one modality. The aim of the present study was to test whether these crossmodal effects are predominantly d...

  10. Distinct Temporal Processing of Task-Irrelevant Emotional Facial Expressions

    NARCIS (Netherlands)

    de Jong, Peter J.; Koster, Ernst H. W.; Wessel, Ineke; Martens, Sander

    There is an ongoing debate concerning the extent to which emotional faces automatically attract attention. Using a single-target Rapid Serial Visual Presentation (RSVP) methodology, it has been found that presentation of task-irrelevant positive or negative emotionally salient stimuli (e. g.,

  11. ‘Distracters’ do not always distract: Visual working memory for angry faces is enhanced by incidental emotional words.

    Directory of Open Access Journals (Sweden)

    Margaret Cecilia Jackson

    2012-10-01

    Full Text Available We are often required to filter out distraction in order to focus on a primary task during which working memory (WM) is engaged. Previous research has shown that negative versus neutral distracters presented during a visual WM maintenance period significantly impair memory for neutral information. However, the contents of WM are often also emotional in nature. The question we address here is how incidental information might impact upon visual WM when both this and the memory items contain emotional information. We presented emotional versus neutral words during the maintenance interval of an emotional visual WM faces task. Participants encoded two angry or happy faces into WM, and several seconds into a 9-second maintenance period a negative, positive, or neutral word was flashed on the screen three times. A single neutral test face was presented for retrieval with a face identity that was either present or absent in the preceding study array. WM for angry face identities was significantly better when an emotional (negative or positive) versus neutral (or no) word was presented. In contrast, WM for happy face identities was not significantly affected by word valence. These findings suggest that the presence of emotion within an intervening stimulus boosts the emotional value of threat-related information maintained in visual WM and thus improves performance. In addition, we show that incidental events that are emotional in nature do not always distract from an ongoing WM task.

  12. Does a single session of electroconvulsive therapy alter the neural response to emotional faces in depression? A randomised sham-controlled functional magnetic resonance imaging study

    DEFF Research Database (Denmark)

    Miskowiak, Kamilla W; Kessing, Lars V; Ott, Caroline V

    2017-01-01

    neurocognitive bias in major depressive disorder. Patients with major depressive disorder were randomised to one active (n=15) or sham electroconvulsive therapy (n=12). The following day they underwent whole-brain functional magnetic resonance imaging at 3T while viewing emotional faces and performed facial...... expression recognition and dot-probe tasks. A single electroconvulsive therapy session had no effect on amygdala response to emotional faces. Whole-brain analysis revealed no effects of electroconvulsive therapy versus sham therapy after family-wise error correction at the cluster level, using a cluster...... to faces after a single electroconvulsive therapy session, the observed trend changes after a single electroconvulsive therapy session point to an early shift in emotional processing that may contribute to antidepressant effects of electroconvulsive therapy....

  13. Initial Orientation of Attention towards Emotional Faces in Children with Attention Deficit Hyperactivity Disorder

    Directory of Open Access Journals (Sweden)

    Mehrnoosh Ahmadi

    2011-09-01

    Full Text Available Objective: Early recognition of negative emotions is considered to be of vital importance. It seems that children with attention deficit hyperactivity disorder have some difficulties recognizing facial emotional expressions, especially negative ones. This study investigated the preference of children with attention deficit hyperactivity disorder for negative (angry, sad) facial expressions compared to normal children. Method: Participants were 35 drug-naive boys with ADHD, aged between 6 and 11 years, and 31 matched healthy children. Visual orientation data were recorded while participants viewed face pairs (negative-neutral pairs) shown for 3000 ms. The number of first fixations made to each expression was considered as an index of initial orientation. Results: Group comparisons revealed no difference between the attention deficit hyperactivity disorder group and their matched healthy counterparts in initial orientation of attention. A tendency towards negative emotions was found within the normal group, while no difference was observed between initial allocation of attention toward negative and neutral expressions in children with ADHD. Conclusion: Children with attention deficit hyperactivity disorder do not have a significant preference for negative facial expressions. In contrast, normal children have a significant preference for negative facial emotions rather than neutral faces.
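
The initial-orientation index described here (first fixations in negative-neutral face pairs) reduces to the proportion of trials whose first fixation lands on the negative face, which can then be compared against the 0.5 chance level. A sketch with hypothetical trial labels:

```python
def first_fixation_bias(first_fixations):
    """Proportion of trials whose first fixation landed on the negative face.
    `first_fixations` holds one 'negative' / 'neutral' label per trial;
    values above 0.5 indicate initial orienting toward negative expressions."""
    hits = sum(1 for f in first_fixations if f == "negative")
    return hits / len(first_fixations)

# Hypothetical trial-by-trial records for one participant.
trials = ["negative", "negative", "neutral", "negative", "neutral",
          "negative", "negative", "neutral", "negative", "negative"]
print(first_fixation_bias(trials))
```

Comparing this proportion between groups (and against 0.5 within each group) mirrors the analysis the abstract reports.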

  14. Oxytocin effects on emotional response to others' faces via serotonin system in autism: A pilot study.

    Science.gov (United States)

    Fukai, Mina; Hirosawa, Tetsu; Kikuchi, Mitsuru; Ouchi, Yasuomi; Takahashi, Tetsuya; Yoshimura, Yuko; Miyagishi, Yoshiaki; Kosaka, Hirotaka; Yokokura, Masamichi; Yoshikawa, Etsuji; Bunai, Tomoyasu; Minabe, Yoshio

    2017-09-30

    The oxytocin (OT)-related serotonergic system is thought to play an important role in the etiology and social symptoms of autism spectrum disorder (ASD). However, no evidence exists for the relation between the prosocial effect of chronic OT administration and the brain serotonergic system. Ten male subjects with ASD were administered OT for 8-10 weeks in an open-label, single-arm, non-randomized, uncontrolled manner. Before and during the OT treatment, positron emission tomography was used with the 11C-3-amino-4-(2-[(dimethylamino)methyl]phenylthio)benzonitrile (11C-DASB) radiotracer. Then binding of the serotonin transporter (11C-DASB BPND) was estimated. The main outcome measures were changes in 11C-DASB BPND and changes in the emotional response to others' faces. No significant change was found in the emotional response to others' faces after the 8-10 week OT treatment. However, the increased serotonin transporter (SERT) level in the striatum after treatment was correlated significantly with increased negative emotional response to human faces. This study revealed a relation between changes in the serotonergic system and in prosociality after chronic OT administration. Additional studies must be conducted to verify the chronic OT effects on social behavior via the serotonergic system. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  15. An fMRI study of facial emotion processing in patients with schizophrenia.

    Science.gov (United States)

    Gur, Raquel E; McGrath, Claire; Chan, Robin M; Schroeder, Lee; Turner, Travis; Turetsky, Bruce I; Kohler, Christian; Alsop, David; Maldjian, Joseph; Ragland, J Daniel; Gur, Ruben C

    2002-12-01

    Emotion processing deficits are notable in schizophrenia. The authors evaluated cerebral blood flow response in schizophrenia patients during facial emotion processing to test the hypothesis of diminished limbic activation related to emotional relevance of facial stimuli. Fourteen patients with schizophrenia and 14 matched comparison subjects viewed facial displays of happiness, sadness, anger, fear, and disgust as well as neutral faces. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as the subjects alternated between tasks of discriminating emotional valence (positive versus negative) and age (over 30 versus under 30) of the faces with an interleaved crosshair reference condition. The groups did not differ in performance on either task. For both tasks, healthy participants showed activation in the fusiform gyrus, occipital lobe, and inferior frontal cortex relative to the resting baseline condition. The increase was greater in the amygdala and hippocampus during the emotional valence discrimination task than during the age discrimination task. In the patients with schizophrenia, minimal focal response was observed for all tasks relative to the resting baseline condition. Contrasting patients and comparison subjects on the emotional valence discrimination task revealed voxels in the left amygdala and bilateral hippocampus in which the comparison subjects had significantly greater activation. Failure to activate limbic regions during emotional valence discrimination may explain emotion processing deficits in patients with schizophrenia. While the lack of limbic recruitment did not significantly impair simple valence discrimination performance in this clinically stable group, it may impact performance of more demanding tasks.

  16. Developing Cultural Differences in Face Processing

    Science.gov (United States)

    Kelly, David J.; Liu, Shaoying; Rodger, Helen; Miellet, Sebastien; Ge, Liezhong; Caldara, Roberto

    2011-01-01

    Perception and eye movements are affected by culture. Adults from Eastern societies (e.g. China) display a disposition to process information "holistically," whereas individuals from Western societies (e.g. Britain) process information "analytically." Recently, this pattern of cultural differences has been extended to face…

  17. Attention and emotion : An ERP analysis of facilitated emotional stimulus processing

    OpenAIRE

    Schupp, Harald Thomas; Junghöfer, Markus; Weike, Almut I.; Hamm, Alfons

    2003-01-01

    Recent event-related potential studies observed an early posterior negativity (EPN) reflecting facilitated processing of emotional images. The present study explored if the facilitated processing of emotional pictures is sustained while subjects perform an explicit non-emotional attention task. EEG was recorded from 129 channels while subjects viewed a rapid continuous stream of images containing emotional pictures as well as task-related checkerboard images. As expected, explicit selective a...

  18. Holistic Processing in the Composite Task Depends on Face Size.

    Science.gov (United States)

    Ross, David A; Gauthier, Isabel

    Holistic processing is a hallmark of face processing. There is evidence that holistic processing is strongest for faces at identification distance, 2-10 meters from the observer. However, this evidence is based on tasks that have been little used in the literature and that are indirect measures of holistic processing. We use the composite task - a well-validated and frequently used paradigm - to measure the effect of viewing distance on holistic processing. In line with previous work, we find a congruency x alignment effect that is stronger for faces that are close (2 m equivalent distance) than for faces that are further away (24 m equivalent distance). In contrast, the alignment effect for same trials, used by several authors to measure holistic processing, produced results that are difficult to interpret. We conclude that our results converge with previous findings, providing more direct evidence for an effect of size on holistic processing.
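
The congruency x alignment effect used here as the holistic-processing index is an interaction contrast: the congruency effect for aligned composites minus the congruency effect for misaligned composites. A sketch over hypothetical condition means (e.g., d-prime scores; the values are illustrative, not the study's):

```python
def holistic_index(scores):
    """Congruency x alignment interaction from condition means keyed by
    (congruency, alignment) pairs. Larger values -> stronger holistic processing."""
    aligned = scores[("congruent", "aligned")] - scores[("incongruent", "aligned")]
    misaligned = scores[("congruent", "misaligned")] - scores[("incongruent", "misaligned")]
    return aligned - misaligned

near = holistic_index({("congruent", "aligned"): 2.1, ("incongruent", "aligned"): 1.2,
                       ("congruent", "misaligned"): 1.8, ("incongruent", "misaligned"): 1.6})
far = holistic_index({("congruent", "aligned"): 1.9, ("incongruent", "aligned"): 1.6,
                      ("congruent", "misaligned"): 1.8, ("incongruent", "misaligned"): 1.7})
print(near > far)  # a stronger interaction at the close viewing distance
```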

  19. Facial Emotion and Identity Processing Development in 5- to 15-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Patrick eJohnston

    2011-02-01

    Full Text Available Most developmental studies of emotional face processing to date have focussed on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. Three matching tasks were developed (i.e., facial emotion matching, facial identity matching, and butterfly wing matching) to include stimuli of a similar level of discriminability and to be equated for task difficulty in earlier samples of young adults. Ninety-two children aged 5 to 15 years and a new group of 24 young adults completed these three matching tasks. Young children were highly adept at the butterfly wing task relative to their performance on both face-related tasks. More importantly, in older children, development of facial emotion discrimination ability lagged behind that of facial identity discrimination.

  20. Unconscious processing of facial attractiveness: invisible attractive faces orient visual attention.

    Science.gov (United States)

    Hung, Shao-Min; Nieh, Chih-Hsuan; Hsieh, Po-Jang

    2016-11-16

    Past research has proven humans' extraordinary ability to extract information from a face in the blink of an eye, including its emotion, gaze direction, and attractiveness. However, it remains elusive whether facial attractiveness can be processed, and can influence our behavior, in the complete absence of conscious awareness. Here we demonstrate unconscious processing of facial attractiveness with three distinct approaches. In Experiment 1, the time taken for faces to break interocular suppression was measured. The results showed that attractive faces enjoyed the privilege of breaking suppression and reaching consciousness earlier. In Experiment 2, we further showed that attractive faces had lower visibility thresholds, again suggesting that facial attractiveness could be processed more easily to reach consciousness. Crucially, in Experiment 3, a significant decrease of accuracy on an orientation discrimination task subsequent to an invisible attractive face showed that attractive faces, albeit suppressed and invisible, still exerted an effect by orienting attention. Taken together, for the first time, we show that facial attractiveness can be processed in the complete absence of consciousness, and an unconscious attractive face is still capable of directing our attention.
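
In a breaking-suppression design like Experiment 1, the dependent measure is simply the time each face takes to break interocular suppression; the attractiveness advantage is the difference in mean breakthrough times. A sketch with hypothetical times (illustrative only):

```python
def mean_breakthrough(times):
    """Mean time (s) for stimuli to break interocular suppression."""
    return sum(times) / len(times)

# Hypothetical per-trial suppression times for two face categories.
attractive = [1.8, 2.1, 1.9, 2.0]
unattractive = [2.6, 2.4, 2.7, 2.5]
advantage = mean_breakthrough(unattractive) - mean_breakthrough(attractive)
print(advantage)  # positive -> attractive faces reached awareness earlier
```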

  1. Event-related brain responses to emotional words, pictures, and faces - a cross-domain comparison.

    Science.gov (United States)

    Bayer, Mareike; Schacht, Annekathrin

    2014-01-01

    Emotion effects in event-related brain potentials (ERPs) have previously been reported for a range of visual stimuli, including emotional words, pictures, and facial expressions. Still, little is known about the actual comparability of emotion effects across these stimulus classes. The present study aimed to fill this gap by investigating emotion effects in response to words, pictures, and facial expressions using a blocked within-subject design. Furthermore, ratings of stimulus arousal and valence were collected from an independent sample of participants. Modulations of early posterior negativity (EPN) and late positive complex (LPC) were visible for all stimulus domains, but showed clear differences, particularly in valence processing. While emotion effects were limited to positive stimuli for words, they were predominant for negative stimuli in pictures and facial expressions. These findings corroborate the notion of a positivity offset for words and a negativity bias for pictures and facial expressions, which was assumed to be caused by generally lower arousal levels of written language. Interestingly, however, these assumed differences were not confirmed by arousal ratings. Instead, words were rated as overall more positive than pictures and facial expressions. Taken together, the present results point toward systematic differences in the processing of written words and pictorial stimuli of emotional content, not only in terms of a valence bias evident in ERPs, but also concerning their emotional evaluation captured by ratings of stimulus valence and arousal.

  2. Breaking the Emotional Barrier through the Bibliotherapeutic Process.

    Science.gov (United States)

    Ouzts, Dan T.

    1984-01-01

    Reviews literature concerning bibliotherapy and concludes that it can be of value to a child's overall emotional development and may help in breaking emotional barriers to learning. Discusses the role of the reading teacher in the bibliotherapeutic process. (FL)

  3. Intranasal Oxytocin Administration Dampens Amygdala Reactivity towards Emotional Faces in Male and Female PTSD Patients.

    Science.gov (United States)

    Koch, Saskia Bj; van Zuiden, Mirjam; Nawijn, Laura; Frijling, Jessie L; Veltman, Dick J; Olff, Miranda

    2016-05-01

    Post-traumatic stress disorder (PTSD) is a disabling psychiatric disorder. As a substantial part of PTSD patients responds poorly to currently available psychotherapies, pharmacological interventions boosting treatment response are needed. Because of its anxiolytic and pro-social properties, the neuropeptide oxytocin (OT) has been proposed as promising strategy for treatment augmentation in PTSD. As a first step to investigate the therapeutic potential of OT in PTSD, we conducted a double-blind, placebo-controlled, cross-over functional MRI study examining OT administration effects (40 IU) on amygdala reactivity toward emotional faces in unmedicated male and female police officers with (n=37, 21 males) and without (n=40, 20 males) PTSD. Trauma-exposed controls were matched to PTSD patients based on age, sex, years of service and educational level. Under placebo, the expected valence-dependent amygdala reactivity (ie, greater activity toward fearful-angry faces compared with happy-neutral faces) was absent in PTSD patients. OT administration dampened amygdala reactivity toward all emotional faces in male and female PTSD patients, but enhanced amygdala reactivity in healthy male and female trauma-exposed controls, independent of sex and stimulus valence. In PTSD patients, greater anxiety prior to scanning and amygdala reactivity during the placebo session were associated with greater reduction of amygdala reactivity after OT administration. Taken together, our results indicate presumably beneficial neurobiological effects of OT administration in male and female PTSD patients. Future studies should investigate OT administration in clinical settings to fully appreciate its therapeutic potential.

  4. Romanian Insurance Market Facing Globalization Process

    Directory of Open Access Journals (Sweden)

    Dumitru G. Badea

    2008-09-01

    Full Text Available The Romanian insurance market has grown continuously, passing the threshold of 1 billion Euros in 2004, despite the population's still-limited awareness of and confidence in insurance, even after 15 years. The globalization of financial markets affected the Romanian market even before Romania became a member of the European Union. Globalization brought benefits (especially an increase in the quality of the services provided to clients) but also disadvantages for local companies (significant logistics and training costs needed to keep pace with international groups).

  5. Judgment under emotional certainty and uncertainty: the effects of specific emotions on information processing.

    Science.gov (United States)

    Tiedens, L Z; Linton, S

    2001-12-01

    The authors argued that emotions characterized by certainty appraisals promote heuristic processing, whereas emotions characterized by uncertainty appraisals result in systematic processing. The 1st experiment demonstrated that the certainty associated with an emotion affects the certainty experienced in subsequent situations. The next 3 experiments investigated effects on processing of emotions associated with certainty and uncertainty. Compared with emotions associated with uncertainty, emotions associated with certainty resulted in greater reliance on the expertise of a source of a persuasive message in Experiment 2, more stereotyping in Experiment 3, and less attention to argument quality in Experiment 4. In contrast to previous theories linking valence and processing, these findings suggest that the certainty appraisal content of emotions is also important in determining whether people engage in systematic or heuristic processing.

  6. Cholinergic enhancement modulates neural correlates of selective attention and emotional processing.

    Science.gov (United States)

    Bentley, Paul; Vuilleumier, Patrik; Thiel, Christiane M; Driver, Jon; Dolan, Raymond J

    2003-09-01

    Neocortical cholinergic afferents are proposed to influence both selective attention and emotional processing. In a study of healthy adults we used event-related fMRI while orthogonally manipulating attention and emotionality to examine regions showing effects of cholinergic modulation by the anticholinesterase physostigmine. Either face or house pictures appeared at task-relevant locations, with the alternative picture type at irrelevant locations. Faces had either neutral or fearful expressions. Physostigmine increased relative activity within the anterior fusiform gyrus for faces at attended, versus unattended, locations, but decreased relative activity within the posterolateral occipital cortex for houses in attended, versus unattended, locations. A similar pattern of regional differences in the effect of physostigmine on cue-evoked responses was also present in the absence of stimuli. Cholinergic enhancement augmented the relative neuronal response within the middle fusiform gyrus to fearful faces, whether at attended or unattended locations. By contrast, physostigmine influenced responses in the orbitofrontal, intraparietal and cingulate cortices to fearful faces when faces occupied task-irrelevant locations. These findings suggest that acetylcholine may modulate both selective attention and emotional processes through independent, region-specific effects within the extrastriate cortex. Furthermore, cholinergic inputs to the frontoparietal cortex may influence the allocation of attention to emotional information.

  7. Dealing with feelings: characterization of trait alexithymia on emotion regulation strategies and cognitive-emotional processing.

    Directory of Open Access Journals (Sweden)

    Marte Swart

    Full Text Available BACKGROUND: Alexithymia, or "no words for feelings", is a personality trait which is associated with difficulties in emotion recognition and regulation. It is unknown whether this deficit is due primarily to regulation, perception, or mentalizing of emotions. In order to shed light on the core deficit, we tested our subjects on a wide range of emotional tasks. We expected the high alexithymics to underperform on all tasks. METHOD: Two groups of healthy individuals, high and low scoring on the cognitive component of the Bermond-Vorst Alexithymia Questionnaire, completed questionnaires of emotion regulation and performed several emotion processing tasks including a micro expression recognition task, recognition of emotional prosody and semantics in spoken sentences, an emotional and identity learning task and a conflicting beliefs and emotions task (emotional mentalizing). RESULTS: The two groups differed on the Emotion Regulation Questionnaire, Berkeley Expressivity Questionnaire and Empathy Quotient. Specifically, the Emotion Regulation Quotient showed that alexithymic individuals used more suppressive and less reappraisal strategies. On the behavioral tasks, as expected, alexithymics performed worse on recognition of micro expressions and emotional mentalizing. Surprisingly, groups did not differ on tasks of emotional semantics and prosody and associative emotional-learning. CONCLUSION: Individuals scoring high on the cognitive component of alexithymia are more prone to suppressive emotion regulation strategies rather than reappraisal strategies. Regarding emotional information processing, alexithymia is associated with reduced performance on measures of early processing as well as higher order mentalizing. However, difficulties in the processing of emotional language were not a core deficit in our alexithymic group.

  8. Visual attention to dynamic faces and objects is linked to face processing skills: A combined study of children with autism and controls

    Directory of Open Access Journals (Sweden)

    Julia Parish-Morris

    2013-04-01

    Full Text Available Although the extant literature on face recognition skills in Autism Spectrum Disorder (ASD) shows clear impairments compared to typically developing controls (TDC) at the group level, the distribution of scores within ASD is broad. In the present research, we take a dimensional approach and explore how differences in social attention during an eye tracking experiment correlate with face recognition skills across ASD and TDC. Emotional discrimination and person identity perception face processing skills were assessed using the Let’s Face It! Skills Battery in 110 children with and without ASD. Social attention was assessed using infrared eye gaze tracking during passive viewing of movies of facial expressions and objects displayed together on a computer screen. Face processing skills were significantly correlated with measures of attention to faces and with social skills as measured by the Social Communication Questionnaire. Consistent with prior research, children with ASD scored significantly lower on face processing skills tests but, unexpectedly, group differences in amount of attention to faces (versus objects) were not found. We discuss possible methodological contributions to this null finding. We also highlight the importance of a dimensional approach for understanding the developmental origins of reduced face perception skills, and emphasize the need for longitudinal research to truly understand how social motivation and social attention influence the development of social perceptual skills.

  9. Attention and emotion: an ERP analysis of facilitated emotional stimulus processing.

    Science.gov (United States)

    Schupp, Harald T; Junghöfer, Markus; Weike, Almut I; Hamm, Alfons O

    2003-06-11

    Recent event-related potential studies observed an early posterior negativity (EPN) reflecting facilitated processing of emotional images. The present study explored if the facilitated processing of emotional pictures is sustained while subjects perform an explicit non-emotional attention task. EEG was recorded from 129 channels while subjects viewed a rapid continuous stream of images containing emotional pictures as well as task-related checkerboard images. As expected, explicit selective attention to target images elicited large P3 waves. Interestingly, emotional stimuli guided stimulus-driven selective encoding as reflected by augmented EPN amplitudes to emotional stimuli, in particular to stimuli of evolutionary significance (erotic contents, mutilations, and threat). These data demonstrate the selective encoding of emotional stimuli while top-down attentional control was directed towards non-emotional target stimuli.

  10. Emotion Processing by ERP Combined with Development and Plasticity

    Science.gov (United States)

    2017-01-01

    Emotions, which are important for survival and social interaction, have been investigated widely and in depth. The application of fMRI to emotion processing has yielded substantial progress in localizing emotion processes. The ERP method, which offers higher temporal resolution than fMRI, can be used to investigate the time course of emotion processing. Emotional modulation of ERP components has been verified across numerous studies. Emotions, which develop dynamically with age, can be enhanced through learning (or training) or impaired by developmental disturbances, effects that rest on the neural plasticity of emotion-relevant nervous systems. Mood disorders, whose typical symptoms include emotional discordance, may be caused by dysfunctional neural plasticity. PMID:28831313

  11. Effect of positive emotion on consolidation of memory for faces: the modulation of facial valence and facial gender.

    Science.gov (United States)

    Wang, Bo

    2013-01-01

    Studies have shown that emotion elicited after learning enhances memory consolidation. However, no prior studies have used facial photos as stimuli. This study examined the effect of post-learning positive emotion on the consolidation of memory for faces. During learning, participants viewed neutral, positive, or negative faces. They were then assigned to a condition in which they watched either a 9-minute positive video clip or a 9-minute neutral video. Thirty minutes after learning, participants took a surprise memory test, in which they made "remember", "know", and "new" judgements. The findings are: (1) positive emotion enhanced consolidation of recognition for negative male faces, but impaired consolidation of recognition for negative female faces; (2) for male faces, recognition of negative faces was equivalent to that of positive faces; for female faces, recognition of negative faces was better than that of positive faces. Our study provides important evidence that the effect of post-learning emotion on memory consolidation extends to facial stimuli and that this effect can be modulated by facial valence and facial gender. The findings may shed light on establishing models concerning the influence of emotion on memory consolidation.

  12. Personality, Attentional Biases towards Emotional Faces and Symptoms of Mental Disorders in an Adolescent Sample.

    Science.gov (United States)

    O'Leary-Barrett, Maeve; Pihl, Robert O; Artiges, Eric; Banaschewski, Tobias; Bokde, Arun L W; Büchel, Christian; Flor, Herta; Frouin, Vincent; Garavan, Hugh; Heinz, Andreas; Ittermann, Bernd; Mann, Karl; Paillère-Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Poustka, Luise; Rietschel, Marcella; Robbins, Trevor W; Smolka, Michael N; Ströhle, Andreas; Schumann, Gunter; Conrod, Patricia J

    2015-01-01

    To investigate the role of personality factors and attentional biases towards emotional faces, in establishing concurrent and prospective risk for mental disorder diagnosis in adolescence. Data were obtained as part of the IMAGEN study, conducted across 8 European sites, with a community sample of 2257 adolescents. At 14 years, participants completed an emotional variant of the dot-probe task, as well as two personality measures, namely the Substance Use Risk Profile Scale and the revised NEO Personality Inventory. At 14 and 16 years, participants and their parents were interviewed to determine symptoms of mental disorders. Personality traits were general and specific risk indicators for mental disorders at 14 years. Increased specificity was obtained when investigating the likelihood of mental disorders over a 2-year period, with the Substance Use Risk Profile Scale showing incremental validity over the NEO Personality Inventory. Attentional biases to emotional faces did not characterise or predict mental disorders examined in the current sample. Personality traits can indicate concurrent and prospective risk for mental disorders in a community youth sample, and identify at-risk youth beyond the impact of baseline symptoms. This study does not support the hypothesis that attentional biases mediate the relationship between personality and psychopathology in a community sample. Task and sample characteristics that contribute to differing results among studies are discussed.

  13. Neural Reactivity to Emotional Faces Mediates the Relationship Between Childhood Empathy and Adolescent Prosocial Behavior

    Science.gov (United States)

    Flournoy, John C.; Pfeifer, Jennifer H.; Moore, William E.; Tackman, Allison; Masten, Carrie L.; Mazziotta, John C.; Iacoboni, Marco; Dapretto, Mirella

    2017-01-01

    Reactivity to others' emotions can result in empathic concern (EC), an important motivator of prosocial behavior, but can also result in personal distress (PD), which may hinder prosocial behavior. Examining neural substrates of emotional reactivity may elucidate how EC and PD differentially influence prosocial behavior. Participants (N=57) provided measures of EC, PD, prosocial behavior, and neural responses to emotional expressions at ages 10 and 13. Initial EC predicted subsequent prosocial behavior. Initial EC and PD predicted subsequent reactivity to emotions in the inferior frontal gyrus (IFG) and inferior parietal lobule, respectively. Activity in the IFG, a region linked to mirror neuron processes, as well as cognitive control and language, mediated the relation between initial EC and subsequent prosocial behavior. PMID:28262939

  14. Scanning patterns of faces do not explain impaired emotion recognition in Huntington Disease: Evidence for a high level mechanism

    Directory of Open Access Journals (Sweden)

    Marieke van Asselen

    2012-02-01

    Full Text Available Previous studies in patients with amygdala lesions suggested that deficits in emotion recognition might be mediated by impaired scanning patterns of faces. Here we investigated whether scanning patterns also contribute to the selective impairment in recognition of disgust in Huntington disease (HD). To achieve this goal, we recorded eye movements during a two-alternative forced choice emotion recognition task. HD patients in presymptomatic (n=16) and symptomatic (n=9) disease stages were tested and their performance was compared to a control group (n=22). In our emotion recognition task, participants had to indicate whether a face reflected one of six basic emotions. In addition, and in order to define whether emotion recognition was altered when the participants were forced to look at a specific component of the face, we used a second task where only limited facial information was provided (eyes/mouth in partially masked faces). Behavioural results showed no differences in the ability to recognize emotions between presymptomatic gene carriers and controls. However, an emotion recognition deficit was found for all 6 basic emotion categories in early stage HD. Analysis of eye movement patterns showed that patients and controls used similar scanning strategies. Patterns of deficits were similar regardless of whether parts of the faces were masked or not, thereby confirming that selective attention to particular face parts is not underlying the deficits. These results suggest that the emotion recognition deficits in symptomatic HD patients cannot be explained by impaired scanning patterns of faces. Furthermore, no selective deficit for recognition of disgust was found in presymptomatic HD patients.

  15. Differentiating emotional processing and attention in psychopathy with functional neuroimaging.

    Science.gov (United States)

    Anderson, Nathaniel E; Steele, Vaughn R; Maurer, J Michael; Rao, Vikram; Koenigs, Michael R; Decety, Jean; Kosson, David S; Calhoun, Vince D; Kiehl, Kent A

    2017-06-01

    Individuals with psychopathy are often characterized by emotional processing deficits, and recent research has examined the specific contexts and cognitive mechanisms that underlie these abnormalities. Some evidence suggests that abnormal features of attention are fundamental to emotional deficits in persons with psychopathy, but few studies have demonstrated the neural underpinnings responsible for such effects. Here, we use functional neuroimaging to examine attention-emotion interactions among incarcerated individuals (n = 120) evaluated for psychopathic traits using the Hare Psychopathy Checklist-Revised (PCL-R). Using a task designed to manipulate attention to emotional features of visual stimuli, we demonstrate effects representing implicit emotional processing, explicit emotional processing, attention-facilitated emotional processing, and vigilance for emotional content. Results confirm the importance of considering mechanisms of attention when evaluating emotional processing differences related to psychopathic traits. The affective-interpersonal features of psychopathy (PCL-R Factor 1) were associated with relatively lower emotion-dependent augmentation of activity in visual processing areas during implicit emotional processing, while antisocial-lifestyle features (PCL-R Factor 2) were associated with elevated activity in the amygdala and related salience network regions. During explicit emotional processing, psychopathic traits were associated with upregulation in the medial prefrontal cortex, insula, and superior frontal regions. Isolating the impact of explicit attention to emotional content, only Factor 1 was related to upregulation of activity in the visual processing stream, which was accompanied by increased activity in the angular gyrus. These effects highlight some important mechanisms underlying abnormal features of attention and emotional processing that accompany psychopathic traits.

  16. Validation of the Vanderbilt Holistic Face Processing Test

    OpenAIRE

    Wang, Chao-Chih; Ross, David A.; Gauthier, Isabel; Richler, Jennifer J.

    2016-01-01

    The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the ...

  18. Effects of Emotion on Writing Processes in Children

    Science.gov (United States)

    Fartoukh, Michael; Chanquoy, Lucile; Piolat, Annie

    2012-01-01

    The aim of this study was to analyze the consequences of emotion during narrative writing in accordance with Hayes's model. In this model, motivation and affect have an important role during the writing process. Moreover, according to the emotion-cognition literature, emotions are thought to create interferences in working memory, resulting in an…

  19. Reframing Teachers' Intercultural Learning as an Emotional Process

    Science.gov (United States)

    Jokikokko, Katri

    2016-01-01

    The importance of emotions in the process of intercultural learning has been recognised, but the topic has not been extensively theorised. This theoretical review article synthesises the research literature on emotions in the context of teachers' intercultural learning. The article argues that emotions are a vital part of any change, and thus play…

  20. Attentional Capture by Emotional Stimuli Is Modulated by Semantic Processing

    Science.gov (United States)

    Huang, Yang-Ming; Baddeley, Alan; Young, Andrew W.

    2008-01-01

    The attentional blink paradigm was used to examine whether emotional stimuli always capture attention. The processing requirement for emotional stimuli in a rapid sequential visual presentation stream was manipulated to investigate the circumstances under which emotional distractors capture attention, as reflected in an enhanced attentional blink…

  1. Trait anxiety modulates fronto-limbic processing of emotional interference in Borderline Personality Disorder

    Directory of Open Access Journals (Sweden)

    Jana Holtmann

    2013-03-01

    Full Text Available Previous studies of cognitive alterations in Borderline Personality Disorder (BPD) have yielded conflicting results. Given that a core feature of BPD is affective instability, which is characterized by emotional hyperreactivity and deficits in emotion regulation, it seems conceivable that short-lasting emotional distress might exert temporary detrimental effects on cognitive performance. Here we used functional magnetic resonance imaging (fMRI) to investigate how task-irrelevant emotional stimuli (fearful faces) affect performance and fronto-limbic neural activity patterns during attention-demanding cognitive processing in 16 female, unmedicated BPD patients relative to 24 age-matched healthy controls. In a modified flanker task, emotionally negative, socially salient pictures (fearful versus neutral faces) were presented as distracters in the background. Patients, but not controls, showed an atypical response pattern of the right amygdala, with increased activation during emotional interference in the (difficult) incongruent flanker condition, but emotion-related amygdala deactivation in the congruent condition. A direct comparison of the emotional conditions between the two groups revealed that the strongest diagnosis-related differences could be observed in the dorsal and, to a lesser extent, also in the rostral anterior cingulate cortex (dACC, rACC), where patients exhibited an increased neural response to emotional relative to neutral distracters. Moreover, in the incongruent condition, both the dACC and rACC fMRI responses during emotional interference were negatively correlated with trait anxiety in the patients, but not in the healthy controls. As higher trait anxiety was also associated with longer reaction times in the BPD patients, we suggest that in BPD patients the ACC might mediate compensatory cognitive processes during emotional interference and that such neurocognitive compensation can be adversely affected by high levels of

  2. Serotonergic neurotransmission in emotional processing: New evidence from long-term recreational poly-drug ecstasy use.

    Science.gov (United States)

    Laursen, Helle Ruff; Henningsson, Susanne; Macoveanu, Julian; Jernigan, Terry L; Siebner, Hartwig R; Holst, Klaus K; Skimminge, Arnold; Knudsen, Gitte M; Ramsoy, Thomas Z; Erritzoe, David

    2016-12-01

    The brain's serotonergic system plays a crucial role in the processing of emotional stimuli, and several studies have shown that a reduced serotonergic neurotransmission is associated with an increase in amygdala activity during emotional face processing. Prolonged recreational use of ecstasy (3,4-methylene-dioxymethamphetamine [MDMA]) induces alterations in serotonergic neurotransmission that are comparable to those observed in a depleted state. In this functional magnetic resonance imaging (fMRI) study, we investigated the responsiveness of the amygdala to emotional face stimuli in recreational ecstasy users as a model of long-term serotonin depletion. Fourteen ecstasy users and 12 non-using controls underwent fMRI to measure the regional neural activity elicited in the amygdala by male or female faces expressing anger, disgust, fear, sadness, or no emotion. During fMRI, participants made a sex judgement on each face stimulus. Positron emission tomography with 11C-DASB was additionally performed to assess serotonin transporter (SERT) binding in the brain. In the ecstasy users, SERT binding correlated negatively with amygdala activity, and accumulated lifetime intake of ecstasy tablets was associated with an increase in amygdala activity during angry face processing. Conversely, time since the last ecstasy intake was associated with a trend toward a decrease in amygdala activity during angry and sad face processing. These results indicate that the effects of long-term serotonin depletion resulting from ecstasy use are dose-dependent, affecting the functional neural basis of emotional face processing. © The Author(s) 2016.

  3. Tolerance for distorted faces: challenges to a configural processing account of familiar face recognition.

    Science.gov (United States)

    Sandford, Adam; Burton, A Mike

    2014-09-01

    Face recognition is widely held to rely on 'configural processing', an analysis of spatial relations between facial features. We present three experiments in which viewers were shown distorted faces, and asked to resize these to their correct shape. Based on configural theories appealing to metric distances between features, we reason that this should be an easier task for familiar than unfamiliar faces (whose subtle arrangements of features are unknown). In fact, participants were inaccurate at this task, making between 8% and 13% errors across experiments. Importantly, we observed no advantage for familiar faces: in one experiment participants were more accurate with unfamiliars, and in two experiments there was no difference. These findings were not due to general task difficulty - participants were able to resize blocks of colour to target shapes (squares) more accurately. We also found an advantage of familiarity for resizing other stimuli (brand logos). If configural processing does underlie face recognition, these results place constraints on the definition of 'configural'. Alternatively, familiar face recognition might rely on more complex criteria - based on tolerance to within-person variation rather than highly specific measurement. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Explicit versus implicit neural processing of musical emotions

    OpenAIRE

    Bogert, Brigitte; Numminen-Kontti, Taru; Gold, Benjamin; Sams, Mikko; Numminen, Jussi; Burunat, Iballa; Lampinen, Jouko; Brattico, Elvira

    2016-01-01

    Music is often used to regulate emotions and mood. Typically, music conveys and induces emotions even when one does not attend to them. Studies on the neural substrates of musical emotions have, however, only examined brain activity when subjects have focused on the emotional content of the music. Here we address with functional magnetic resonance imaging (fMRI) the neural processing of happy, sad, and fearful music with a paradigm in which 56 subjects were instructed to either classify the e...

  5. Face processing is gated by visual spatial attention

    Directory of Open Access Journals (Sweden)

    Roy E Crist

    2008-03-01

    Full Text Available Human perception of faces is widely believed to rely on automatic processing by a domain-specific, modular component of the visual system. Scalp-recorded event-related potential (ERP) recordings indicate that faces receive special stimulus processing at around 170 ms poststimulus onset, in that faces evoke an enhanced occipital negative wave, known as the N170, relative to the activity elicited by other visual objects. As predicted by modular accounts of face processing, this early face-specific N170 enhancement has been reported to be largely immune to the influence of endogenous processes such as task strategy or attention. However, most studies examining the influence of attention on face processing have focused on non-spatial attention, such as object-based attention, which tends to have longer-latency effects. In contrast, numerous studies have demonstrated that visual spatial attention can modulate the processing of visual stimuli as early as 80 ms poststimulus – substantially earlier than the N170. These temporal characteristics raise the question of whether this initial face-specific processing is immune to the influence of spatial attention. This question was addressed in a dual-visual-stream ERP study in which the influence of spatial attention on the face-specific N170 could be directly examined. As expected, early visual sensory responses to all stimuli presented in an attended location were larger than responses evoked by those same stimuli when presented in an unattended location. More importantly, a significant face-specific N170 effect was elicited by faces that appeared in an attended location, but not in an unattended one. In summary, early face-specific processing is not automatic, but rather, like other objects, strongly depends on endogenous factors such as the allocation of spatial attention. Moreover, these findings underscore the extensive influence that top-down attention exercises over the processing of

  6. Positively Biased Processing of Mother's Emotions Predicts Children's Social and Emotional Functioning.

    Science.gov (United States)

    Donohue, Meghan Rose; Goodman, Sherryl H; Tully, Erin C

    Risk for internalizing problems and social skills deficits likely emerges in early childhood when emotion processing and social competencies are developing. Positively biased processing of social information is typical during early childhood and may be protective against poorer psychosocial outcomes. We tested the hypothesis that young children with relatively less positively biased attention to, interpretations of, and attributions for their mother's emotions would exhibit poorer prosocial skills and more internalizing problems. A sample of 4- to 6-year-old children (N = 82) observed their mothers express happiness, sadness and anger during a simulated emotional phone conversation. Children's attention to their mother when she expressed each emotion was rated from video. Immediately following the phone conversation, children were asked questions about the conversation to assess their interpretations of the intensity of mother's emotions and misattributions of personal responsibility for her emotions. Children's prosocial skills and internalizing problems were assessed using mother-report rating scales. Interpretations of mother's positive emotions as relatively less intense than her negative emotions, misattributions of personal responsibility for her negative emotions, and lack of misattributions of personal responsibility for her positive emotions were associated with poorer prosocial skills. Children who attended relatively less to mother's positive than her negative emotions had higher levels of internalizing problems. These findings suggest that children's attention to, interpretations of, and attributions for their mother's emotions may be important targets of early interventions for preventing prosocial skills deficits and internalizing problems.

  7. Lateralization for Processing Facial Emotions in Gay Men, Heterosexual Men, and Heterosexual Women.

    Science.gov (United States)

    Rahman, Qazi; Yusuf, Sifat

    2015-07-01

    This study tested whether male sexual orientation and gender nonconformity influenced functional cerebral lateralization for the processing of facial emotions. We also tested for the effects of sex of poser and emotion displayed on putative differences. Thirty heterosexual men, 30 heterosexual women, and 40 gay men completed measures of demographic variables, recalled childhood gender nonconformity (CGN), IQ, and the Chimeric Faces Test (CFT). The CFT depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression; performance is measured using a "laterality quotient" (LQ) score. We found that heterosexual men were significantly more right-lateralized when viewing female faces compared to heterosexual women and gay men, who did not differ significantly from each other. Heterosexual women and gay men were more left-lateralized for processing female faces. There were no significant group differences in lateralization for male faces. These results remained when controlling for age and IQ scores. There was no significant effect of CGN on LQ scores. These data suggest that gay men are feminized in some aspects of functional cerebral lateralization for facial emotion. The results were discussed in relation to the selectivity of functional lateralization and putative brain mechanisms underlying sexual attraction towards opposite-sex and same-sex targets.

  8. A brain network processing the age of faces.

    Directory of Open Access Journals (Sweden)

    György A Homola

    Full Text Available Age is one of the most salient aspects in faces and of fundamental cognitive and social relevance. Although face processing has been studied extensively, brain regions responsive to age have yet to be localized. Using evocative face morphs and fMRI, we segregate two areas extending beyond the previously established face-sensitive core network, centered on the inferior temporal sulci and angular gyri bilaterally, both of which process changes of facial age. By means of probabilistic tractography, we compare their patterns of functional activation and structural connectivity. The ventral portion of Wernicke's understudied perpendicular association fasciculus is shown to interconnect the two areas, and activation within these clusters is related to the probability of fiber connectivity between them. In addition, post-hoc age-rating competence is found to be associated with high response magnitudes in the left angular gyrus. Our results provide the first evidence that facial age has a distinct representation pattern in the posterior human brain. We propose that particular face-sensitive nodes interact with additional object-unselective quantification modules to obtain individual estimates of facial age. This brain network processing the age of faces differs from the cortical areas that have previously been linked to less developmental but instantly changeable face aspects. Our probabilistic method of associating activations with connectivity patterns reveals an exemplary link that can be used to further study, assess and quantify structure-function relationships.

  9. The impact of emotion on perception: bias or enhanced processing?

    Science.gov (United States)

    Zeelenberg, René; Wagenmakers, Eric-Jan; Rotteveel, Mark

    2006-04-01

    Recent studies have shown that emotionally significant stimuli are often better identified than neutral stimuli. It is not clear, however, whether these results are due to enhanced perceptual processing or to a bias favoring the identification of emotionally significant stimuli over neutral stimuli. The present study used a two-alternative forced-choice perceptual identification task to disentangle the effects of bias and enhanced processing. We found that emotionally significant targets were better identified than neutral targets. In contrast, the emotional significance of the foil alternative had no effect on performance. The present results support the hypothesis that perceptual encoding of emotionally significant stimuli is enhanced.
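    The logic of the two-alternative forced-choice design above can be made concrete: a response bias toward emotional alternatives would raise accuracy when the target is emotional but lower it when the foil is emotional, whereas enhanced perceptual encoding predicts a target effect with no foil effect. A tiny tabulation sketch (the accuracy values below are invented for illustration, not the study's data):

```python
# Hypothetical proportion-correct data, keyed by (target type, foil type).
acc = {
    ("emotional", "neutral"):   0.78,
    ("emotional", "emotional"): 0.77,
    ("neutral",   "neutral"):   0.70,
    ("neutral",   "emotional"): 0.71,
}

# Effect of target emotionality, averaged over foil type.
target_effect = (
    (acc[("emotional", "neutral")] + acc[("emotional", "emotional")]) / 2
    - (acc[("neutral", "neutral")] + acc[("neutral", "emotional")]) / 2
)
# Effect of foil emotionality, averaged over target type.
foil_effect = (
    (acc[("emotional", "emotional")] + acc[("neutral", "emotional")]) / 2
    - (acc[("emotional", "neutral")] + acc[("neutral", "neutral")]) / 2
)

# A positive target effect with a near-zero foil effect is the signature
# of enhanced encoding rather than an identification bias.
print(round(target_effect, 3), round(foil_effect, 3))
```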

  10. Altered organization of face processing networks in temporal lobe epilepsy

    Science.gov (United States)

    Riley, Jeffrey D.; Fling, Brett W.; Cramer, Steven C.; Lin, Jack J.

    2015-01-01

    SUMMARY Objective: Deficits in social cognition are common and significant in people with temporal lobe epilepsy (TLE), but the functional and structural underpinnings remain unclear. The present study investigated how the side of seizure focus impacts face processing networks in temporal lobe epilepsy. Methods: We used functional magnetic resonance imaging (fMRI) of a face processing paradigm to identify face-responsive regions in 24 individuals with unilateral temporal lobe epilepsy (Left = 15; Right = 9) and 19 healthy controls. fMRI signals of face-responsive regions ipsilateral and contralateral to the side of seizure onset were delineated in TLE and compared to those of healthy controls (right and left sides combined). Diffusion tensor images were acquired to investigate structural connectivity between face regions that differed in fMRI signals between the two groups. Results: In temporal lobe epilepsy, activation of the cortical face processing networks varied according to side of seizure onset. The laterality of amygdala activation was shifted to the side contralateral to the seizure focus, while controls showed no significant asymmetry. Furthermore, compared to controls, patients with TLE showed decreased activation of the occipital face-responsive region on the side ipsilateral, and increased activity of the anterior temporal lobe on the side contralateral, to the seizure focus. Probabilistic tractography revealed that the occipital face area and anterior temporal lobe are connected via the inferior longitudinal fasciculus, which in individuals with temporal lobe epilepsy showed reduced integrity. Significance: Taken together, these findings suggest that brain function and white matter integrity of networks subserving face processing are impaired on the side of seizure onset, accompanied by altered responses on the side contralateral to the seizure. PMID:25823855

  11. Individual differences in emotion processing: how similar are diffusion model parameters across tasks?

    Science.gov (United States)

    Mueller, Christina J; White, Corey N; Kuchinke, Lars

    2017-11-27

    The goal of this study was to replicate findings of diffusion model parameters capturing emotion effects in a lexical decision task and to investigate whether these findings extend to other tasks of implicit emotion processing. Additionally, we were interested in the stability of diffusion model parameters across emotional stimuli and tasks for individual subjects. Responses to words in a lexical decision task were compared with responses to faces in a gender categorization task for stimuli of the emotion categories happy, neutral, and fear. Main effects of emotion as well as stability of emerging response style patterns as evident in diffusion model parameters across these tasks were analyzed. Based on earlier findings, drift rates were assumed to be more similar in response to stimuli of the same emotion category than to stimuli of a different emotion category. Results showed that the emotion effects of the tasks differed, with a processing advantage for happy followed by neutral and fear-related words in the lexical decision task, and a processing advantage for neutral followed by happy and fearful faces in the gender categorization task. Both emotion effects were captured in the estimated drift rate parameters and, in the case of the lexical decision task, also in the non-decision time parameters. A principal component analysis showed that, contrary to our hypothesis, drift rates were more similar within a specific task context than within a specific emotion category. Individual response patterns of subjects across tasks were evident in significant correlations of diffusion model parameters, including response styles, non-decision times, and information accumulation.
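    In the diffusion model referenced above, a decision is modeled as noisy evidence accumulating toward one of two response boundaries: the drift rate governs how fast evidence favors the correct response, and non-decision time absorbs encoding and motor stages. A minimal forward simulation (all parameter values are illustrative, not fitted to the study's data):

```python
import random

def simulate_ddm(drift, boundary=1.0, noise=1.0, dt=0.001,
                 non_decision=0.3, rng=None):
    """Simulate one two-choice drift-diffusion trial.

    Evidence starts at 0 and drifts toward +boundary ("correct") or
    -boundary ("error"); returns (correct, reaction_time_in_seconds).
    Parameter values are a sketch, not an estimate from real data.
    """
    rng = rng or random.Random(0)
    x, t = 0.0, 0.0
    sd = noise * dt ** 0.5  # noise scales with the square root of dt
    while abs(x) < boundary:
        x += drift * dt + rng.gauss(0.0, sd)
        t += dt
    return x > 0, t + non_decision

rng = random.Random(42)
for drift in (0.5, 2.0):  # e.g. a "hard" vs an "easy" emotion category
    trials = [simulate_ddm(drift, rng=rng) for _ in range(300)]
    accuracy = sum(correct for correct, _ in trials) / len(trials)
    mean_rt = sum(rt for _, rt in trials) / len(trials)
    print(f"drift={drift}: accuracy={accuracy:.2f}, mean RT={mean_rt:.2f} s")
```

A higher drift rate yields both higher accuracy and faster responses, which is why drift is read as processing efficiency; in fitting (rather than simulating) the model, drift, boundary, and non-decision time are estimated jointly from the full response-time distributions.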

  12. Neural activity and emotional processing following military deployment: Effects of mild traumatic brain injury and posttraumatic stress disorder.

    Science.gov (United States)

    Zuj, Daniel V; Felmingham, Kim L; Palmer, Matthew A; Lawrence-Wood, Ellie; Van Hooff, Miranda; Lawrence, Andrew J; Bryant, Richard A; McFarlane, Alexander C

    2017-11-01

    Posttraumatic Stress Disorder (PTSD) and mild traumatic brain injury (mTBI) are common comorbidities during military deployment that affect emotional brain processing, yet few studies have examined the independent effects of mTBI and PTSD. The purpose of this study was to examine distinct differences in neural responses to emotional faces in mTBI and PTSD. Twenty-one soldiers reporting high PTSD symptoms were compared to 21 soldiers with low symptoms, and 16 soldiers who reported mTBI-consistent injury and symptoms were compared with 16 soldiers who did not sustain an mTBI. Participants viewed emotional face expressions while their neural activity was recorded (via event-related potentials) prior to and following deployment. The high-PTSD group displayed increased P1 and P2 amplitudes to threatening faces at post-deployment compared to the low-PTSD group. In contrast, the mTBI group displayed reduced face-specific processing (N170 amplitude) to all facial expressions compared to the no-mTBI group. Here, we identified distinctive neural patterns of emotional face processing, with attentional biases towards threatening faces in PTSD, and reduced emotional face processing in mTBI. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Callousness and affective face processing in adults: Behavioral and brain-potential indicators.

    Science.gov (United States)

    Brislin, Sarah J; Yancey, James R; Perkins, Emily R; Palumbo, Isabella M; Drislane, Laura E; Salekin, Randall T; Fanti, Kostas A; Kimonis, Eva R; Frick, Paul J; Blair, R James R; Patrick, Christopher J

    2018-03-01

    The investigation of callous-unemotional (CU) traits has been central to contemporary research on child behavior problems, and served as the impetus for inclusion of a specifier for conduct disorder in the latest edition of the official psychiatric diagnostic system. Here, we report results from 2 studies that evaluated the construct validity of callousness as assessed in adults, by testing for affiliated deficits in behavioral and neural processing of fearful faces, as have been shown in youthful samples. We hypothesized that scores on an established measure of callousness would predict reduced recognition accuracy and diminished electrocortical reactivity for fearful faces in adult participants. In Study 1, 66 undergraduate participants performed an emotion recognition task in which they viewed affective faces of different types and indicated the emotion expressed by each. In Study 2, electrocortical data were collected from 254 adult twins during viewing of fearful and neutral face stimuli, and scored for event-related response components. Analyses of Study 1 data revealed that higher callousness was associated with decreased recognition accuracy for fearful faces specifically. In Study 2, callousness was associated with reduced amplitude of both N170 and P200 responses to fearful faces. Current findings demonstrate for the first time that callousness in adults is associated with both behavioral and physiological deficits in the processing of fearful faces. These findings support the validity of the CU construct with adults and highlight the possibility of a multidomain measurement framework for continued study of this important clinical construct. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  14. Holistic face processing can inhibit recognition of forensic facial composites.

    Science.gov (United States)

    McIntyre, Alex H; Hancock, Peter J B; Frowd, Charlie D; Langton, Stephen R H

    2016-04-01

    Facial composite systems help eyewitnesses to show the appearance of criminals. However, likenesses created by unfamiliar witnesses will not be completely accurate, and people familiar with the target can find them difficult to identify. Faces are processed holistically; we explore whether this impairs identification of inaccurate composite images and whether recognition can be improved. In Experiment 1 (n = 64) an imaging technique was used to make composites of celebrity faces more accurate and identification was contrasted with the original composite images. Corrected composites were better recognized, confirming that errors in production of the likenesses impair identification. The influence of holistic face processing was explored by misaligning the top and bottom parts of the composites (cf. Young, Hellawell, & Hay, 1987). Misalignment impaired recognition of corrected composites but identification of the original, inaccurate composites significantly improved. This effect was replicated with facial composites of noncelebrities in Experiment 2 (n = 57). We conclude that, like real faces, facial composites are processed holistically: recognition is impaired because unlike real faces, composites contain inaccuracies and holistic face processing makes it difficult to perceive identifiable features. This effect was consistent across composites of celebrities and composites of people who are personally familiar. Our findings suggest that identification of forensic facial composites can be enhanced by presenting composites in a misaligned format. (c) 2016 APA, all rights reserved.

  15. Hemispheric Lateralization in Processing Emotional and Non-Emotional Kanji Words

    OpenAIRE

    NAGAE, Seiji

    2013-01-01

    The purpose of this study was to investigate the contribution of both hemispheres to the processing of positive, negative, and non-emotional Kanji words in normal individuals. Right-handed subjects were asked to read aloud the Kanji word presented in the visual half-field. Results showed that responses to positive and non-emotional words were more accurate in the RVF than in the LVF, but no difference was found for negative emotional words. Reaction time results indicated that processing of nega...

  16. Abnormal GABAergic function and face processing in schizophrenia: A pharmacologic-fMRI study.

    Science.gov (United States)

    Tso, Ivy F; Fang, Yu; Phan, K Luan; Welsh, Robert C; Taylor, Stephan F

    2015-10-01

    The involvement of the gamma-aminobutyric acid (GABA) system in schizophrenia is suggested by postmortem studies and the common use of GABA receptor-potentiating agents in treatment. In a recent study, we used a benzodiazepine challenge to demonstrate abnormal GABAergic function during processing of negative visual stimuli in schizophrenia. This study extended this investigation by mapping GABAergic mechanisms associated with face processing and social appraisal in schizophrenia using a benzodiazepine challenge. Fourteen stable, medicated schizophrenia/schizoaffective patients (SZ) and 13 healthy controls (HC) underwent functional MRI using the blood oxygenation level-dependent (BOLD) technique while they performed the Socio-emotional Preference Task (SePT) on emotional face stimuli ("Do you like this face?"). Participants received single-blinded intravenous saline and lorazepam (LRZ) in two separate sessions separated by 1–3 weeks. Both SZ and HC recruited medial prefrontal cortex/anterior cingulate during the SePT, relative to gender identification. A significant drug by group interaction was observed in the medial occipital cortex, such that SZ showed increased BOLD signal to LRZ challenge, while HC showed an expected decrease of signal; the interaction did not vary by task. The altered BOLD response to LRZ challenge in SZ was significantly correlated with increased negative affect across multiple measures. The altered response to LRZ challenge suggests that abnormal face processing and negative affect in SZ are associated with altered GABAergic function in the visual cortex, underscoring the role of impaired visual processing in socio-emotional deficits in schizophrenia. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. The human body odor compound androstadienone leads to anger-dependent effects in an emotional Stroop but not dot-probe task using human faces.

    Science.gov (United States)

    Hornung, Jonas; Kogler, Lydia; Wolpert, Stephan; Freiherr, Jessica; Derntl, Birgit

    2017-01-01

    The androgen derivative androstadienone is a substance found in human sweat and thus is a putative human chemosignal. Androstadienone has been studied with respect to effects on mood states, attractiveness ratings, physiological and neural activation. With the current experiment, we aimed to explore in which way androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether effects depend on specific emotions, the participants' sex and individual sensitivity to the smell of androstadienone. To do so, we investigated 56 healthy individuals (29 of them females taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone, once under placebo exposure, in pseudorandomized order). With an emotional dot-probe task we measured visuo-spatial cueing while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex-, task- and emotion-specific manner, as a reduction in interference processes in the emotional Stroop task was only apparent for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times for congruent compared to incongruent trials. At the same time, women were also slightly affected by smelling androstadienone, as they classified angry faces more often correctly under androstadienone. For the emotional dot-probe task no modulation by androstadienone was observed. Furthermore, in both attention paradigms individual sensitivity to androstadienone was neither correlated with reaction times nor error rates in men and women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while processes of visuo-spatial cueing remain unaffected.

  18. Using Regression to Measure Holistic Face Processing Reveals a Strong Link with Face Recognition Ability

    Science.gov (United States)

    DeGutis, Joseph; Wilmer, Jeremy; Mercado, Rogelio J.; Cohan, Sarah

    2013-01-01

    Although holistic processing is thought to underlie normal face recognition ability, widely discrepant reports have recently emerged about this link in an individual differences context. Progress in this domain may have been impeded by the widespread use of subtraction scores, which lack validity due to their contamination with control condition…

  19. The implicit processing of categorical and dimensional strategies: an fMRI study of facial emotion perception

    Science.gov (United States)

    Matsuda, Yoshi-Taka; Fujimura, Tomomi; Katahira, Kentaro; Okada, Masato; Ueno, Kenichi; Cheng, Kang; Okanoya, Kazuo

    2013-01-01

    Our understanding of facial emotion perception has been dominated by two seemingly opposing theories: the categorical and dimensional theories. However, we have recently demonstrated that hybrid processing involving both categorical and dimensional perception can be induced in an implicit manner (Fujimura et al., 2012). The underlying neural mechanisms of this hybrid processing remain unknown. In this study, we tested the hypothesis that separate neural loci might intrinsically encode categorical and dimensional processing functions that serve as a basis for hybrid processing. We used functional magnetic resonance imaging to measure neural correlates while subjects passively viewed emotional faces and performed tasks that were unrelated to facial emotion processing. Activity in the right fusiform face area (FFA) increased in response to psychologically obvious emotions and decreased in response to ambiguous expressions, demonstrating the role of the FFA in categorical processing. The amygdala, insula and medial prefrontal cortex exhibited evidence of dimensional (linear) processing that correlated with physical changes in the emotional face stimuli. The occipital face area and superior temporal sulcus did not respond to these changes in the presented stimuli. Our results indicated that distinct neural loci process the physical and psychological aspects of facial emotion perception in a region-specific and implicit manner. PMID:24133426

  20. The implicit processing of categorical and dimensional strategies: an fMRI study of facial emotion perception

    Directory of Open Access Journals (Sweden)

    Yoshi-Taka eMatsuda

    2013-09-01

    Full Text Available Our understanding of facial emotion perception has been dominated by two seemingly opposing theories: the categorical and dimensional theories. However, we have recently demonstrated that hybrid processing involving both categorical and dimensional perception can be induced in an implicit manner (Fujimura et al., 2012). The underlying neural mechanisms of this hybrid processing remain unknown. In this study, we tested the hypothesis that separate neural loci might intrinsically encode categorical and dimensional processing functions that serve as a basis for hybrid processing. We used functional magnetic resonance imaging (fMRI) to measure neural correlates while subjects passively viewed emotional faces and performed tasks that were unrelated to facial emotion processing. Activity in the right fusiform face area (FFA) increased in response to psychologically obvious emotions and decreased in response to ambiguous expressions, demonstrating the role of the FFA in categorical processing. The amygdala, insula and medial prefrontal cortex exhibited evidence of dimensional (linear) processing that correlated with physical changes in the emotional face stimuli. The occipital face area and superior temporal sulcus did not respond to these changes in the presented stimuli. Our results indicated that distinct neural loci process the physical and psychological aspects of facial emotion perception in a region-specific and implicit manner.

  1. Following the time course of face gender and expression processing: a task-dependent ERP study.

    Science.gov (United States)

    Valdés-Conroy, Berenice; Aguado, Luis; Fernández-Cahill, María; Romero-Ferreiro, Verónica; Diéguez-Risco, Teresa

    2014-05-01

    The effects of task demands and the interaction between gender and expression in face perception were studied using event-related potentials (ERPs). Participants performed three different tasks with male and female faces that were emotionally inexpressive or that showed happy or angry expressions. In two of the tasks (gender and expression categorization) facial properties were task-relevant while in a third task (symbol discrimination) facial information was irrelevant. Effects of expression were observed on the visual P100 component under all task conditions, suggesting the operation of an automatic process that is not influenced by task demands. The earliest interaction between expression and gender was observed later in the face-sensitive N170 component. This component showed differential modulations by specific combinations of gender and expression (e.g., angry male vs. angry female faces). Main effects of expression and task were observed in a later occipito-temporal component peaking around 230 ms post-stimulus onset (EPN or early posterior negativity). Less positive amplitudes in the presence of angry faces and during performance of the gender and expression tasks were observed. Finally, task demands also modulated a positive component peaking around 400 ms (LPC, or late positive complex) that showed enhanced amplitude for the gender task. The pattern of results obtained here adds new evidence about the sequence of operations involved in face processing and the interaction of facial properties (gender and expression) in response to different task demands. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Service with a smile: do emotional intelligence, gender, and autonomy moderate the emotional labor process?

    Science.gov (United States)

    Johnson, Hazel-Anne M; Spector, Paul E

    2007-10-01

    This survey study of 176 participants from eight customer service organizations investigated how individual factors moderate the impact of emotional labor strategies on employee well-being. Hierarchical regression analyses indicated that gender and autonomy were significant moderators of the relationships between emotional labor strategies and the personal outcomes of emotional exhaustion, affective well-being, and job satisfaction. Females were more likely to experience negative consequences when engaging in surface acting. Autonomy served to alleviate negative outcomes for individuals who used emotional labor strategies often. Contrary to our hypotheses, emotional intelligence did not moderate the relationship between the emotional labor strategies and personal outcomes. Results demonstrated how the emotional labor process can influence employee well-being. (c) 2007 APA, all rights reserved.

  3. Neural correlates of processing "self-conscious" vs. "basic" emotions.

    Science.gov (United States)

    Gilead, Michael; Katzir, Maayan; Eyal, Tal; Liberman, Nira

    2016-01-29

    Self-conscious emotions are prevalent in our daily lives and play an important role in both normal and pathological behavior. Despite their immense significance, the neural substrates that are involved in the processing of such emotions are surprisingly under-studied. In light of this, we conducted an fMRI study in which participants thought of various personal events which elicited feelings of negative and positive self-conscious (i.e., guilt, pride) or basic (i.e., anger, joy) emotions. We performed a conjunction analysis to investigate the neural correlates associated with processing events that are related to self-conscious vs. basic emotions, irrespective of valence. The results show that processing self-conscious emotions resulted in activation within frontal areas associated with self-processing and self-control, namely, the mPFC extending to the dACC, and within the lateral-dorsal prefrontal cortex. Processing basic emotions resulted in activation throughout relatively phylogenetically-ancient regions of the cortex, namely in visual and tactile processing areas and in the insular cortex. Furthermore, self-conscious emotions differentially activated the mPFC such that the negative self-conscious emotion (guilt) was associated with a more dorsal activation, and the positive self-conscious emotion (pride) was associated with a more ventral activation. We discuss how these results shed light on the nature of mental representations and neural systems involved in self-reflective and affective processing. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Emotion talk in the context of young people self‐harming: facing the feelings in family therapy

    Science.gov (United States)

    Schmidt, Petra

    2016-01-01

    This article describes the use of emotion talk in the context of using a manualised approach to family therapy where the presenting problem is self‐harm. Whilst we understand that there is an internal aspect to emotion, we also consider emotions to be socially purposeful, culturally constructed and interactional. We found that within the presenting families, negative emotions were often talked about as located within the young person. Through using ‘emotion talk’ (Fredman, 2004) in deconstructing and tracking emotions and exploring how emotions connected to family‐of‐origin and cultural contexts, we developed an interactional understanding of these emotions. This led to better emotional regulation within the family and offered alternative ways of relating. The article discusses the use of relational reflexivity, and using the therapist and team's emotions to enable the therapeutic process, encouraging reflexivity on the self of the therapist in relation to work with emotions. Practitioner points: Emotions can be seen as both a reflection of feelings experienced by the individual and as a communication. An interactional understanding of emotions can be used therapeutically. Therapists should explore emotional displays and track the interactional patterns within the therapeutic system. Therapists should be self‐reflexive about ways of doing emotions and use this awareness in practice. PMID:27667879

  5. Emotion talk in the context of young people self-harming: facing the feelings in family therapy.

    Science.gov (United States)

    Rogers, Alice; Schmidt, Petra

    2016-04-01

    This article describes the use of emotion talk in the context of using a manualised approach to family therapy where the presenting problem is self-harm. Whilst we understand that there is an internal aspect to emotion, we also consider emotions to be socially purposeful, culturally constructed and interactional. We found that within the presenting families, negative emotions were often talked about as located within the young person. Through using 'emotion talk' (Fredman, 2004) in deconstructing and tracking emotions and exploring how emotions connected to family-of-origin and cultural contexts, we developed an interactional understanding of these emotions. This led to better emotional regulation within the family and offered alternative ways of relating. The article discusses the use of relational reflexivity, and using the therapist and team's emotions to enable the therapeutic process, encouraging reflexivity on the self of the therapist in relation to work with emotions. Emotions can be seen as both a reflection of feelings experienced by the individual and as a communication. An interactional understanding of emotions can be used therapeutically. Therapists should explore emotional displays and track the interactional patterns within the therapeutic system. Therapists should be self-reflexive about ways of doing emotions and use this awareness in practice.

  6. The role of emotion in dynamic audiovisual integration of faces and voices.

    Science.gov (United States)

    Kokinous, Jenny; Kotz, Sonja A; Tavano, Alessandro; Schröger, Erich

    2015-05-01

    We used human electroencephalogram to study early audiovisual integration of dynamic angry and neutral expressions. An auditory-only condition served as a baseline for the interpretation of integration effects. In the audiovisual conditions, the validity of visual information was manipulated using facial expressions that were either emotionally congruent or incongruent with the vocal expressions. First, we report an N1 suppression effect for angry compared with neutral vocalizations in the auditory-only condition. Second, we confirm early integration of congruent visual and auditory information as indexed by a suppression of the auditory N1 and P2 components in the audiovisual compared with the auditory-only condition. Third, audiovisual N1 suppression was modulated by audiovisual congruency in interaction with emotion: for neutral vocalizations, there was N1 suppression in both the congruent and the incongruent audiovisual conditions. For angry vocalizations, there was N1 suppression only in the congruent but not in the incongruent condition. Extending previous findings of dynamic audiovisual integration, the current results suggest that audiovisual N1 suppression is congruency- and emotion-specific and indicate that dynamic emotional expressions compared with non-emotional expressions are preferentially processed in early audiovisual integration. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  7. Neural markers of emotional face perception across psychotic disorders and general population.

    Science.gov (United States)

    Sabharwal, Amri; Kotov, Roman; Szekely, Akos; Leung, Hoi-Chung; Barch, Deanna M; Mohanty, Aprajita

    2017-07-01

    There is considerable variation in negative and positive symptoms of psychosis, global functioning, and emotional face perception (EFP), not only in schizophrenia but also in other psychotic disorders and healthy individuals. However, EFP impairment and its association with worse symptoms and global functioning have been examined largely in the domain of schizophrenia. The present study adopted a dimensional approach to examine the association of behavioral and neural measures of EFP with symptoms of psychosis and global functioning across individuals with schizophrenia spectrum (SZ; N = 28) and other psychotic (OP; N = 29) disorders, and never-psychotic participants (NP; N = 21). Behavioral and functional MRI data were recorded as participants matched emotional expressions of faces and geometrical shapes. Lower accuracy and increased activity in early visual regions, hippocampus, and amygdala during emotion versus shape matching were associated with higher negative, but not positive, symptoms and lower global functioning, across all participants. This association remained even after controlling for group-related (SZ, OP, and NP) variance, dysphoria, and antipsychotic medication status, except in amygdala. Furthermore, negative symptoms mediated the relationship between behavioral and brain EFP measures and global functioning. This study provides some of the first evidence supporting the specific relationship of EFP measures with negative symptoms and global functioning across psychotic and never-psychotic samples, and transdiagnostically across different psychotic disorders. Present findings help bridge the gap between basic EFP-related neuroscience research and clinical research in psychosis, and highlight EFP as a potential symptom-specific marker that tracks global functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Emotional Processing in Borderline Personality Disorder

    Science.gov (United States)

    Suvak, Michael K.; Sege, Christopher T.; Sloan, Denise M.; Shea, M. Tracie; Yen, Shirley; Litz, Brett T.

    2014-01-01

    This study examined whether individuals with borderline personality disorder (BPD) would exhibit augmented emotional responses to picture stimuli after being challenged with an ideographic interpersonal conflict script. Participants were 24 adults diagnosed with BPD, 23 adults diagnosed with obsessive compulsive personality disorder (OCPD), and 28 normal controls. Participants viewed emotionally evocative pictures before and after listening to the interpersonal script while a variety of physiological measures were recorded. Findings indicated that the interpersonal script was effective in eliciting enduring emotional responses from the BPD group relative to the control groups. However, despite the effectiveness of the interpersonal challenge task, there were no group differences in emotional responding to the affect eliciting stimuli. The findings underscore the complexities involved in examining emotional dysregulation in BPD in a laboratory setting. PMID:22449065

  9. The Process Model of Group-Based Emotion: Integrating Intergroup Emotion and Emotion Regulation Perspectives

    NARCIS (Netherlands)

    Goldenberg, Amit; Halperin, Eran; van Zomeren, Martijn; Gross, James J.

    Scholars interested in emotion regulation have documented the different goals and strategies individuals have for regulating their emotions. However, little attention has been paid to the regulation of group-based emotions, which are based on individuals' self-categorization as a group member and

  10. Emotional expectations influence neural sensitivity to fearful faces in humans: An event-related potential study

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The present study tested whether neural sensitivity to salient emotional facial expressions was influenced by emotional expectations induced by a cue that validly predicted the expression of a subsequently presented target face. Event-related potentials (ERPs) elicited by fearful and neutral faces were recorded while participants performed a gender discrimination task under cued (‘expected’) and uncued (‘unexpected’) conditions. The behavioral results revealed that accuracy was lower for fearful compared with neutral faces in the unexpected condition, while accuracy was similar for fearful and neutral faces in the expected condition. ERP data revealed increased amplitudes in the P2 component and 200–250 ms interval for unexpected fearful versus neutral faces. By contrast, ERP responses were similar for fearful and neutral faces in the expected condition. These findings indicate that human neural sensitivity to fearful faces is modulated by emotional expectations. Although the neural system is sensitive to unpredictable emotionally salient stimuli, sensitivity to salient stimuli is reduced when these stimuli are predictable.

  11. What's good for the goose is not good for the gander: Age and gender differences in scanning emotion faces.

    Science.gov (United States)

    Sullivan, Susan; Campbell, Anna; Hutton, Sam B; Ruffman, Ted

    2017-05-01

    Research indicates that older adults' (≥60 years) emotion recognition is worse than that of young adults, that young and older men's emotion recognition is worse than that of young and older women (respectively), and that older adults look at mouths, relative to eyes, more than young adults do. Nevertheless, previous research has not compared older men's and women's looking at emotion faces so the present study had two aims: (a) to examine whether the tendency to look at mouths is stronger amongst older men compared with older women and (b) to examine whether men's mouth looking correlates with better emotion recognition. We examined the emotion recognition abilities and spontaneous gaze patterns of young (n = 60) and older (n = 58) males and females as they labelled emotion faces. Older men spontaneously looked more to mouths than older women, and older men's looking at mouths correlated with their emotion recognition, whereas women's looking at eyes correlated with their emotion recognition. The findings are discussed in relation to a growing body of research suggesting both age and gender differences in response to emotional stimuli and the differential efficacy of mouth and eyes looking for men and women. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Validation of the Vanderbilt Holistic Face Processing Test.

    Science.gov (United States)

    Wang, Chao-Chih; Ross, David A; Gauthier, Isabel; Richler, Jennifer J

    2016-01-01

    The Vanderbilt Holistic Face Processing Test (VHPT-F) is a new measure of holistic face processing with better psychometric properties relative to prior measures developed for group studies (Richler et al., 2014). In fields where psychologists study individual differences, validation studies are commonplace and the concurrent validity of a new measure is established by comparing it to an older measure with established validity. We follow this approach and test whether the VHPT-F measures the same construct as the composite task, which is a group-based measure at the center of the large literature on holistic face processing. In Experiment 1, we found a significant correlation between holistic processing measured in the VHPT-F and the composite task. Although this correlation was small, it was comparable to the correlation between holistic processing measured in the composite task with the same faces, but different target parts (top or bottom), which represents a reasonable upper limit for correlations between the composite task and another measure of holistic processing. These results confirm the validity of the VHPT-F by demonstrating shared variance with another measure of holistic processing based on the same operational definition. These results were replicated in Experiment 2, but only when the demographic profile of our sample matched that of Experiment 1.

  14. An Event-Related Potential Study on the Effects of Cannabis on Emotion Processing

    Science.gov (United States)

    Troup, Lucy J.; Bastidas, Stephanie; Nguyen, Maia T.; Andrzejewski, Jeremy A.; Bowers, Matthew; Nomi, Jason S.

    2016-01-01

    The effect of cannabis on emotional processing was investigated using event-related potential (ERP) paradigms. ERPs associated with emotional processing of cannabis users, and non-using controls, were recorded and compared during an implicit and explicit emotional expression recognition and empathy task. Comparisons in P3 component mean amplitudes were made between cannabis users and controls. Results showed a significant decrease in the P3 amplitude in cannabis users compared to controls. Specifically, cannabis users showed reduced P3 amplitudes for implicit compared to explicit processing over centro-parietal sites, a pattern which reversed, and was enhanced, at fronto-central sites. Cannabis users also showed a decreased P3 to happy faces, with an increase to angry faces, compared to controls. These effects appeared strongest among participants who self-reported the highest levels of cannabis consumption. Those cannabis users with the greatest consumption rates showed the largest P3 deficits for explicit processing and negative emotions. These data suggest that there is a complex relationship between cannabis consumption and emotion processing that appears to be modulated by attention. PMID:26926868
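
    P3 effects of the kind reported above are typically quantified as the mean voltage within a fixed post-stimulus window of epoched EEG. Below is a minimal sketch on simulated epochs, assuming (for illustration only, not taken from the study) a 1000 Hz sampling rate, a 200 ms pre-stimulus baseline, and a 300-500 ms measurement window:

```python
import numpy as np

def mean_amplitude(epochs, sfreq, tmin, tmax, baseline=0.2):
    """Mean voltage per epoch in [tmin, tmax] seconds post-stimulus.

    epochs: (n_epochs, n_samples) array whose first `baseline`
    seconds precede stimulus onset.
    """
    start = int((baseline + tmin) * sfreq)
    stop = int((baseline + tmax) * sfreq)
    return epochs[:, start:stop].mean(axis=1)

rng = np.random.default_rng(0)
sfreq, n_samp = 1000, 1200                  # 0.2 s baseline + 1.0 s epoch
t = np.arange(n_samp) / sfreq - 0.2
# Simulated P3: a positive deflection near 400 ms, larger for condition A.
bump = np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
cond_a = 6 * bump + rng.normal(0, 1, (40, n_samp))   # e.g. explicit task
cond_b = 3 * bump + rng.normal(0, 1, (40, n_samp))   # e.g. implicit task

p3_a = mean_amplitude(cond_a, sfreq, 0.30, 0.50)
p3_b = mean_amplitude(cond_b, sfreq, 0.30, 0.50)
print(p3_a.mean() > p3_b.mean())   # condition A carries the larger P3
```

The per-epoch values returned here are what would then enter a group-level comparison between users and controls.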

  15. Impaired Integration of Emotional Faces and Affective Body Context in a Rare Case of Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran R.; Bentin, Shlomo

    2011-01-01

    In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG's agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally-congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally-incongruent bodies. The results are interpreted in the context of a general integration deficit in developmental visual agnosia, suggesting that impaired integration may extend from the level of the face to the level of the full person. PMID:21482423

  16. Approach and withdrawal tendencies during written word processing: effects of task, emotional valence and emotional arousal

    OpenAIRE

    Citron, Francesca Maria Marina; Abugaber, David; Herbert, Cornelia

    2016-01-01

    The affective dimensions of emotional valence and emotional arousal affect processing of verbal and pictorial stimuli. Traditional emotional theories assume a linear relationship between these dimensions, with valence determining the direction of a behaviour (approach vs. withdrawal) and arousal its intensity or strength. In contrast, according to the valence-arousal conflict theory, both dimensions are interactively related: positive valence and low arousal (PL) are associated with an implic...

  17. Linking children's neuropsychological processing of emotion with their knowledge of emotion expression regulation.

    OpenAIRE

    Watling, Dawn; Bourne, Victoria

    2007-01-01

    Understanding of emotions has been shown to develop between the ages of 4 and 10 years; however, individual differences exist in this development. While previous research has typically examined these differences in terms of developmental and/or social factors, little research has considered the possible impact of neuropsychological development on the behavioural understanding of emotions. Emotion processing tends to be lateralised to the right hemisphere of the brain in adults, yet this patt...

  18. Emotion and Cognition: An Intricately Bound Developmental Process

    Science.gov (United States)

    Bell, Martha Ann; Wolfe, Christy D.

    2004-01-01

    Regulatory aspects of development can best be understood by research that conceptualizes relations between cognition and emotion. The neural mechanisms associated with regulatory processes may be the same as those associated with higher order cognitive processes. Thus, from a developmental cognitive neuroscience perspective, emotion and cognition…

  19. Impact of gender and genetics on emotion processing in Parkinson's disease - A multimodal study

    Directory of Open Access Journals (Sweden)

    Julia Heller

    Background: Parkinson's disease (PD) has been suggested to affect males and females differently. Neuropsychiatric symptoms are common and disabling in PD. However, previous studies focusing on emotion recognition in PD have neglected the confounder of gender and lack evidence on the underlying endocrinal and genetic mechanisms. Moreover, while there are many imaging studies on emotion processing in PD, gender-related analyses of neural data are scarce. We therefore aimed at exploring the interplay of the named factors on emotion recognition and processing in PD. Methods: 51 non-demented PD patients (26 male) and 44 age- and gender-matched healthy controls (HC; 25 male) were examined clinically and neuropsychologically, including an emotion recognition task (Ekman 60 Faces test). A subsample of 25 patients and 31 HC underwent task-based functional magnetic resonance imaging (fMRI) comprising videos of emotional facial expressions. To examine the impact of hormones and genetics on emotion processing, blood samples were taken for endocrinal (testosterone, estradiol, progesterone) and genetic testing (5-HTTLPR and Val158Met COMT polymorphisms). Results: No group or gender differences emerged regarding cognitive abilities. Male but not female PD patients exhibited confined impairments in recognizing the emotion anger, accompanied by diminished neural response to facial expressions (e.g. in the putamen and insula). Endocrinologically, fear recognition was positively correlated with estrogen levels in female patients, while on the genetic level we found an effect of Val158Met COMT genotype on the recognition of fear in PD patients. Conclusions: Our study provides evidence that impaired emotion processing in PD specifically affects male patients, and that hormones and genetics contribute to emotion recognition performance. Further research on the underlying neural, endocrinological and genetic mechanisms of specific symptoms in PD is of clinical relevance, as it

  20. Processing emotion from abstract art in frontotemporal lobar degeneration.

    Science.gov (United States)

    Cohen, Miriam H; Carton, Amelia M; Hardy, Christopher J; Golden, Hannah L; Clark, Camilla N; Fletcher, Phillip D; Jaisin, Kankamol; Marshall, Charles R; Henley, Susie M D; Rohrer, Jonathan D; Crutch, Sebastian J; Warren, Jason D

    2016-01-29

    Abstract art may signal emotions independently of a biological or social carrier: it might therefore constitute a test case for defining brain mechanisms of generic emotion decoding and the impact of disease states on those mechanisms. This is potentially of particular relevance to diseases in the frontotemporal lobar degeneration (FTLD) spectrum. These diseases are often led by emotional impairment despite retained or enhanced artistic interest in at least some patients. However, the processing of emotion from art has not been studied systematically in FTLD. Here we addressed this issue using a novel emotional valence matching task on abstract paintings in patients representing major syndromes of FTLD (behavioural variant frontotemporal dementia, n=11; semantic variant primary progressive aphasia (svPPA), n=7; nonfluent variant primary progressive aphasia (nfvPPA), n=6) relative to healthy older individuals (n=39). Performance on art emotion valence matching was compared between groups taking account of perceptual matching performance and assessed in relation to facial emotion matching using customised control tasks. Neuroanatomical correlates of art emotion processing were assessed using voxel-based morphometry of patients' brain MR images. All patient groups had a deficit of art emotion processing relative to healthy controls; there were no significant interactions between syndromic group and emotion modality. Poorer art emotion valence matching performance was associated with reduced grey matter volume in right lateral occipitotemporal cortex in proximity to regions previously implicated in the processing of dynamic visual signals. Our findings suggest that abstract art may be a useful model system for investigating mechanisms of generic emotion decoding and aesthetic processing in neurodegenerative diseases. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Attentional Processing and Recall of Emotional Words

    OpenAIRE

    Fraga Carou, Isabel; Redondo, Jaime; Piñeiro, Ana; Padrón, Isabel; Fernández-Rey, José; Alcaraz, Miguel

    2011-01-01

    Three experiments were carried out in order to evaluate the attention paid to words of different emotional value. A dual-task experimental paradigm was employed, registering response times to acoustic tones which were presented during the reading of words. The recall was also evaluated by means of an intentional immediate recall test. The results reveal that neither the emotional valence nor the arousal of words on their own affected the attention paid by participants. Only in the third exper...

  2. The Interplay Among Children's Negative Family Representations, Visual Processing of Negative Emotions, and Externalizing Symptoms.

    Science.gov (United States)

    Davies, Patrick T; Coe, Jesse L; Hentges, Rochelle F; Sturge-Apple, Melissa L; van der Kloet, Erika

    2018-03-01

    This study examined the transactional interplay among children's negative family representations, visual processing of negative emotions, and externalizing symptoms in a sample of 243 preschool children (M age = 4.60 years). Children participated in three annual measurement occasions. Cross-lagged autoregressive models were conducted with multimethod, multi-informant data to identify mediational pathways. Consistent with schema-based top-down models, negative family representations were associated with attention to negative faces in an eye-tracking task and their externalizing symptoms. Children's negative representations of family relationships specifically predicted decreases in their attention to negative emotions, which, in turn, was associated with subsequent increases in their externalizing symptoms. Follow-up analyses indicated that the mediational role of diminished attention to negative emotions was particularly pronounced for angry faces. © 2017 The Authors. Child Development © 2017 Society for Research in Child Development, Inc.

  3. Emotional face recognition deficits and medication effects in pre-manifest through stage-II Huntington's disease.

    Science.gov (United States)

    Labuschagne, Izelle; Jones, Rebecca; Callaghan, Jenny; Whitehead, Daisy; Dumas, Eve M; Say, Miranda J; Hart, Ellen P; Justo, Damian; Coleman, Allison; Dar Santos, Rachelle C; Frost, Chris; Craufurd, David; Tabrizi, Sarah J; Stout, Julie C

    2013-05-15

    Facial emotion recognition impairments have been reported in Huntington's disease (HD). However, the nature of the impairments across the spectrum of HD remains unclear. We report on emotion recognition data from 344 participants comprising premanifest HD (PreHD) and early HD patients, and controls. In a test of recognition of facial emotions, we examined responses to six basic emotional expressions and neutral expressions. In addition, and within the early HD sample, we tested for differences on emotion recognition performance between those 'on' vs. 'off' neuroleptic or selective serotonin reuptake inhibitor (SSRI) medications. The PreHD groups showed significantly impaired recognition, compared to controls, of fearful, angry and surprised faces; whereas the early HD groups were significantly impaired across all emotions including neutral expressions. In early HD, neuroleptic use was associated with worse facial emotion recognition, whereas SSRI use was associated with better facial emotion recognition. The findings suggest that emotion recognition impairments exist across the HD spectrum, but are relatively more widespread in manifest HD than in the premanifest period. Commonly prescribed medications to treat HD-related symptoms also appear to affect emotion recognition. These findings have important implications for interpersonal communication and medication usage in HD. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. Suprasegmental information affects processing of talking faces at birth.

    Science.gov (United States)

    Guellai, Bahia; Mersad, Karima; Streri, Arlette

    2015-02-01

    From birth, newborns show a preference for faces talking a native language compared to silent faces. The present study addresses two questions that remained unanswered by previous research: (a) Does familiarity with the language play a role in this process and (b) Are all the linguistic and paralinguistic cues necessary in this case? Experiment 1 extended newborns' preference for native speakers to non-native ones. Given that fetuses and newborns are sensitive to the prosodic characteristics of speech, Experiments 2 and 3 presented faces talking native and non-native languages with the speech stream being low-pass filtered. Results showed that newborns preferred looking at a person who talked to them even when only the prosodic cues were provided for both languages. Nonetheless, a familiarity preference for the previously talking face is observed in the "normal speech" condition (i.e., Experiment 1) and a novelty preference in the "filtered speech" condition (Experiments 2 and 3). This asymmetry reveals that newborns process these two types of stimuli differently and that they may already be sensitive to a mismatch between the articulatory movements of the face and the corresponding speech sounds. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Arguments Against a Configural Processing Account of Familiar Face Recognition.

    Science.gov (United States)

    Burton, A Mike; Schweinberger, Stefan R; Jenkins, Rob; Kaufmann, Jürgen M

    2015-07-01

    Face recognition is a remarkable human ability, which underlies a great deal of people's social behavior. Individuals can recognize family members, friends, and acquaintances over a very large range of conditions, and yet the processes by which they do this remain poorly understood, despite decades of research. Although a detailed understanding remains elusive, face recognition is widely thought to rely on configural processing, specifically an analysis of spatial relations between facial features (so-called second-order configurations). In this article, we challenge this traditional view, raising four problems: (1) configural theories are underspecified; (2) large configural changes leave recognition unharmed; (3) recognition is harmed by nonconfigural changes; and (4) in separate analyses of face shape and face texture, identification tends to be dominated by texture. We review evidence from a variety of sources and suggest that failure to acknowledge the impact of familiarity on facial representations may have led to an overgeneralization of the configural account. We argue instead that second-order configural information is remarkably unimportant for familiar face recognition. © The Author(s) 2015.

  7. Big Questions Facing Vocational Psychology: A Cognitive Information Processing Perspective

    Science.gov (United States)

    Reardon, Robert C.; Lenz, Janet G.; Sampson, James P., Jr.; Peterson, Gary W.

    2011-01-01

    This article draws upon the authors' experience in developing cognitive information processing theory in order to examine three important questions facing vocational psychology and assessment: (a) Where should new knowledge for vocational psychology come from? (b) How do career theories and research find their way into practice? and (c) What is…

  8. Emotion perception and executive control interact in the salience network during emotionally charged working memory processing

    NARCIS (Netherlands)

    Luo, Y.; Qin, S.; Fernandez, G.S.E.; Zhang, Y.; Klumpers, F.; Li, H.

    2014-01-01

    Processing of emotional stimuli can either hinder or facilitate ongoing working memory (WM); however, the neural basis of these effects remains largely unknown. Here we examined the neural mechanisms of these paradoxical effects by implementing a novel emotional WM task in an fMRI study. Twenty-five

  9. Crossmodal processing of emotions in alcohol-dependence and Korsakoff syndrome.

    Science.gov (United States)

    Brion, Mélanie; D'Hondt, Fabien; Lannoy, Séverine; Pitel, Anne-Lise; Davidoff, Donald A; Maurage, Pierre

    2017-09-01

    Decoding emotional information from faces and voices is crucial for efficient interpersonal communication. Emotional decoding deficits have been found in alcohol-dependence (ALC), particularly in crossmodal situations (with simultaneous stimulations from different modalities), but are still underexplored in Korsakoff syndrome (KS). The aim of this study is to determine whether the continuity hypothesis, postulating a gradual worsening of cognitive and brain impairments from ALC to KS, is valid for emotional crossmodal processing. Sixteen KS, 17 ALC and 19 matched healthy controls (CP) had to detect the emotion (anger or happiness) displayed by auditory, visual or crossmodal auditory-visual stimuli. Crossmodal stimuli were either emotionally congruent (leading to a facilitation effect, i.e. enhanced performance for crossmodal condition compared to unimodal ones) or incongruent (leading to an interference effect, i.e. decreased performance for crossmodal condition due to discordant information across modalities). Reaction times and accuracy were recorded. Crossmodal integration for congruent information was dampened only in ALC, while both ALC and KS demonstrated, compared to CP, decreased performance for decoding emotional facial expressions in the incongruent condition. The crossmodal integration appears impaired in ALC but preserved in KS. Both alcohol-related disorders present an increased interference effect. These results show the interest of more ecological designs, using crossmodal stimuli, to explore emotional decoding in alcohol-related disorders. They also suggest that the continuum hypothesis cannot be generalised to emotional decoding abilities.
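
    The facilitation and interference effects described above are conventionally computed by contrasting crossmodal performance against the better of the two unimodal conditions. Below is a minimal sketch with invented reaction times; the values are illustrative, not the study's:

```python
# Hedged sketch: quantify crossmodal facilitation/interference from
# per-condition mean reaction times (seconds). All values are invented.
rt = {
    "visual": 0.62,
    "auditory": 0.65,
    "crossmodal_congruent": 0.57,
    "crossmodal_incongruent": 0.71,
}

best_unimodal = min(rt["visual"], rt["auditory"])
# Facilitation: congruent crossmodal trials beat the best single modality.
facilitation = best_unimodal - rt["crossmodal_congruent"]
# Interference: incongruent crossmodal trials fall behind it.
interference = rt["crossmodal_incongruent"] - best_unimodal

print(f"facilitation: {facilitation * 1000:.0f} ms")
print(f"interference: {interference * 1000:.0f} ms")
```

A dampened facilitation term with a preserved or enlarged interference term is the pattern the abstract reports for the clinical groups.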

  10. Holistic processing for other-race faces in Chinese participants occurs for upright but not inverted faces.

    Science.gov (United States)

    Crookes, Kate; Favelle, Simone; Hayward, William G

    2013-01-01

    Recent evidence suggests stronger holistic processing for own-race faces may underlie the own-race advantage in face memory. In previous studies Caucasian participants have demonstrated larger holistic processing effects for Caucasian over Asian faces. However, Asian participants have consistently shown similar sized effects for both Asian and Caucasian faces. We investigated two proposed explanations for the holistic processing of other-race faces by Asian participants: (1) greater other-race exposure, (2) a general global processing bias. Holistic processing was tested using the part-whole task. Participants were living in predominantly own-race environments and other-race contact was evaluated. Despite reporting significantly greater contact with own-race than other-race people, Chinese participants displayed strong holistic processing for both Asian and Caucasian upright faces. In addition, Chinese participants showed no evidence of holistic processing for inverted faces arguing against a general global processing bias explanation. Caucasian participants, in line with previous studies, displayed stronger holistic processing for Caucasian than Asian upright faces. For inverted faces there were no race-of-face differences. These results are used to suggest that Asians may make more general use of face-specific mechanisms than Caucasians.

  11. Positively Biased Processing of Mother’s Emotions Predicts Children’s Social and Emotional Functioning

    Science.gov (United States)

    Donohue, Meghan Rose; Goodman, Sherryl H.; Tully, Erin C.

    2016-01-01

    Risk for internalizing problems and social skills deficits likely emerges in early childhood when emotion processing and social competencies are developing. Positively biased processing of social information is typical during early childhood and may be protective against poorer psychosocial outcomes. We tested the hypothesis that young children with relatively less positively biased attention to, interpretations of, and attributions for their mother’s emotions would exhibit poorer prosocial skills and more internalizing problems. A sample of 4- to 6-year-old children (N=82) observed their mothers express happiness, sadness and anger during a simulated emotional phone conversation. Children’s attention to their mother when she expressed each emotion was rated from video. Immediately following the phone conversation, children were asked questions about the conversation to assess their interpretations of the intensity of mother’s emotions and misattributions of personal responsibility for her emotions. Children’s prosocial skills and internalizing problems were assessed using mother-report rating scales. Interpretations of mother’s positive emotions as relatively less intense than her negative emotions, misattributions of personal responsibility for her negative emotions, and lack of misattributions of personal responsibility for her positive emotions were associated with poorer prosocial skills. Children who attended relatively less to mother’s positive than her negative emotions had higher levels of internalizing problems. These findings suggest that children’s attention to, interpretations of, and attributions for their mother’s emotions may be important targets of early interventions for preventing prosocial skills deficits and internalizing problems. PMID:28348456

  12. Functional MRI of music emotion processing in frontotemporal dementia.

    Science.gov (United States)

    Agustus, Jennifer L; Mahoney, Colin J; Downey, Laura E; Omar, Rohani; Cohen, Miriam; White, Mark J; Scott, Sophie K; Mancini, Laura; Warren, Jason D

    2015-03-01

    Frontotemporal dementia is an important neurodegenerative disorder of younger life led by profound emotional and social dysfunction. Here we used fMRI to assess brain mechanisms of music emotion processing in a cohort of patients with frontotemporal dementia (n = 15) in relation to healthy age-matched individuals (n = 11). In a passive-listening paradigm, we manipulated levels of emotion processing in simple arpeggio chords (mode versus dissonance) and emotion modality (music versus human emotional vocalizations). A complex profile of disease-associated functional alterations was identified with separable signatures of musical mode, emotion level, and emotion modality within a common, distributed brain network, including posterior and anterior superior temporal and inferior frontal cortices and dorsal brainstem effector nuclei. Separable functional signatures were identified post-hoc in patients with and without abnormal craving for music (musicophilia): a model for specific abnormal emotional behaviors in frontotemporal dementia. Our findings indicate the potential of music to delineate neural mechanisms of altered emotion processing in dementias, with implications for future disease tracking and therapeutic strategies. © 2014 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals Inc. on behalf of The New York Academy of Sciences.

  13. Individual differences in emotion word processing: A diffusion model analysis.

    Science.gov (United States)

    Mueller, Christina J; Kuchinke, Lars

    2016-06-01

This exploratory study investigated individual differences in the implicit processing of emotional words in a lexical decision task. A processing advantage for positive words was observed, and differences in response times between happy and fear-related words were predicted by individual differences in specific emotion-processing variables: whereas more pronounced goal-directed behavior was related to a specific slowdown in the processing of fear-related words, the rate of spontaneous eye blinks (an index of brain dopamine levels) was associated with a processing advantage for happy words. Estimating diffusion model parameters revealed that the drift rate (rate of information accumulation) captures unique variance in processing differences between happy and fear-related words, with the highest drift rates observed for happy words. Overall emotion recognition ability predicted individual differences in drift rates between happy and fear-related words. The findings emphasize that a significant amount of variance in emotion processing is explained by individual differences in behavioral data.
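The diffusion-model analysis described above recovers a drift rate (rate of information accumulation) from lexical-decision data. As a minimal illustration of how such parameters can be obtained from summary statistics, the sketch below implements the closed-form EZ-diffusion method (Wagenmakers and colleagues); the function name and the example numbers are illustrative, and the abstract's authors may have used a different fitting procedure.

```python
import math

def ez_diffusion(p_correct, rt_var, rt_mean, s=0.1):
    """Recover diffusion-model parameters from summary statistics
    (EZ-diffusion): drift rate v, boundary separation a, non-decision time Ter.
    p_correct: proportion correct; rt_var: variance of correct RTs (s^2);
    rt_mean: mean correct RT (s); s: scaling parameter (conventionally 0.1)."""
    if p_correct <= 0.0 or p_correct >= 1.0 or p_correct == 0.5:
        raise ValueError("requires 0 < Pc < 1 and Pc != 0.5 (apply an edge correction)")
    L = math.log(p_correct / (1 - p_correct))            # logit of accuracy
    x = L * (L * p_correct**2 - L * p_correct + p_correct - 0.5) / rt_var
    v = math.copysign(1, p_correct - 0.5) * s * x**0.25  # drift rate
    a = s**2 * L / v                                     # boundary separation
    y = -v * a / s**2
    mdt = (a / (2 * v)) * (1 - math.exp(y)) / (1 + math.exp(y))  # mean decision time
    ter = rt_mean - mdt                                  # non-decision time
    return v, a, ter

# Hypothetical condition: 80% accuracy, RT variance 0.112 s^2, mean RT 0.723 s
v, a, ter = ez_diffusion(0.80, 0.112, 0.723)
```

A condition with faster evidence accumulation (e.g. happy words in the study above) would show a higher recovered drift rate v at comparable accuracy and RT variance.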

  14. Emotion-processing deficit in alexithymia.

    Science.gov (United States)

    Roedema, T M; Simons, R F

    1999-05-01

    College undergraduates were identified as alexithymic or control, based on their scores on the Toronto Alexithymia Scale (TAS; Taylor, Ryan, & Bagby, 1985). All subjects were presented standardized emotion-eliciting color slides for 6 s while facial muscle, heart rate, and skin conductance activity were recorded. Stimuli were presented a second time while subjects were asked to provide emotion self-reports using a paper-and-pencil version of the Self-Assessment Manikin (SAM; Lang, 1980) and to generate a list of words describing their emotional reaction to each slide. Consistent with the definition of alexithymia as a syndrome characterized, in part, by a deficit in the identification of emotion states, high TAS subjects supplied fewer emotion-related words than did controls to describe their response to the slides. Alexithymics also indicated less variation along the arousal dimension of the SAM, produced fewer specific skin conductance responses and showed less heart rate deceleration to the slides, regardless of category. No valence-related differences between alexithymic and control subjects were noted.

  15. Task demands modulate decision and eye movement responses in the chimeric face test: examining the right hemisphere processing account

    Directory of Open Access Journals (Sweden)

Jason Coronel

    2014-03-01

Full Text Available A large and growing body of work, conducted in both brain-intact and brain-damaged populations, has used the free viewing chimeric face test as a measure of hemispheric dominance for the extraction of emotional information from faces. These studies generally show that normal right-handed individuals tend to perceive chimeric faces as more emotional if the emotional expression is presented on the half of the face to the viewer's left (left hemiface). However, the mechanisms underlying this lateralized bias remain unclear. Here, we examine the extent to which this bias is driven by right hemisphere processing advantages versus default scanning biases in a novel way: by changing task demands. In particular, we compare the original task with one in which right-hemisphere-biased processing cannot provide a decision advantage. Our behavioral and eye-movement data are inconsistent with the predictions of a default scanning bias account and support the idea that the left hemiface bias found in the chimeric face test is largely due to strategic use of right hemisphere processing mechanisms.

  16. A specialized face-processing model inspired by the organization of monkey face patches explains several face-specific phenomena observed in humans.

    Science.gov (United States)

    Farzmahdi, Amirhossein; Rajaei, Karim; Ghodrati, Masoud; Ebrahimpour, Reza; Khaligh-Razavi, Seyed-Mahdi

    2016-04-26

Converging reports indicate that face images are processed through specialized neural networks in the brain, i.e. face patches in monkeys and the fusiform face area (FFA) in humans. These studies were designed to find out how faces are processed in the visual system compared with other objects, yet the underlying mechanism of face processing is not fully understood. Here, we show that a hierarchical computational model, inspired by electrophysiological evidence on face processing in primates, is able to generate representational properties similar to those observed in monkey face patches (posterior, middle and anterior patches). Since the most important goal of sensory neuroscience is linking neural responses with behavioral outputs, we test whether the proposed model, which is designed to account for neural responses in monkey face patches, can also predict well-documented behavioral face phenomena observed in humans. We show that the proposed model accounts for several cognitive face effects, such as the composite face effect and canonical face views. Our model provides insights into the underlying computations that transfer visual information from posterior to anterior face patches.

  17. Priming Facial Gender and Emotional Valence: The Influence of Spatial Frequency on Face Perception in ASD

    Science.gov (United States)

    Vanmarcke, Steven; Wagemans, Johan

    2017-01-01

Adolescents with and without autism spectrum disorder (ASD) performed two priming experiments in which they implicitly processed a prime stimulus, containing high and/or low spatial frequency information, and then explicitly categorized a target face either as male/female (gender task) or as positive/negative (valence task). Adolescents with ASD…

  18. Characterization and recognition of mixed emotional expressions in thermal face image

    Science.gov (United States)

    Saha, Priya; Bhattacharjee, Debotosh; De, Barin K.; Nasipuri, Mita

    2016-05-01

Facial expressions in infrared imaging have been introduced to solve the problem of illumination, which is an inherent limitation of visible-spectrum imagery. The paper investigates facial skin temperature distribution over mixed thermal facial expressions in our face database, in which six are basic expressions and the remaining 12 are mixtures of those basic expressions. Temperature analysis has been performed on three facial regions of interest (ROIs): periorbital, supraorbital and mouth. Temperature variability of the ROIs across expressions has been measured using statistical parameters. The temperature variation measurements in the ROIs of a particular expression form a vector, which is later used in recognition of mixed facial expressions. Investigations show that facial features in mixed facial expressions can be characterized as positive-emotion-induced and negative-emotion-induced facial features. The supraorbital region is useful for differentiating basic expressions from mixed expressions. Analysis and interpretation of mixed expressions have been conducted with the help of box-and-whisker plots. A facial region containing a mixture of two expressions generally induces lower temperatures than the corresponding region in a basic expression.
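The abstract above describes summarizing per-ROI temperature variability with statistical parameters and concatenating the measurements into a vector for expression recognition. A minimal sketch of that feature-construction step follows; the ROI names come from the abstract, but the particular statistics chosen here (mean, standard deviation, range) are an assumption for illustration.

```python
import statistics

# Illustrative sketch: reduce temperature samples from each facial ROI
# (periorbital, supraorbital, mouth) to a fixed-length feature vector
# of per-ROI statistics, suitable as input to an expression classifier.
def roi_feature_vector(roi_temps):
    """roi_temps: dict mapping ROI name -> list of pixel temperatures (deg C)."""
    vector = []
    for roi in ("periorbital", "supraorbital", "mouth"):
        temps = roi_temps[roi]
        vector += [statistics.mean(temps),          # central tendency
                   statistics.stdev(temps),         # variability
                   max(temps) - min(temps)]         # range
    return vector

# Hypothetical temperature samples for one expression frame
sample = {
    "periorbital":  [34.1, 34.4, 34.3, 34.6],
    "supraorbital": [33.8, 33.9, 34.0, 33.7],
    "mouth":        [32.5, 33.1, 32.9, 32.8],
}
features = roi_feature_vector(sample)  # 3 ROIs x 3 statistics = 9 features
print(len(features))  # 9
```

One such vector per expression supports the comparison the abstract draws between basic and mixed expressions, e.g. lower mean ROI temperatures for mixed expressions.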

  19. An emotional processing writing intervention and heart rate variability: the role of emotional approach.

    Science.gov (United States)

    Seeley, Saren H; Yanez, Betina; Stanton, Annette L; Hoyt, Michael A

    2017-08-01

Expressing and understanding one's own emotional responses to negative events, particularly those that challenge the attainment of important life goals, is thought to confer physiological benefit. Individual preferences and/or abilities in approaching emotions might condition the efficacy of interventions designed to encourage written emotional processing (EP). This study examines the physiological impact (as indexed by heart rate variability (HRV)) of an emotional processing writing (EPW) task, as well as the moderating influence of a dispositional preference for coping through emotional approach (emotional processing (EP) and emotional expression (EE)), in response to a laboratory stress task designed to challenge an important life goal. Participants (n = 98) were randomly assigned to either EPW or fact control writing (FCW) following the stress task. Regression analyses revealed a significant dispositional EP by condition interaction, such that high EP participants in the EPW condition demonstrated higher HRV after writing compared to low EP participants. No significant main effects of condition or EE coping were observed. These findings suggest that EPW interventions may be best suited to those with a preference or ability to process emotions related to a stressor, or might require adaptation for those who less often cope through emotional approach.
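The study above indexes physiological state with heart rate variability. A widely used time-domain HRV index is RMSSD, the root mean square of successive differences between RR intervals; the sketch below computes it under the assumption that RMSSD is the metric of interest (the abstract does not specify which HRV index was used).

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD), a standard
    time-domain HRV index, from a series of RR intervals in milliseconds.
    Higher RMSSD reflects greater parasympathetic (vagal) influence."""
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical consecutive RR intervals (ms) from an ECG recording
print(round(rmssd([800, 810, 790, 805]), 2))  # 15.55
```

Comparing such an index before and after writing, across conditions, mirrors the regression design the abstract describes.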

  20. Does a single session of electroconvulsive therapy alter the neural response to emotional faces in depression? A randomised sham-controlled functional magnetic resonance imaging study.

    Science.gov (United States)

    Miskowiak, Kamilla W; Kessing, Lars V; Ott, Caroline V; Macoveanu, Julian; Harmer, Catherine J; Jørgensen, Anders; Revsbech, Rasmus; Jensen, Hans M; Paulson, Olaf B; Siebner, Hartwig R; Jørgensen, Martin B

    2017-09-01

Negative neurocognitive bias is a core feature of major depressive disorder that is reversed by pharmacological and psychological treatments. This double-blind functional magnetic resonance imaging study investigated for the first time whether electroconvulsive therapy modulates negative neurocognitive bias in major depressive disorder. Patients with major depressive disorder were randomised to a single active (n = 15) or sham (n = 12) electroconvulsive therapy session. The following day they underwent whole-brain functional magnetic resonance imaging at 3T while viewing emotional faces and performed facial expression recognition and dot-probe tasks. A single electroconvulsive therapy session had no effect on amygdala response to emotional faces. Whole-brain analysis revealed no effects of electroconvulsive therapy versus sham therapy after family-wise error correction at the cluster level, using a cluster-forming threshold of Z > 3.1. At a more lenient cluster-forming threshold (Z > 2.3, uncorrected), there were electroconvulsive therapy-induced changes in parahippocampal and superior frontal responses to fearful versus happy faces, as well as in fear-specific functional connectivity between the amygdala and occipito-temporal regions. Across all patients, greater fear-specific amygdala-occipital coupling correlated with lower fear vigilance. Although no statistically significant shift in neural response to faces emerged after a single electroconvulsive therapy session, the observed trend-level changes point to an early shift in emotional processing that may contribute to the antidepressant effects of electroconvulsive therapy.

  1. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    Directory of Open Access Journals (Sweden)

    Letizia Palumbo

    Full Text Available Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we exa