WorldWideScience

Sample records for face-emotion labeling deficits

  1. Your emotion or mine: Labeling feelings alters emotional face perception - An ERP study on automatic and intentional affect labeling

    Directory of Open Access Journals (Sweden)

    Cornelia Herbert

    2013-07-01

    Full Text Available Empirical evidence suggests that words are powerful regulators of emotion processing. Although a number of studies have used words as contextual cues for emotion processing, the role of what is being labeled by the words (i.e. one's own emotion as compared to the emotion expressed by the sender) is poorly understood. The present study reports results from two experiments which used ERP methodology to evaluate the impact of emotional faces and self- versus sender-related emotional pronoun-noun pairs (e.g. my fear vs. his fear) as cues for emotional face processing. The influence of self- and sender-related cues on the processing of fearful, angry and happy faces was investigated in two contexts: an automatic (experiment 1) and an intentional affect labeling task (experiment 2), along with control conditions of passive face processing. ERP patterns varied as a function of the label's reference (self vs. sender) and the intentionality of the labeling task (experiment 1 vs. experiment 2). In experiment 1, self-related labels increased the motivational relevance of the emotional faces in the time-window of the EPN component. Processing of sender-related labels improved emotion recognition specifically for fearful faces in the N170 time-window. Spontaneous processing of affective labels modulated later stages of face processing as well. Amplitudes of the late positive potential (LPP) were reduced for fearful, happy, and angry faces relative to the control condition of passive viewing. During intentional regulation (experiment 2), amplitudes of the LPP were enhanced for emotional faces when subjects used the self-related emotion labels to label their own emotion during face processing, and they rated the faces as higher in arousal than the emotional faces that had been presented in the 'label the sender's emotion' condition or the passive viewing condition. The present results argue in favor of a differentiated view of language-as-context for emotion processing.

  2. Developmental differences in the neural mechanisms of facial emotion labeling

    Science.gov (United States)

    Adleman, Nancy E.; Kim, Pilyoung; Oakes, Allison H.; Hsu, Derek; Reynolds, Richard C.; Chen, Gang; Pine, Daniel S.; Brotman, Melissa A.; Leibenluft, Ellen

    2016-01-01

    Adolescence is a time of increased risk for the onset of psychological disorders associated with deficits in face emotion labeling. We used functional magnetic resonance imaging (fMRI) to examine age-related differences in brain activation while adolescents and adults labeled the emotion on fearful, happy and angry faces of varying intensities [0% (i.e. neutral), 50%, 75%, 100%]. Adolescents and adults did not differ on accuracy to label emotions. In the superior temporal sulcus, ventrolateral prefrontal cortex and middle temporal gyrus, adults show an inverted-U-shaped response to increasing intensities of fearful faces and a U-shaped response to increasing intensities of happy faces, whereas adolescents show the opposite patterns. In addition, adults, but not adolescents, show greater inferior occipital gyrus activation to negative (angry, fearful) vs positive (happy) emotions. In sum, when subjects classify subtly varying facial emotions, developmental differences manifest in several ‘ventral stream’ brain regions. Charting the typical developmental course of the brain mechanisms of socioemotional processes, such as facial emotion labeling, is an important focus for developmental psychopathology research. PMID:26245836

  3. Developmental differences in the neural mechanisms of facial emotion labeling.

    Science.gov (United States)

    Wiggins, Jillian Lee; Adleman, Nancy E; Kim, Pilyoung; Oakes, Allison H; Hsu, Derek; Reynolds, Richard C; Chen, Gang; Pine, Daniel S; Brotman, Melissa A; Leibenluft, Ellen

    2016-01-01

    Adolescence is a time of increased risk for the onset of psychological disorders associated with deficits in face emotion labeling. We used functional magnetic resonance imaging (fMRI) to examine age-related differences in brain activation while adolescents and adults labeled the emotion on fearful, happy and angry faces of varying intensities [0% (i.e. neutral), 50%, 75%, 100%]. Adolescents and adults did not differ on accuracy to label emotions. In the superior temporal sulcus, ventrolateral prefrontal cortex and middle temporal gyrus, adults show an inverted-U-shaped response to increasing intensities of fearful faces and a U-shaped response to increasing intensities of happy faces, whereas adolescents show the opposite patterns. In addition, adults, but not adolescents, show greater inferior occipital gyrus activation to negative (angry, fearful) vs positive (happy) emotions. In sum, when subjects classify subtly varying facial emotions, developmental differences manifest in several 'ventral stream' brain regions. Charting the typical developmental course of the brain mechanisms of socioemotional processes, such as facial emotion labeling, is an important focus for developmental psychopathology research. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.
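    The graded intensities used in this study (0%, 50%, 75%, 100%) are the kind of stimuli typically produced by blending a neutral and a full-intensity emotional photograph of the same actor. The sketch below illustrates the idea as a simple pixel-wise cross-fade; the file names and the linear-blend shortcut are illustrative assumptions, since published stimulus sets are usually generated with dedicated morphing software that also warps facial geometry.

    ```python
    # Minimal sketch: create an emotional face at a given intensity by linearly
    # blending a neutral and a full-intensity exemplar of the same actor.
    # File names are placeholders, not stimuli from the study.
    import numpy as np
    from PIL import Image

    def morph(neutral_path: str, emotional_path: str, intensity: float) -> Image.Image:
        """intensity = 0.0 returns the neutral face, 1.0 the full emotional face."""
        neutral = np.asarray(Image.open(neutral_path).convert("L"), dtype=float)
        emotional = np.asarray(Image.open(emotional_path).convert("L"), dtype=float)
        blended = (1.0 - intensity) * neutral + intensity * emotional
        return Image.fromarray(blended.astype(np.uint8))

    # The four intensity levels reported above.
    for level in (0.0, 0.5, 0.75, 1.0):
        face = morph("actor01_neutral.png", "actor01_fear.png", level)
        face.save(f"actor01_fear_{int(level * 100):03d}.png")
    ```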

  4. Specificity of Facial Expression Labeling Deficits in Childhood Psychopathology

    Science.gov (United States)

    Guyer, Amanda E.; McClure, Erin B.; Adler, Abby D.; Brotman, Melissa A.; Rich, Brendan A.; Kimes, Alane S.; Pine, Daniel S.; Ernst, Monique; Leibenluft, Ellen

    2007-01-01

    Background: We examined whether face-emotion labeling deficits are illness-specific or an epiphenomenon of generalized impairment in pediatric psychiatric disorders involving mood and behavioral dysregulation. Method: Two hundred fifty-two youths (7-18 years old) completed child and adult facial expression recognition subtests from the Diagnostic…

  5. Initial Orientation of Attention towards Emotional Faces in Children with Attention Deficit Hyperactivity Disorder

    Directory of Open Access Journals (Sweden)

    Mehrnoosh Ahmadi

    2011-09-01

    Full Text Available Objective: Early recognition of negative emotions is considered to be of vital importance. It seems that children with attention deficit hyperactivity disorder have some difficulties recognizing facial emotional expressions, especially negative ones. This study investigated the preference of children with attention deficit hyperactivity disorder for negative (angry, sad) facial expressions compared to normal children. Method: Participants were 35 drug-naive boys with ADHD, aged between 6 and 11 years, and 31 matched healthy children. Visual orientation data were recorded while participants viewed face pairs (negative-neutral pairs) shown for 3000 ms. The number of first fixations made to each expression was considered as an index of initial orientation. Results: Group comparisons revealed no difference between the attention deficit hyperactivity disorder group and their matched healthy counterparts in initial orientation of attention. A tendency towards negative emotions was found within the normal group, while no difference was observed between initial allocation of attention toward negative and neutral expressions in children with ADHD. Conclusion: Children with attention deficit hyperactivity disorder do not have a significant preference for negative facial expressions. In contrast, normal children have a significant preference for negative facial emotions over neutral faces.
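    The "initial orientation" index described above reduces to the proportion of trials on which the very first fixation lands on the negative rather than the neutral face. A minimal sketch of that computation and the two group comparisons, with purely illustrative numbers rather than the study's data:

    ```python
    # Sketch: initial-orientation index = proportion of first fixations on the
    # negative face of each negative-neutral pair (0.5 = no preference).
    import numpy as np
    from scipy import stats

    def initial_orientation_index(first_fixation_on_negative):
        """first_fixation_on_negative: one boolean per trial for one participant."""
        return np.mean(first_fixation_on_negative)

    # Illustrative per-participant proportions (not real data).
    adhd = np.array([0.48, 0.52, 0.50, 0.47, 0.55, 0.51])
    controls = np.array([0.60, 0.58, 0.63, 0.57, 0.61, 0.59])

    print(stats.ttest_1samp(controls, 0.5))  # preference for negative faces within a group?
    print(stats.ttest_ind(adhd, controls))   # ADHD vs. control comparison
    ```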

  6. Recognition of Face and Emotional Facial Expressions in Autism

    Directory of Open Access Journals (Sweden)

    Muhammed Tayyib Kadak

    2013-03-01

    Full Text Available Autism is a genetically transmitted neurodevelopmental disorder characterized by severe and permanent deficits in many areas of interpersonal functioning, such as communication, social interaction and emotional responsiveness. Patients with autism show deficits in face recognition, eye contact and the recognition of emotional expressions. Both face recognition and the recognition of facial emotion depend on face processing. Structural and functional impairment of the fusiform gyrus, amygdala, superior temporal sulcus and other brain regions leads to deficits in the recognition of faces and facial emotion. Accordingly, studies suggest that face processing deficits underlie the problems with social interaction and emotion observed in autism. Studies have revealed that children with autism have problems recognizing facial expressions and rely on the mouth region more than the eye region. It has also been shown that autistic patients interpret ambiguous expressions as negative emotions. To date, deficits have been identified at various stages of face processing in autism, including gaze detection, face identity processing and the recognition of emotional expressions. Social interaction impairments in autism spectrum disorders originate from face processing deficits during infancy, childhood and adolescence. Face recognition and the recognition of facial emotion may be shaped both by automatic orienting towards faces after birth and by "learning" processes across development, such as identity and emotion processing. This article reviews the neurobiological basis of face processing and the recognition of emotional facial expressions in normal development and in autism.

  7. Cortical deficits of emotional face processing in adults with ADHD: its relation to social cognition and executive function.

    Science.gov (United States)

    Ibáñez, Agustin; Petroni, Agustin; Urquina, Hugo; Torrente, Fernando; Torralva, Teresa; Hurtado, Esteban; Guex, Raphael; Blenkmann, Alejandro; Beltrachini, Leandro; Muravchik, Carlos; Baez, Sandra; Cetkovich, Marcelo; Sigman, Mariano; Lischinsky, Alicia; Manes, Facundo

    2011-01-01

    Although it has been shown that adults with attention-deficit hyperactivity disorder (ADHD) have impaired social cognition, no previous study has reported the brain correlates of face valence processing. This study looked for behavioral, neuropsychological, and electrophysiological markers of emotion processing for faces (N170) in adult ADHD compared to controls matched by age, gender, educational level, and handedness. We designed an event-related potential (ERP) study based on a dual valence task (DVT), in which faces and words were presented to test the effects of stimulus type (faces, words, or face-word stimuli) and valence (positive versus negative). Individual signatures of cognitive functioning in participants with ADHD and controls were assessed with a comprehensive neuropsychological evaluation, including executive functioning (EF) and theory of mind (ToM). Compared to controls, the adult ADHD group showed deficits in N170 emotion modulation for facial stimuli. These N170 impairments were observed in the absence of any deficit in facial structural processing, suggesting a specific ADHD impairment in early facial emotion modulation. Cortical current density mapping of the N170 yielded a main neural source in the posterior section of the fusiform gyrus (maximal in the left hemisphere for words and in the right hemisphere for faces and simultaneous stimuli). Neural generators of the N170 (fusiform gyrus) were reduced in ADHD. In these patients, N170 emotion processing was associated with performance on an emotional inference ToM task, and the N170 from simultaneous stimuli was associated with EF, especially working memory. This is the first report to reveal an adult ADHD-specific impairment in the cortical modulation of emotion for faces and an association between N170 cortical measures and ToM and EF.

  8. Emotional Face Identification in Youths with Primary Bipolar Disorder or Primary Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Seymour, Karen E.; Pescosolido, Matthew F.; Reidy, Brooke L.; Galvan, Thania; Kim, Kerri L.; Young, Matthew; Dickstein, Daniel P.

    2013-01-01

    Objective: Bipolar disorder (BD) and attention-deficit/hyperactivity disorder (ADHD) are often comorbid or confounded; therefore, we evaluated emotional face identification to better understand brain/behavior interactions in children and adolescents with either primary BD, primary ADHD, or typically developing controls (TDC). Method: Participants…

  9. Face processing in chronic alcoholism: a specific deficit for emotional features.

    Science.gov (United States)

    Maurage, P; Campanella, S; Philippot, P; Martin, S; de Timary, P

    2008-04-01

    It is well established that chronic alcoholism is associated with a deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific to emotions or due to a more general impairment in visual or facial processing. This study was designed to clarify this issue using multiple control tasks and the subtraction method. Eighteen patients suffering from chronic alcoholism and 18 matched healthy control subjects were asked to perform several tasks evaluating (1) basic visuo-spatial and facial identity processing; (2) simple reaction times; and (3) complex facial feature identification (namely age, emotion, gender, and race). Accuracy and reaction times were recorded. Alcoholic patients showed preserved performance for visuo-spatial and facial identity processing, but impaired performance for visuo-motor abilities and for the detection of complex facial aspects. More importantly, the subtraction method showed that alcoholism is associated with a specific EFE decoding deficit, still present when visuo-motor slowing is controlled for. These results offer a post hoc confirmation of earlier data showing an EFE decoding deficit in alcoholism by strongly suggesting a specificity of this deficit for emotions. This may have implications for clinical situations, where emotional impairments are frequently observed among alcoholic subjects.

  10. Emotional face recognition deficit in amnestic patients with mild cognitive impairment: behavioral and electrophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yang L

    2015-08-01

    Full Text Available Linlin Yang, Xiaochuan Zhao, Lan Wang, Lulu Yu, Mei Song, Xueyi Wang Department of Mental Health, The First Hospital of Hebei Medical University, Hebei Medical University Institute of Mental Health, Shijiazhuang, People's Republic of China Abstract: Amnestic mild cognitive impairment (MCI) has been conceptualized as a transitional stage between healthy aging and Alzheimer's disease. Thus, understanding emotional face recognition deficit in patients with amnestic MCI could be useful in determining progression of amnestic MCI. The purpose of this study was to investigate the features of emotional face processing in amnestic MCI by using event-related potentials (ERPs). Patients with amnestic MCI and healthy controls performed a face recognition task, giving old/new responses to previously studied and novel faces with different emotional messages as the stimulus material. Using the learning-recognition paradigm, the experiments were divided into two steps, ie, a learning phase and a test phase. ERPs were analyzed on electroencephalographic recordings. The behavior data indicated high emotion classification accuracy for patients with amnestic MCI and for healthy controls. The mean percentage of correct classifications was 81.19% for patients with amnestic MCI and 96.46% for controls. Our ERP data suggest that patients with amnestic MCI were still able to undertake personalizing processing for negative faces, but not for neutral or positive faces, in the early frontal processing stage. In the early time window, no differences in frontal old/new effect were found between patients with amnestic MCI and normal controls. However, in the late time window, the three types of stimuli did not elicit any old/new parietal effects in patients with amnestic MCI, suggesting their recollection was impaired. This impairment may be closely associated with amnestic MCI disease. We conclude from our data that face recognition processing and emotional memory is
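    For readers unfamiliar with the old/new ERP effects reported above: they are computed as the difference between the average waveform for correctly recognized "old" (studied) faces and the average waveform for correctly rejected "new" faces, taken in an early frontal or late parietal time window. A minimal sketch, assuming a NumPy epochs array and a 500 Hz sampling rate (both assumptions, not details taken from the paper):

    ```python
    # Sketch: mean old/new difference amplitude per channel in a late window.
    import numpy as np

    def old_new_effect(epochs, is_old_hit, is_new_cr, sfreq=500, window=(0.5, 0.8)):
        """epochs: (n_trials, n_channels, n_samples), time zero = face onset.
        is_old_hit / is_new_cr: boolean trial masks for hits and correct rejections."""
        old_avg = epochs[is_old_hit].mean(axis=0)   # ERP to recognized old faces
        new_avg = epochs[is_new_cr].mean(axis=0)    # ERP to correctly rejected new faces
        diff = old_avg - new_avg                    # old/new difference wave
        start, stop = (int(t * sfreq) for t in window)
        return diff[:, start:stop].mean(axis=1)     # one value per channel
    ```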

  11. One Size Does Not Fit All: Face Emotion Processing Impairments in Semantic Dementia, Behavioural-Variant Frontotemporal Dementia and Alzheimer's Disease Are Mediated by Distinct Cognitive Deficits

    OpenAIRE

    Miller, Laurie A.; Hsieh, Sharpley; Lah, Suncica; Savage, Sharon; Hodges, John R.; Piguet, Olivier

    2011-01-01

    Patients with frontotemporal dementia (both behavioural variant [bvFTD] and semantic dementia [SD]) as well as those with Alzheimer's disease (AD) show deficits on tests of face emotion processing, yet the mechanisms underlying these deficits have rarely been explored. We compared groups of patients with bvFTD (n = 17), SD (n = 12) or AD (n = 20) to an age- and education-matched group of healthy control subjects (n = 36) on three face emotion processing tasks (Ekman 60, Emotion Matching and E...

  12. Alcoholism and dampened temporal limbic activation to emotional faces.

    Science.gov (United States)

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O'Reilly, Cara E; Howard, Julie A; Sawyer, Kayle; Harris, Gordon J

    2009-11-01

    Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be a part of the wide array of behavioral problems including disinhibition, concurring with previously documented interpersonal difficulties in this population. Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally or socially challenging situations.

  13. Neural basis of emotion recognition deficits in first-episode major depression

    NARCIS (Netherlands)

    van Wingen, G. A.; van Eijndhoven, P.; Tendolkar, I.; Buitelaar, J.; Verkes, R. J.; Fernández, G.

    2011-01-01

    Depressed individuals demonstrate a poorer ability to recognize the emotions of others, which could contribute to difficulties in interpersonal behaviour. This emotion recognition deficit appears related to the depressive state and is particularly pronounced when emotions are labelled semantically.

  14. Neural basis of emotion recognition deficits in first-episode major depression

    NARCIS (Netherlands)

    Wingen, G.A. van; Eijndhoven, P.F.P. van; Tendolkar, I.; Buitelaar, J.K.; Verkes, R.J.; Fernandez, G.S.E.

    2011-01-01

    BACKGROUND: Depressed individuals demonstrate a poorer ability to recognize the emotions of others, which could contribute to difficulties in interpersonal behaviour. This emotion recognition deficit appears related to the depressive state and is particularly pronounced when emotions are labelled semantically.

  15. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    Directory of Open Access Journals (Sweden)

    Martin Wegrzyn

    Full Text Available Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, at a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, making it possible to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.

  16. Mapping the emotional face. How individual face parts contribute to successful emotion recognition

    Science.gov (United States)

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, at a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, making it possible to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921
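    The per-tile "contribution to successful recognition" described above can be approximated by asking, for each of the 48 mask tiles, how often trials on which that tile had been uncovered ended with a correct emotion label. The sketch below assumes a trials-by-tiles boolean layout and an 8 x 6 tile grid; both are illustrative assumptions, not the authors' analysis code.

    ```python
    # Sketch: diagnostic-value map over the 48 face tiles.
    import numpy as np

    def tile_diagnostic_map(revealed, correct, n_tiles=48, grid=(8, 6)):
        """revealed: (n_trials, n_tiles) booleans, True if the tile was uncovered
        before the response; correct: (n_trials,) booleans for correct labels."""
        revealed = np.asarray(revealed, dtype=bool)
        correct = np.asarray(correct, dtype=bool)
        contribution = np.full(n_tiles, np.nan)
        for t in range(n_tiles):
            shown = revealed[:, t]
            if shown.any():
                # proportion of correct responses among trials where this tile was visible
                contribution[t] = correct[shown].mean()
        return contribution.reshape(grid)  # assumed 8 x 6 layout over the face
    ```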

  17. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers.

    Science.gov (United States)

    Thomas, Laura A; Brotman, Melissa A; Bones, Brian L; Chen, Gang; Rosen, Brooke H; Pine, Daniel S; Leibenluft, Ellen

    2014-04-01

    Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N=20), SMD (N=18), and HV (N=22) during "Aware" and "Non-aware" priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms), and shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers

    Directory of Open Access Journals (Sweden)

    Laura A. Thomas

    2014-04-01

    Full Text Available Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N = 20), SMD (N = 18), and HV (N = 22) during "Aware" and "Non-aware" priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms), and shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders.

  19. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study.

    Science.gov (United States)

    Zhishuai, Jin; Hong, Liu; Daxing, Wu; Pin, Zhang; Xuejing, Lu

    2017-01-01

    Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  20. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study

    Directory of Open Access Journals (Sweden)

    Jin Zhishuai

    2017-01-01

    Full Text Available Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  1. Scanning patterns of faces do not explain impaired emotion recognition in Huntington Disease: Evidence for a high level mechanism

    Directory of Open Access Journals (Sweden)

    Marieke van Asselen

    2012-02-01

    Full Text Available Previous studies in patients with amygdala lesions suggested that deficits in emotion recognition might be mediated by impaired scanning patterns of faces. Here we investigated whether scanning patterns also contribute to the selective impairment in recognition of disgust in Huntington disease (HD). To achieve this goal, we recorded eye movements during a two-alternative forced choice emotion recognition task. HD patients in presymptomatic (n=16) and symptomatic (n=9) disease stages were tested and their performance was compared to a control group (n=22). In our emotion recognition task, participants had to indicate whether a face reflected one of six basic emotions. In addition, and in order to define whether emotion recognition was altered when the participants were forced to look at a specific component of the face, we used a second task where only limited facial information was provided (eyes/mouth) in partially masked faces. Behavioural results showed no differences in the ability to recognize emotions between presymptomatic gene carriers and controls. However, an emotion recognition deficit was found for all six basic emotion categories in early-stage HD. Analysis of eye movement patterns showed that patients and controls used similar scanning strategies. Patterns of deficits were similar regardless of whether parts of the faces were masked or not, thereby confirming that selective attention to particular face parts does not underlie the deficits. These results suggest that the emotion recognition deficits in symptomatic HD patients cannot be explained by impaired scanning patterns of faces. Furthermore, no selective deficit for recognition of disgust was found in presymptomatic HD patients.

  2. More Pronounced Deficits in Facial Emotion Recognition for Schizophrenia than Bipolar Disorder

    Science.gov (United States)

    Goghari, Vina M; Sponheim, Scott R

    2012-01-01

    Schizophrenia and bipolar disorder are typically separated in diagnostic systems. Behavioural, cognitive, and brain abnormalities associated with each disorder nonetheless overlap. We evaluated the diagnostic specificity of facial emotion recognition deficits in schizophrenia and bipolar disorder to determine whether select aspects of emotion recognition differed for the two disorders. The investigation used an experimental task that included the same facial images in an emotion recognition condition and an age recognition condition (to control for processes associated with general face recognition) in 27 schizophrenia patients, 16 bipolar I patients, and 30 controls. Schizophrenia and bipolar patients exhibited both shared and distinct aspects of facial emotion recognition deficits. Schizophrenia patients had deficits in recognizing angry facial expressions compared to healthy controls and bipolar patients. Compared to control participants, both schizophrenia and bipolar patients were more likely to mislabel facial expressions of anger as fear. Given that schizophrenia patients exhibited a deficit in emotion recognition for angry faces, which did not appear due to generalized perceptual and cognitive dysfunction, improving recognition of threat-related expression may be an important intervention target to improve social functioning in schizophrenia. PMID:23218816

  3. Social anhedonia is associated with neural abnormalities during face emotion processing.

    Science.gov (United States)

    Germine, Laura T; Garrido, Lucia; Bruce, Lori; Hooker, Christine

    2011-10-01

    Human beings are social organisms with an intrinsic desire to seek and participate in social interactions. Social anhedonia is a personality trait characterized by a reduced desire for social affiliation and reduced pleasure derived from interpersonal interactions. Abnormally high levels of social anhedonia prospectively predict the development of schizophrenia and contribute to poorer outcomes for schizophrenia patients. Despite the strong association between social anhedonia and schizophrenia, the neural mechanisms that underlie individual differences in social anhedonia have not been studied and are thus poorly understood. Deficits in face emotion recognition are related to poorer social outcomes in schizophrenia, and it has been suggested that face emotion recognition deficits may be a behavioral marker for schizophrenia liability. In the current study, we used functional magnetic resonance imaging (fMRI) to see whether there are differences in the brain networks underlying basic face emotion processing in a community sample of individuals low vs. high in social anhedonia. We isolated the neural mechanisms related to face emotion processing by comparing face emotion discrimination with four other baseline conditions (identity discrimination of emotional faces, identity discrimination of neutral faces, object discrimination, and pattern discrimination). Results showed a group (high/low social anhedonia) × condition (emotion discrimination/control condition) interaction in the anterior portion of the rostral medial prefrontal cortex, right superior temporal gyrus, and left somatosensory cortex. As predicted, high (relative to low) social anhedonia participants showed less neural activity in face emotion processing regions during emotion discrimination as compared to each control condition. The findings suggest that social anhedonia is associated with abnormalities in networks responsible for basic processes associated with social cognition, and provide a

  4. A facial expression of pax: Assessing children's "recognition" of emotion from faces.

    Science.gov (United States)

    Nelson, Nicole L; Russell, James A

    2016-01-01

    In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks--choice from array and free labeling--to support the conclusion that children recognize basic emotions from facial expressions. Here five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition; a novel nonsense facial expression was included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination; they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Are patients with schizophrenia impaired in processing non-emotional features of human faces?

    Directory of Open Access Journals (Sweden)

    Hayley Darke

    2013-08-01

    Full Text Available It is known that individuals with schizophrenia exhibit signs of impaired face processing; however, the exact perceptual and cognitive mechanisms underlying these deficits are yet to be elucidated. One possible source of confusion in the current literature is the methodological and conceptual inconsistencies that can arise from the varied treatment of different aspects of face processing relating to emotional and non-emotional aspects of face perception. This review aims to disentangle the literature by focusing on the performance of patients with schizophrenia in a range of tasks that required processing of non-emotional features of face stimuli (e.g. identity or gender). We also consider the performance of patients on non-face stimuli that share common elements such as familiarity (e.g. cars) and social relevance (e.g. gait). We conclude by exploring whether observed deficits are best considered as face-specific and note that further investigation is required to properly assess the potential contribution of more generalised attentional or perceptual impairments.

  6. Facial emotion recognition, face scan paths, and face perception in children with neurofibromatosis type 1.

    Science.gov (United States)

    Lewis, Amelia K; Porter, Melanie A; Williams, Tracey A; Bzishvili, Samantha; North, Kathryn N; Payne, Jonathan M

    2017-05-01

    This study aimed to investigate face scan paths and face perception abilities in children with Neurofibromatosis Type 1 (NF1) and how these might relate to emotion recognition abilities in this population. The authors investigated facial emotion recognition, face scan paths, and face perception in 29 children with NF1 compared to 29 chronological age-matched typically developing controls. Correlations between facial emotion recognition, face scan paths, and face perception in children with NF1 were examined. Children with NF1 displayed significantly poorer recognition of fearful expressions compared to controls, as well as a nonsignificant trend toward poorer recognition of anger. Although there was no significant difference between groups in time spent viewing individual core facial features (eyes, nose, mouth, and nonfeature regions), children with NF1 spent significantly less time than controls viewing the face as a whole. Children with NF1 also displayed significantly poorer face perception abilities than typically developing controls. Facial emotion recognition deficits were not significantly associated with aberrant face scan paths or face perception abilities in the NF1 group. These results suggest that impairments in the perception, identification, and interpretation of information from faces are important aspects of the social-cognitive phenotype of NF1. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    Science.gov (United States)

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

    Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. Modulation of the composite face effect by unintended emotion cues.

    Science.gov (United States)

    Gray, Katie L H; Murphy, Jennifer; Marsh, Jade E; Cook, Richard

    2017-04-01

    When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers 'fuse' the two halves together, perceptually. The illusory distortion induced by task-irrelevant ('distractor') halves hinders participants' judgements about task-relevant ('target') halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, observers frequently perceive emotion in ostensibly neutral faces, contrary to the intentions of experimenters. This study sought to determine whether this 'perceived emotion' influences the composite-face effect. In our first experiment, we confirmed that the composite effect grew stronger as the strength of distractor emotion increased. Critically, effects of distractor emotion were induced by weak emotion intensities, and were incidental insofar as emotion cues hindered image matching, not emotion labelling per se. In Experiment 2, we found a correlation between the presence of perceived emotion in a set of ostensibly neutral distractor regions sourced from commonly used face databases, and the strength of illusory distortion they induced. In Experiment 3, participants completed a sequential matching composite task in which half of the distractor regions were rated high and low for perceived emotion, respectively. Significantly stronger composite effects were induced by the high-emotion distractor halves. These convergent results suggest that perceived emotion increases the strength of the composite-face effect induced by supposedly emotionless faces. These findings have important implications for the study of holistic face processing in typical and atypical populations.

  9. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery.

    Science.gov (United States)

    Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T

    2012-12-01

    Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize

  10. Visual attention to emotional face in schizophrenia: an eye tracking study.

    Directory of Open Access Journals (Sweden)

    Mania Asgharpour

    2015-03-01

    Full Text Available Deficits in the processing of facial emotions have been reported extensively in patients with schizophrenia. To explore whether restricted attention is the cause of impaired emotion processing in these patients, we examined visual attention through tracking eye movements in response to emotional and neutral face stimuli in a group of patients with schizophrenia and healthy individuals. We also examined the correlation between visual attention allocation and symptom severity in our patient group. Thirty adult patients with schizophrenia and 30 matched healthy controls participated in this study. Visual attention data were recorded while participants passively viewed emotional-neutral face pairs for 500 ms. The relationship between visual attention and symptom severity was assessed with the Positive and Negative Syndrome Scale (PANSS) in the schizophrenia group. Repeated measures ANOVAs were used to compare the groups. Comparing the number of fixations made during face-pair presentation, we found that patients with schizophrenia made fewer fixations on faces, regardless of the expression of the face. Analysis of the number of fixations on negative-neutral pairs also revealed that the patients made fewer fixations on both neutral and negative faces. Analysis of the number of fixations on positive-neutral pairs only showed more fixations on positive relative to neutral expressions in both groups. We found no correlations between visual attention patterns to faces and symptom severity in patients with schizophrenia. The results of this study suggest that the facial recognition deficit in schizophrenia is related to decreased attention to face stimuli. The finding of no difference in visual attention for positive-neutral face pairs between the groups is in line with studies that have shown an increased ability to perceive positive emotion in these patients.

  11. Testing the effects of expression, intensity and age on emotional face processing in ASD.

    Science.gov (United States)

    Luyster, Rhiannon J; Bick, Johanna; Westerlund, Alissa; Nelson, Charles A

    2017-06-21

    Individuals with autism spectrum disorder (ASD) commonly show global deficits in the processing of facial emotion, including impairments in emotion recognition and slowed processing of emotional faces. Growing evidence has suggested that these challenges may increase with age, perhaps due to minimal improvement with age in individuals with ASD. In the present study, we explored the role of age, emotion type and emotion intensity in face processing for individuals with and without ASD. Twelve- and 18-22-year-old children with and without ASD participated. No significant diagnostic group differences were observed on behavioral measures of emotion processing for younger versus older individuals with and without ASD. However, there were significant group differences in neural responses to emotional faces. Relative to TD, at 12 years of age and during adulthood, individuals with ASD showed slower N170 to emotional faces. While the TD groups' P1 latency was significantly shorter in adults when compared to 12-year-olds, there was no significant age-related difference in P1 latency among individuals with ASD. Findings point to potential differences in the maturation of cortical networks that support visual processing (whether of faces or stimuli more broadly), among individuals with and without ASD between late childhood and adulthood. Finally, associations between ERP amplitudes and behavioral responses on emotion processing tasks suggest possible neural markers for emotional and behavioral deficits among individuals with ASD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. The contribution of emotional empathy to approachability judgements assigned to emotional faces is context specific

    Directory of Open Access Journals (Sweden)

    Megan L Willis

    2015-08-01

    Full Text Available Previous research on approachability judgements has indicated that facial expressions modulate how these judgements are made, but the relationship between emotional empathy and context in this decision-making process has not yet been examined. This study examined the contribution of emotional empathy to approachability judgements assigned to emotional faces in different contexts. One hundred and twenty female participants completed the Questionnaire Measure of Emotional Empathy. Participants provided approachability judgements to faces displaying angry, disgusted, fearful, happy, neutral and sad expressions in three different contexts: when evaluating whether they would approach another individual to (1) receive help or (2) give help, or (3) when no contextual information was provided. In addition, participants were also required to provide ratings of perceived threat and emotional intensity, and to label the facial expressions. Emotional empathy significantly predicted approachability ratings for specific emotions in each context, over and above the contribution of perceived threat and intensity, which were associated with emotional empathy. Higher emotional empathy predicted less willingness to approach people with angry and disgusted faces to receive help, and a greater willingness to approach people with happy faces to receive help. Higher emotional empathy also predicted a greater willingness to approach people with sad faces to offer help, and more willingness to approach people with happy faces when no contextual information was provided. These results highlight the important contribution of individual differences in emotional empathy in predicting how approachability judgements are assigned to facial expressions in context.
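    The claim that empathy predicted approachability "over and above" perceived threat and intensity corresponds to a hierarchical (incremental) regression: fit a model with the control predictors, add empathy, and test whether model fit improves. A minimal sketch using pandas and statsmodels; the file name and column names are hypothetical:

    ```python
    # Sketch: does empathy add predictive value beyond threat and intensity ratings?
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("approachability_ratings.csv")  # hypothetical per-face rating data

    step1 = smf.ols("approachability ~ threat + intensity", data=df).fit()
    step2 = smf.ols("approachability ~ threat + intensity + empathy", data=df).fit()

    # F-test for the improvement when empathy is added (returns F, p, df_diff).
    print(step2.compare_f_test(step1))
    ```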

  13. Are there differential deficits in facial emotion recognition between paranoid and non-paranoid schizophrenia? A signal detection analysis.

    Science.gov (United States)

    Huang, Charles Lung-Cheng; Hsiao, Sigmund; Hwu, Hai-Gwo; Howng, Shen-Long

    2013-10-30

    This study assessed facial emotion recognition abilities in subjects with paranoid and non-paranoid schizophrenia using signal detection theory. We explored the differential deficits in facial emotion recognition in 44 paranoid patients with schizophrenia (PS) and 30 non-paranoid patients with schizophrenia (NPS), compared to 80 healthy controls. We used morphed faces with different intensities of emotion and computed the sensitivity index (d') of each emotion. The results showed that performance differed between the schizophrenia and healthy control groups in the recognition of both negative and positive affects. The PS group performed worse than the healthy control group but better than the NPS group in overall performance. Performance differed between the NPS and healthy control groups in the recognition of all basic emotions and neutral faces; between the PS and healthy control groups in the recognition of angry faces; and between the PS and NPS groups in the recognition of happiness, anger, sadness, disgust, and neutral affects. The facial emotion recognition impairment in schizophrenia may reflect a generalized deficit rather than a negative-emotion-specific deficit. The PS group performed worse than the control group, but better than the NPS group in facial expression recognition, with differential deficits between PS and NPS patients. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
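    The sensitivity index d' from signal detection theory, used above, is the difference between the z-transformed hit rate and the z-transformed false-alarm rate. A short worked sketch with a standard correction for extreme rates; the trial counts are invented for illustration:

    ```python
    # Sketch: d' for recognizing one emotion (e.g. anger) in morphed faces.
    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        # add 0.5 / 1 so that rates of exactly 0 or 1 do not give infinite z-scores
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # 38 anger trials labeled "anger", 6 missed; 9 non-anger trials labeled "anger",
    # 27 correctly rejected (illustrative counts only).
    print(d_prime(38, 6, 9, 27))
    ```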

  14. Disrupted neural processing of emotional faces in psychopathy.

    Science.gov (United States)

    Contreras-Rodríguez, Oren; Pujol, Jesus; Batalla, Iolanda; Harrison, Ben J; Bosque, Javier; Ibern-Regàs, Immaculada; Hernández-Ribas, Rosa; Soriano-Mas, Carles; Deus, Joan; López-Solà, Marina; Pifarré, Josep; Menchón, José M; Cardoner, Narcís

    2014-04-01

    Psychopaths show a reduced ability to recognize emotional facial expressions, which may disturb the development of interpersonal relationships and successful social adaptation. Behavioral hypotheses point toward an association between emotion recognition deficits in psychopathy and amygdala dysfunction. Our prediction was that amygdala dysfunction would combine deficient activation with disturbances in functional connectivity with cortical regions of the face-processing network. Twenty-two psychopaths and 22 control subjects were assessed, and functional magnetic resonance maps were generated to identify both brain activation and task-induced functional connectivity, using psychophysiological interaction analysis, during an emotional face-matching task. Results showed significant amygdala activation in control subjects only, but differences between study groups did not reach statistical significance. In contrast, psychopaths showed significantly increased activation in visual and prefrontal areas, with this latter activation being associated with psychopaths' affective-interpersonal disturbances. Psychophysiological interaction analyses revealed a reciprocal reduction in functional connectivity between the left amygdala and visual and prefrontal cortices. Our results suggest that emotional stimulation may evoke a relevant cortical response in psychopaths, but that a disruption exists in the processing of emotional faces involving the reciprocal functional interaction between the amygdala and neocortex, consistent with the notion of a failure to integrate emotion into cognition in psychopathic individuals.
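    Psychophysiological interaction (PPI) analysis, as used above, asks whether coupling between a seed region (here the amygdala) and the rest of the brain changes with the psychological context (emotional faces vs. control). The core of the design is the product of the seed timecourse and the task regressor, entered alongside both main effects. The sketch below is a simplified, generic illustration (full PPI pipelines first deconvolve the hemodynamic response), not the authors' pipeline:

    ```python
    # Sketch: build a simplified PPI design matrix.
    import numpy as np

    def ppi_design(seed_ts, task_regressor):
        """seed_ts: seed-region timecourse, shape (n_volumes,);
        task_regressor: emotional-face vs. control coding, shape (n_volumes,)."""
        seed = seed_ts - seed_ts.mean()
        task = task_regressor - task_regressor.mean()
        ppi = seed * task                      # interaction term of interest
        intercept = np.ones_like(seed)
        return np.column_stack([intercept, seed, task, ppi])

    # Regressing each voxel's timecourse on this design, the weight on the last
    # column indexes task-dependent connectivity with the seed region.
    ```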

  15. Deficits in Degraded Facial Affect Labeling in Schizophrenia and Borderline Personality Disorder.

    Science.gov (United States)

    van Dijke, Annemiek; van 't Wout, Mascha; Ford, Julian D; Aleman, André

    2016-01-01

    Although deficits in facial affect processing have been reported in schizophrenia as well as in borderline personality disorder (BPD), these disorders have not yet been directly compared on facial affect labeling. Using degraded stimuli portraying neutral, angry, fearful and happy facial expressions, we hypothesized more errors in labeling negative facial expressions in patients with schizophrenia compared to healthy controls. Patients with BPD were expected to have difficulty in labeling neutral expressions and to display a bias towards a negative attribution when wrongly labeling neutral faces. Patients with schizophrenia (N = 57) and patients with BPD (N = 30) were compared to patients with somatoform disorder (SoD, a psychiatric control group; N = 25) and healthy control participants (N = 41) on facial affect labeling accuracy and type of misattributions. Patients with schizophrenia showed deficits in labeling angry and fearful expressions compared to the healthy control group and patients with BPD showed deficits in labeling neutral expressions compared to the healthy control group. Schizophrenia and BPD patients did not differ significantly from each other when labeling any of the facial expressions. Compared to SoD patients, schizophrenia patients showed deficits on fearful expressions, but BPD did not significantly differ from SoD patients on any of the facial expressions. With respect to the type of misattributions, BPD patients mistook neutral expressions more often for fearful expressions compared to schizophrenia patients and healthy controls, and less often for happy compared to schizophrenia patients. These findings suggest that although schizophrenia and BPD patients demonstrate different as well as similar facial affect labeling deficits, BPD may be associated with a tendency to detect negative affect in neutral expressions.

  16. Deficits in Degraded Facial Affect Labeling in Schizophrenia and Borderline Personality Disorder.

    Directory of Open Access Journals (Sweden)

    Annemiek van Dijke

    Full Text Available Although deficits in facial affect processing have been reported in schizophrenia as well as in borderline personality disorder (BPD), these disorders have not yet been directly compared on facial affect labeling. Using degraded stimuli portraying neutral, angry, fearful, and happy facial expressions, we hypothesized more errors in labeling negative facial expressions in patients with schizophrenia compared to healthy controls. Patients with BPD were expected to have difficulty in labeling neutral expressions and to display a bias towards a negative attribution when wrongly labeling neutral faces. Patients with schizophrenia (N = 57) and patients with BPD (N = 30) were compared to patients with somatoform disorder (SoD, a psychiatric control group; N = 25) and healthy control participants (N = 41) on facial affect labeling accuracy and type of misattributions. Patients with schizophrenia showed deficits in labeling angry and fearful expressions compared to the healthy control group and patients with BPD showed deficits in labeling neutral expressions compared to the healthy control group. Schizophrenia and BPD patients did not differ significantly from each other when labeling any of the facial expressions. Compared to SoD patients, schizophrenia patients showed deficits on fearful expressions, but BPD did not significantly differ from SoD patients on any of the facial expressions. With respect to the type of misattributions, BPD patients mistook neutral expressions more often for fearful expressions compared to schizophrenia patients and healthy controls, and less often for happy compared to schizophrenia patients. These findings suggest that although schizophrenia and BPD patients demonstrate different as well as similar facial affect labeling deficits, BPD may be associated with a tendency to detect negative affect in neutral expressions.

  17. Emotion perception accuracy and bias in face-to-face versus cyberbullying.

    Science.gov (United States)

    Ciucci, Enrica; Baroncelli, Andrea; Nowicki, Stephen

    2014-01-01

    The authors investigated the association of traditional and cyber forms of bullying and victimization with emotion perception accuracy and emotion perception bias. Four basic emotions were considered (i.e., happiness, sadness, anger, and fear); 526 middle school students (280 females; M age = 12.58 years, SD = 1.16 years) were recruited, and emotionality was controlled. Results indicated no significant findings for girls. Boys with higher levels of traditional bullying did not show any deficit in perception accuracy of emotions, but they were prone to identify happiness and fear in faces when a different emotion was expressed; in addition, male cyberbullying was related to greater accuracy in recognizing fear. In terms of the victims, cyber victims had a global problem in recognizing emotions and a specific problem in processing anger and fear. It was concluded that emotion perception accuracy and bias were associated with bullying and victimization for boys not only in traditional settings but also in the electronic ones. Implications of these findings for possible intervention are discussed.

  18. The right place at the right time: priming facial expressions with emotional face components in developmental visual agnosia.

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-04-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Emotion Words: Adding Face Value.

    Science.gov (United States)

    Fugate, Jennifer M B; Gendron, Maria; Nakashima, Satoshi F; Barrett, Lisa Feldman

    2017-06-12

    Despite a growing number of studies suggesting that emotion words affect perceptual judgments of emotional stimuli, little is known about how emotion words affect perceptual memory for emotional faces. In Experiments 1 and 2 we tested how emotion words (compared with control words) affected participants' abilities to select a target emotional face from among distractor faces. Participants were generally more likely to false alarm to distractor emotional faces when primed with an emotion word congruent with the face (compared with a control word). Moreover, participants showed both decreased sensitivity (d') to discriminate between target and distractor faces, as well as altered response biases (c; more likely to answer "yes") when primed with an emotion word (compared with a control word). In Experiment 3 we showed that emotion words had more of an effect on perceptual memory judgments when the structural information in the target face was limited, as well as when participants were only able to categorize the face with a partially congruent emotion word. The overall results are consistent with the idea that emotion words affect the encoding of emotional faces in perceptual memory. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
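
    The sensitivity (d') and response bias (c) indices referred to above come from signal detection theory and are computed from hit and false-alarm rates. A minimal sketch of the standard computation follows; the counts are invented for illustration and are not taken from the study.

      from scipy.stats import norm

      def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
          """Signal-detection sensitivity (d') and criterion (c).

          A small correction keeps rates away from 0 and 1 so the z-scores stay finite.
          """
          hit_rate = (hits + 0.5) / (hits + misses + 1)
          fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
          z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
          d_prime = z_hit - z_fa
          criterion = -0.5 * (z_hit + z_fa)  # more negative = more liberal ("yes") bias
          return d_prime, criterion

      # Hypothetical counts for one priming condition (e.g., emotion-word prime).
      print(dprime_and_criterion(hits=35, misses=15, false_alarms=20, correct_rejections=30))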

  20. The Right Place at the Right Time: Priming Facial Expressions with Emotional Face Components in Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-01-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG’s impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face’s emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG’s performance was strongly influenced by the diagnosticity of the components: His emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. PMID:22349446

  1. A face a mother could love: depression-related maternal neural responses to infant emotion faces.

    Science.gov (United States)

    Laurent, Heidemarie K; Ablow, Jennifer C

    2013-01-01

    Depressed mothers show negatively biased responses to their infants' emotional bids, perhaps due to faulty processing of infant cues. This study is the first to examine depression-related differences in mothers' neural response to their own infant's emotion faces, considering both effects of perinatal depression history and current depressive symptoms. Primiparous mothers (n = 22), half of whom had a history of major depressive episodes (with one episode occurring during pregnancy and/or postpartum), were exposed to images of their own and unfamiliar infants' joy and distress faces during functional neuroimaging. Group differences (depression vs. no-depression) and continuous effects of current depressive symptoms were tested in relation to neural response to own infant emotion faces. Compared to mothers with no psychiatric diagnoses, those with depression showed blunted responses to their own infant's distress faces in the dorsal anterior cingulate cortex. Mothers with higher levels of current symptomatology showed reduced responses to their own infant's joy faces in the orbitofrontal cortex and insula. Current symptomatology also predicted lower responses to own infant joy-distress in left-sided prefrontal and insula/striatal regions. These deficits in self-regulatory and motivational response circuits may help explain parenting difficulties in depressed mothers.

  2. Emotion processing deficits in alexithymia and response to a depth of processing intervention.

    Science.gov (United States)

    Constantinou, Elena; Panayiotou, Georgia; Theodorou, Marios

    2014-12-01

    Findings on alexithymic emotion difficulties have been inconsistent. We examined potential differences between alexithymic and control participants in general arousal, reactivity, facial and subjective expression, emotion labeling, and covariation between emotion response systems. A depth of processing intervention was introduced. Fifty-four participants (27 alexithymic), selected using the Toronto Alexithymia Scale-20, completed an imagery experiment (imagining joy, fear and neutral scripts), under instructions for shallow or deep emotion processing. Heart rate, skin conductance, facial electromyography and startle reflex were recorded along with subjective ratings. Results indicated hypo-reactivity to emotion among high alexithymic individuals, smaller and slower startle responses, and low covariation between physiology and self-report. No deficits in facial expression, labeling and emotion ratings were identified. Deep processing was associated with increased physiological reactivity and lower perceived dominance and arousal in high alexithymia. Findings suggest a tendency for avoidance of intense, unpleasant emotions and less defensive action preparation in alexithymia. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Emotional bias of cognitive control in adults with childhood attention-deficit/hyperactivity disorder

    Directory of Open Access Journals (Sweden)

    Kurt P. Schulz

    2014-01-01

    Full Text Available Affect recognition deficits found in individuals with attention-deficit/hyperactivity disorder (ADHD) across the lifespan may bias the development of cognitive control processes implicated in the pathophysiology of the disorder. This study aimed to determine the mechanism through which facial expressions influence cognitive control in young adults diagnosed with ADHD in childhood. Fourteen probands with childhood ADHD and 14 comparison subjects with no history of ADHD were scanned with functional magnetic resonance imaging while performing a face emotion go/no-go task. Event-related analyses contrasted activation and functional connectivity for cognitive control collapsed over face valence and tested for variations in activation for response execution and inhibition as a function of face valence. Probands with childhood ADHD made fewer correct responses and inhibitions overall than comparison subjects, but demonstrated comparable effects of face emotion on response execution and inhibition. The two groups showed similar frontotemporal activation for cognitive control collapsed across face valence, but differed in the functional connectivity of the right dorsolateral prefrontal cortex, with fewer interactions with the subgenual cingulate cortex, inferior frontal gyrus, and putamen in probands than in comparison subjects. Further, valence-dependent activation for response execution was seen in the amygdala, ventral striatum, subgenual cingulate cortex, and orbitofrontal cortex in comparison subjects but not in probands. The findings point to functional anomalies in limbic networks for both the valence-dependent biasing of cognitive control and the valence-independent cognitive control of face emotion processing in probands with childhood ADHD. This limbic dysfunction could impact cognitive control in emotional contexts and may contribute to the social and emotional problems associated with ADHD.

  4. Emotional bias of cognitive control in adults with childhood attention-deficit/hyperactivity disorder.

    Science.gov (United States)

    Schulz, Kurt P; Bédard, Anne-Claude V; Fan, Jin; Clerkin, Suzanne M; Dima, Danai; Newcorn, Jeffrey H; Halperin, Jeffrey M

    2014-01-01

    Affect recognition deficits found in individuals with attention-deficit/hyperactivity disorder (ADHD) across the lifespan may bias the development of cognitive control processes implicated in the pathophysiology of the disorder. This study aimed to determine the mechanism through which facial expressions influence cognitive control in young adults diagnosed with ADHD in childhood. Fourteen probands with childhood ADHD and 14 comparison subjects with no history of ADHD were scanned with functional magnetic resonance imaging while performing a face emotion go/no-go task. Event-related analyses contrasted activation and functional connectivity for cognitive control collapsed over face valence and tested for variations in activation for response execution and inhibition as a function of face valence. Probands with childhood ADHD made fewer correct responses and inhibitions overall than comparison subjects, but demonstrated comparable effects of face emotion on response execution and inhibition. The two groups showed similar frontotemporal activation for cognitive control collapsed across face valence, but differed in the functional connectivity of the right dorsolateral prefrontal cortex, with fewer interactions with the subgenual cingulate cortex, inferior frontal gyrus, and putamen in probands than in comparison subjects. Further, valence-dependent activation for response execution was seen in the amygdala, ventral striatum, subgenual cingulate cortex, and orbitofrontal cortex in comparison subjects but not in probands. The findings point to functional anomalies in limbic networks for both the valence-dependent biasing of cognitive control and the valence-independent cognitive control of face emotion processing in probands with childhood ADHD. This limbic dysfunction could impact cognitive control in emotional contexts and may contribute to the social and emotional problems associated with ADHD.

  5. Association of Irritability and Anxiety With the Neural Mechanisms of Implicit Face Emotion Processing in Youths With Psychopathology.

    Science.gov (United States)

    Stoddard, Joel; Tseng, Wan-Ling; Kim, Pilyoung; Chen, Gang; Yi, Jennifer; Donahue, Laura; Brotman, Melissa A; Towbin, Kenneth E; Pine, Daniel S; Leibenluft, Ellen

    2017-01-01

    Psychiatric comorbidity complicates clinical care and confounds efforts to elucidate the pathophysiology of commonly occurring symptoms in youths. To our knowledge, few studies have simultaneously assessed the effect of 2 continuously distributed traits on brain-behavior relationships in children with psychopathology. The aim was to determine shared and unique effects of 2 major dimensions of child psychopathology, irritability and anxiety, on neural responses to facial emotions during functional magnetic resonance imaging. This was a cross-sectional functional magnetic resonance imaging study in a large, well-characterized clinical sample at a research clinic at the National Institute of Mental Health. The referred sample included youths ages 8 to 17 years: 93 youths with anxiety, disruptive mood dysregulation, and/or attention-deficit/hyperactivity disorders and 22 healthy youths. The child's irritability and anxiety were rated by both parent and child on the Affective Reactivity Index and Screen for Child Anxiety Related Disorders, respectively. Using functional magnetic resonance imaging, neural response was measured across the brain during gender labeling of varying intensities of angry, happy, or fearful face emotions. In mixed-effects analyses, the shared and unique effects of irritability and anxiety were tested on amygdala functional connectivity and activation to face emotions. The mean (SD) age of participants was 13.2 (2.6) years; of the 115 included, 64 were male. Irritability and/or anxiety influenced amygdala connectivity to the prefrontal and temporal cortex. Specifically, irritability and anxiety jointly influenced left amygdala to left medial prefrontal cortex connectivity during face emotion viewing (F4,888 = 9.20), and the two traits were also associated with differences in neural response to face emotions in several areas (F2,888 ≥ 13.45). These findings suggest emotion dysregulation when very anxious and irritable youths process threat-related faces. Activation in the ventral visual circuitry suggests a mechanism…
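
    One plausible way to model the shared and unique effects of two continuous traits on a neural measure, as described above, is a mixed-effects regression with both traits and their interaction as predictors and repeated face-emotion conditions nested within subjects. The sketch below is only an illustration of that general approach, not the authors' analysis; it uses statsmodels with entirely synthetic data and hypothetical variable names.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      n_subj, emotions = 60, ["angry", "happy", "fearful"]

      # Synthetic long-format data: one amygdala-PFC connectivity value per subject
      # per face emotion, plus subject-level irritability and anxiety scores.
      rows = []
      for s in range(n_subj):
          irr, anx = rng.normal(size=2)
          for emo in emotions:
              conn = 0.3 * irr + 0.2 * anx + 0.25 * irr * anx + rng.normal(scale=1.0)
              rows.append({"subject": s, "emotion": emo, "irritability": irr,
                           "anxiety": anx, "connectivity": conn})
      df = pd.DataFrame(rows)

      # Mixed-effects model: fixed effects for both traits and their interaction,
      # random intercept per subject to handle the repeated emotion conditions.
      model = smf.mixedlm("connectivity ~ irritability * anxiety + C(emotion)",
                          data=df, groups="subject").fit()
      print(model.summary())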

  6. Neurodevelopmental changes across adolescence in viewing and labeling dynamic peer emotions

    Directory of Open Access Journals (Sweden)

    Jessica E. Flannery

    2017-06-01

    Full Text Available Adolescence is a sensitive period of social-affective development, characterized by biological, neurological, and social changes. The field currently conceptualizes these changes in terms of an imbalance between systems supporting reactivity and regulation, specifically nonlinear changes in reactivity networks and linear changes in regulatory networks. Previous research suggests that the labeling or reappraisal of emotion increases activity in lateral prefrontal cortex (LPFC) and decreases activity in the amygdala relative to passive viewing of affective stimuli. However, past work in this area has relied heavily on paradigms using static, adult faces, as well as explicit regulation. In the current study, we assessed cross-sectional trends in neural responses to viewing and labeling dynamic peer emotional expressions in adolescent girls 10–23 years old. Our dynamic adolescent stimulus set reliably and robustly recruited key brain regions involved in emotion reactivity (medial orbital frontal cortex/ventral medial prefrontal cortex [MOFC/vMPFC], bilateral amygdala) and regulation (bilateral dorsal and ventral LPFC). However, contrary to the age trends predicted by the dominant models in studies of risk/reward, the LPFC showed a nonlinear age trend across adolescence to labeling dynamic peer faces, whereas the MOFC/vMPFC showed a linear decrease with age to viewing dynamic peer faces. There were no significant age trends observed in the amygdala.
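
    The linear versus nonlinear age trends described above can be compared by fitting polynomial models of increasing order and comparing their fit. The following is a minimal sketch of that generic comparison using synthetic data and AIC; the variable names are illustrative only and the data are not the study's.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      age = rng.uniform(10, 23, size=80)

      # Synthetic LPFC response with a nonlinear (inverted-U) age trend built in.
      lpfc = -0.05 * (age - 16) ** 2 + rng.normal(scale=0.5, size=age.size)

      def fit_polynomial(y, x, degree):
          """OLS fit of y on polynomial terms of x; returns the fitted model."""
          X = sm.add_constant(np.column_stack([x ** d for d in range(1, degree + 1)]))
          return sm.OLS(y, X).fit()

      linear, quadratic = fit_polynomial(lpfc, age, 1), fit_polynomial(lpfc, age, 2)
      print(f"Linear AIC: {linear.aic:.1f}  Quadratic AIC: {quadratic.aic:.1f}")
      # A lower AIC for the quadratic model would favor a nonlinear age trend.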

  7. Neural Temporal Dynamics of Facial Emotion Processing: Age Effects and Relationship to Cognitive Function

    Directory of Open Access Journals (Sweden)

    Xiaoyan Liao

    2017-06-01

    Full Text Available This study used event-related potentials (ERPs) to investigate the effects of age on neural temporal dynamics of processing task-relevant facial expressions and their relationship to cognitive functions. Negative (sad, afraid, angry, and disgusted), positive (happy), and neutral faces were presented to 30 older and 31 young participants who performed a facial emotion categorization task. Behavioral and ERP indices of facial emotion processing were analyzed. An enhanced N170 for negative faces, in addition to intact right-hemispheric N170 for positive faces, was observed in older adults relative to their younger counterparts. Moreover, older adults demonstrated an attenuated within-group N170 laterality effect for neutral faces, while younger adults showed the opposite pattern. Furthermore, older adults exhibited sustained temporo-occipital negativity deflection over the time range of 200–500 ms post-stimulus, while young adults showed posterior positivity and subsequent emotion-specific frontal negativity deflections. In older adults, decreased accuracy for labeling negative faces was positively correlated with Montreal Cognitive Assessment Scores, and accuracy for labeling neutral faces was negatively correlated with age. These findings suggest that older people may exert more effort in structural encoding for negative faces and there are different response patterns for the categorization of different facial emotions. Cognitive functioning may be related to facial emotion categorization deficits observed in older adults. This may not be attributable to positivity effects: it may represent a selective deficit for the processing of negative facial expressions in older adults.

  8. Neural activation to emotional faces in adolescents with autism spectrum disorders.

    Science.gov (United States)

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S

    2011-03-01

    Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and striatum, three structures involved in socio-emotional processing, in adolescents with ASD. Twenty-two adolescents with ASD and 20 healthy adolescents viewed facial expressions (happy, fearful, sad and neutral) that were briefly presented (250 ms) during functional MRI acquisition. To monitor attention, subjects pressed a button to identify the gender of each face. The ASD group showed greater activation to the faces relative to the control group in the amygdala, vPFC and striatum. Follow-up analyses indicated that the ASD group, relative to the control group, showed greater activation in the amygdala, vPFC and striatum during the gender identification task. When group differences in attention to facial expressions were limited, adolescents with ASD showed greater activation in structures involved in socio-emotional processing. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  9. Amygdala Hyperactivation During Face Emotion Processing in Unaffected Youth at Risk for Bipolar Disorder

    Science.gov (United States)

    Olsavsky, Aviva K.; Brotman, Melissa A.; Rutenberg, Julia G.; Muhrer, Eli J.; Deveney, Christen M.; Fromm, Stephen J.; Towbin, Kenneth; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Objective: Youth at familial risk for bipolar disorder (BD) show deficits in face emotion processing, but the neural correlates of these deficits have not been examined. This preliminary study tests the hypothesis that, relative to healthy comparison (HC) subjects, both BD subjects and youth at risk for BD (i.e., those with a first-degree BD…

  10. Similar representations of emotions across faces and voices.

    Science.gov (United States)

    Kuhn, Lisa Katharina; Wydell, Taeko; Lavan, Nadine; McGettigan, Carolyn; Garrido, Lúcia

    2017-09-01

    Emotions are a vital component of social communication, carried across a range of modalities and via different perceptual signals such as specific muscle contractions in the face and in the upper respiratory system. Previous studies have found that emotion recognition impairments after brain damage depend on the modality of presentation: recognition from faces may be impaired whereas recognition from voices remains preserved, and vice versa. On the other hand, there is also evidence for shared neural activation during emotion processing in both modalities. In a behavioral study, we investigated whether there are shared representations in the recognition of emotions from faces and voices. We used a within-subjects design in which participants rated the intensity of facial expressions and nonverbal vocalizations for each of the 6 basic emotion labels. For each participant and each modality, we then computed a representation matrix with the intensity ratings of each emotion. These matrices allowed us to examine the patterns of confusions between emotions and to characterize the representations…
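
    One common way to compare the representation matrices described above across modalities is to correlate their vectorized rating patterns. The sketch below shows that generic step for a single hypothetical participant with made-up ratings; it is an illustration of the idea, not the authors' analysis.

      import numpy as np

      emotions = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

      # Hypothetical intensity ratings: rows = expressed emotion of the stimulus,
      # columns = emotion label being rated (one matrix per modality).
      rng = np.random.default_rng(1)
      face_matrix = rng.uniform(1, 7, size=(6, 6))
      voice_matrix = face_matrix + rng.normal(0, 0.5, size=(6, 6))  # similar by construction

      def matrix_similarity(a, b):
          """Pearson correlation between two vectorized representation matrices."""
          return np.corrcoef(a.ravel(), b.ravel())[0, 1]

      print(f"Face-voice representation similarity: {matrix_similarity(face_matrix, voice_matrix):.2f}")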

  11. Young Adults with Autism Spectrum Disorder Show Early Atypical Neural Activity during Emotional Face Processing

    Directory of Open Access Journals (Sweden)

    Rachel C. Leung

    2018-02-01

    Full Text Available Social cognition is impaired in autism spectrum disorder (ASD). The ability to perceive and interpret affect is integral to successful social functioning and has an extended developmental course. However, the neural mechanisms underlying emotional face processing in ASD are unclear. Using magnetoencephalography (MEG), the present study explored neural activation during implicit emotional face processing in young adults with and without ASD. Twenty-six young adults with ASD and 26 healthy controls were recruited. Participants indicated the location of a scrambled pattern (target) that was presented alongside a happy or angry face. Emotion-related activation sources for each emotion were estimated using the Empirical Bayes Beamformer (pcorr ≤ 0.001) in Statistical Parametric Mapping 12 (SPM12). Emotional faces elicited elevated fusiform, amygdala and anterior insula and reduced anterior cingulate cortex (ACC) activity in adults with ASD relative to controls. Within group comparisons revealed that angry vs. happy faces elicited distinct neural activity in typically developing adults; there was no distinction in young adults with ASD. Our data suggest difficulties in affect processing in ASD reflect atypical recruitment of traditional emotional processing areas. These early differences may contribute to difficulties in deriving social reward from faces, ascribing salience to faces, and an immature threat processing system, which collectively could result in deficits in emotional face processing.

  12. Is the emotion recognition deficit associated with frontotemporal dementia caused by selective inattention to diagnostic facial features?

    Science.gov (United States)

    Oliver, Lindsay D; Virani, Karim; Finger, Elizabeth C; Mitchell, Derek G V

    2014-07-01

    Frontotemporal dementia (FTD) is a debilitating neurodegenerative disorder characterized by severely impaired social and emotional behaviour, including emotion recognition deficits. Though fear recognition impairments seen in particular neurological and developmental disorders can be ameliorated by reallocating attention to critical facial features, the possibility that similar benefits can be conferred to patients with FTD has yet to be explored. In the current study, we examined the impact of presenting distinct regions of the face (whole face, eyes-only, and eyes-removed) on the ability to recognize expressions of anger, fear, disgust, and happiness in 24 patients with FTD and 24 healthy controls. A recognition deficit was demonstrated across emotions by patients with FTD relative to controls. Crucially, removal of diagnostic facial features resulted in an appropriate decline in performance for both groups; furthermore, patients with FTD demonstrated a lack of disproportionate improvement in emotion recognition accuracy as a result of isolating critical facial features relative to controls. Thus, unlike some neurological and developmental disorders featuring amygdala dysfunction, the emotion recognition deficit observed in FTD is not likely driven by selective inattention to critical facial features. Patients with FTD also mislabelled negative facial expressions as happy more often than controls, providing further evidence for abnormalities in the representation of positive affect in FTD. This work suggests that the emotional expression recognition deficit associated with FTD is unlikely to be rectified by adjusting selective attention to diagnostic features, as has proven useful in other select disorders. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Emotion recognition deficits as predictors of transition in individuals at clinical high risk for schizophrenia: a neurodevelopmental perspective.

    Science.gov (United States)

    Corcoran, C M; Keilp, J G; Kayser, J; Klim, C; Butler, P D; Bruder, G E; Gur, R C; Javitt, D C

    2015-10-01

    Schizophrenia is characterized by profound and disabling deficits in the ability to recognize emotion in facial expression and tone of voice. Although these deficits are well documented in established schizophrenia using recently validated tasks, their predictive utility in at-risk populations has not been formally evaluated. The Penn Emotion Recognition and Discrimination tasks, and recently developed measures of auditory emotion recognition, were administered to 49 clinical high-risk subjects prospectively followed for 2 years for schizophrenia outcome, and 31 healthy controls, and a developmental cohort of 43 individuals aged 7-26 years. Deficit in emotion recognition in at-risk subjects was compared with deficit in established schizophrenia, and with normal neurocognitive growth curves from childhood to early adulthood. Deficits in emotion recognition significantly distinguished at-risk patients who transitioned to schizophrenia. By contrast, more general neurocognitive measures, such as attention vigilance or processing speed, were non-predictive. The best classification model for schizophrenia onset included both face emotion processing and negative symptoms, with accuracy of 96%, and area under the receiver-operating characteristic curve of 0.99. In a parallel developmental study, emotion recognition abilities were found to reach maturity prior to traditional age of risk for schizophrenia, suggesting they may serve as objective markers of early developmental insult. Profound deficits in emotion recognition exist in at-risk patients prior to schizophrenia onset. They may serve as an index of early developmental insult, and represent an effective target for early identification and remediation. Future studies investigating emotion recognition deficits at both mechanistic and predictive levels are strongly encouraged.
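
    To make the kind of classification model described above concrete (combining face-emotion recognition scores with negative symptoms to predict transition, evaluated by accuracy and area under the ROC curve), here is a hedged sketch using logistic regression on synthetic data. The feature names and values are invented and the output will not reproduce the reported 96% accuracy or 0.99 AUC.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(5)
      n = 49  # hypothetical number of clinical high-risk subjects

      # Synthetic predictors: face-emotion recognition score and negative symptoms.
      emotion_score = rng.normal(size=n)
      negative_symptoms = rng.normal(size=n)
      # Synthetic transition outcome loosely driven by both predictors.
      risk = -1.2 * emotion_score + 0.8 * negative_symptoms
      transition = (risk + rng.normal(scale=0.8, size=n) > 0.8).astype(int)

      X = np.column_stack([emotion_score, negative_symptoms])
      clf = LogisticRegression()

      # Cross-validated predicted probabilities, then ROC AUC for transition.
      probs = cross_val_predict(clf, X, transition, cv=5, method="predict_proba")[:, 1]
      print(f"Cross-validated AUC: {roc_auc_score(transition, probs):.2f}")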

  14. Method for Face-Emotion Retrieval Using A Cartoon Emotional Expression Approach

    Science.gov (United States)

    Kostov, Vlaho; Yanagisawa, Hideyoshi; Johansson, Martin; Fukuda, Shuichi

    A simple method for extracting emotion from a human face, as a form of non-verbal communication, was developed to cope with and optimize mobile communication in a globalized and diversified society. A cartoon face based model was developed and used to evaluate emotional content of real faces. After a pilot survey, basic rules were defined and student subjects were asked to express emotion using the cartoon face. Their face samples were then analyzed using principal component analysis and the Mahalanobis distance method. Feature parameters considered as having relations with emotions were extracted and new cartoon faces (based on these parameters) were generated. The subjects evaluated emotion of these cartoon faces again and we confirmed these parameters were suitable. To confirm how these parameters could be applied to real faces, we asked subjects to express the same emotions which were then captured electronically. Simple image processing techniques were also developed to extract these features from real faces and we then compared them with the cartoon face parameters. It is demonstrated via the cartoon face that we are able to express the emotions from very small amounts of information. As a result, real and cartoon faces correspond to each other. It is also shown that emotion could be extracted from still and dynamic real face images using these cartoon-based features.
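
    As a hedged sketch of the general analysis pipeline this record describes (reducing cartoon-face feature parameters with principal component analysis and scoring samples by Mahalanobis distance to an emotion class), the code below uses invented feature dimensions and data purely for illustration; it is not the authors' implementation.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(2)

      # Hypothetical cartoon-face feature vectors (e.g., eyebrow slope, mouth
      # curvature, eye openness) for faces drawn to express one emotion.
      happy_faces = rng.normal(loc=[0.8, 0.9, 0.5], scale=0.1, size=(40, 3))

      # Reduce the feature space to its principal components.
      pca = PCA(n_components=2).fit(happy_faces)
      happy_scores = pca.transform(happy_faces)

      # Mahalanobis distance of a new face's PCA scores to the "happy" class.
      mean = happy_scores.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(happy_scores, rowvar=False))
      new_face = pca.transform(np.array([[0.75, 0.85, 0.55]]))[0]
      diff = new_face - mean
      distance = float(np.sqrt(diff @ cov_inv @ diff))
      print(f"Mahalanobis distance to 'happy' prototype: {distance:.2f}")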

  15. Different Neural Correlates of Emotion-Label Words and Emotion-Laden Words: An ERP Study

    Directory of Open Access Journals (Sweden)

    Juan Zhang

    2017-09-01

    Full Text Available It is well-documented that both emotion-label words (e.g., sadness, happiness) and emotion-laden words (e.g., death, wedding) can induce emotion activation. However, the neural correlates of emotion-label word and emotion-laden word recognition have not been examined. The present study aimed to compare the underlying neural responses when processing the two kinds of words by employing event-related potential (ERP) measurements. Fifteen Chinese native speakers were asked to perform a lexical decision task in which they should judge whether a two-character compound stimulus was a real word or not. Results showed that (1) emotion-label words and emotion-laden words elicited similar P100 at the posterior sites, (2) larger N170 was found for emotion-label words than for emotion-laden words at the occipital sites on the right hemisphere, and (3) negative emotion-label words elicited larger Late Positivity Complex (LPC) on the right hemisphere than on the left hemisphere, while such an effect was not found for emotion-laden words and positive emotion-label words. The results indicate that emotion-label words and emotion-laden words elicit different cortical responses at both early (N170) and late (LPC) stages. In addition, a right-hemisphere advantage for emotion-label words over emotion-laden words can be observed in certain time windows (i.e., N170 and LPC) while failing to be detected in some other time window (i.e., P100). The implications of the current findings for future emotion research were discussed.

  16. Different Neural Correlates of Emotion-Label Words and Emotion-Laden Words: An ERP Study.

    Science.gov (United States)

    Zhang, Juan; Wu, Chenggang; Meng, Yaxuan; Yuan, Zhen

    2017-01-01

    It is well-documented that both emotion-label words (e.g., sadness, happiness) and emotion-laden words (e.g., death, wedding) can induce emotion activation. However, the neural correlates of emotion-label word and emotion-laden word recognition have not been examined. The present study aimed to compare the underlying neural responses when processing the two kinds of words by employing event-related potential (ERP) measurements. Fifteen Chinese native speakers were asked to perform a lexical decision task in which they should judge whether a two-character compound stimulus was a real word or not. Results showed that (1) emotion-label words and emotion-laden words elicited similar P100 at the posterior sites, (2) larger N170 was found for emotion-label words than for emotion-laden words at the occipital sites on the right hemisphere, and (3) negative emotion-label words elicited larger Late Positivity Complex (LPC) on the right hemisphere than on the left hemisphere, while such an effect was not found for emotion-laden words and positive emotion-label words. The results indicate that emotion-label words and emotion-laden words elicit different cortical responses at both early (N170) and late (LPC) stages. In addition, a right-hemisphere advantage for emotion-label words over emotion-laden words can be observed in certain time windows (i.e., N170 and LPC) while failing to be detected in some other time window (i.e., P100). The implications of the current findings for future emotion research were discussed.

  17. KDEF-PT: Valence, Emotional Intensity, Familiarity and Attractiveness Ratings of Angry, Neutral, and Happy Faces.

    Science.gov (United States)

    Garrido, Margarida V; Prada, Marília

    2017-01-01

    The Karolinska Directed Emotional Faces (KDEF) is one of the most widely used human facial expression databases. Almost a decade after the original validation study (Goeleven et al., 2008), we present subjective rating norms for a sub-set of 210 pictures which depict 70 models (half female) each displaying angry, happy, and neutral facial expressions. Our main goals were to provide an additional and updated validation of this database, using a sample from a different nationality (N = 155 Portuguese students, M = 23.73 years old, SD = 7.24), and to extend the number of subjective dimensions used to evaluate each image. Specifically, participants reported emotional labeling (forced-choice task) and evaluated the emotional intensity and valence of the expression, as well as the attractiveness and familiarity of the model (7-point rating scales). Overall, results show that happy faces obtained the highest ratings across evaluative dimensions and emotion labeling accuracy. Female (vs. male) models were perceived as more attractive, familiar and positive. The sex of the model also moderated the accuracy of emotional labeling and ratings of different facial expressions. Each picture of the set was categorized as low, moderate, or high for each dimension. Normative data for each stimulus (hit proportions, means, standard deviations, and confidence intervals per evaluative dimension) are available as supplementary material (available at https://osf.io/fvc4m/).
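
    Per-picture norms of the kind reported above (means, standard deviations, 95% confidence intervals, and a low/moderate/high category per dimension) could be computed along the lines of the sketch below. The picture codes, column names, bin cut-offs, and ratings are all hypothetical and are not the KDEF-PT data.

      import numpy as np
      import pandas as pd
      from scipy import stats

      # Hypothetical long-format ratings: one row per participant x picture x dimension.
      ratings = pd.DataFrame({
          "picture": ["AF01HAS"] * 6 + ["AF01ANS"] * 6,   # hypothetical picture codes
          "dimension": ["valence"] * 3 + ["intensity"] * 3 + ["valence"] * 3 + ["intensity"] * 3,
          "rating": [6, 7, 6, 5, 6, 5, 2, 1, 2, 6, 5, 6],
      })

      def norms(group):
          """Mean, SD, 95% CI half-width, and a coarse category for one picture/dimension."""
          m, sd, n = group.mean(), group.std(ddof=1), group.count()
          ci = stats.t.ppf(0.975, n - 1) * sd / np.sqrt(n)
          category = pd.cut([m], bins=[0, 3, 5, 7], labels=["low", "moderate", "high"])[0]
          return pd.Series({"mean": m, "sd": sd, "ci95": ci, "category": category})

      print(ratings.groupby(["picture", "dimension"])["rating"].apply(norms))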

  18. KDEF-PT: Valence, Emotional Intensity, Familiarity and Attractiveness Ratings of Angry, Neutral, and Happy Faces

    Directory of Open Access Journals (Sweden)

    Margarida V. Garrido

    2017-12-01

    Full Text Available The Karolinska Directed Emotional Faces (KDEF) is one of the most widely used human facial expression databases. Almost a decade after the original validation study (Goeleven et al., 2008), we present subjective rating norms for a sub-set of 210 pictures which depict 70 models (half female) each displaying angry, happy, and neutral facial expressions. Our main goals were to provide an additional and updated validation of this database, using a sample from a different nationality (N = 155 Portuguese students, M = 23.73 years old, SD = 7.24), and to extend the number of subjective dimensions used to evaluate each image. Specifically, participants reported emotional labeling (forced-choice task) and evaluated the emotional intensity and valence of the expression, as well as the attractiveness and familiarity of the model (7-point rating scales). Overall, results show that happy faces obtained the highest ratings across evaluative dimensions and emotion labeling accuracy. Female (vs. male) models were perceived as more attractive, familiar and positive. The sex of the model also moderated the accuracy of emotional labeling and ratings of different facial expressions. Each picture of the set was categorized as low, moderate, or high for each dimension. Normative data for each stimulus (hit proportions, means, standard deviations, and confidence intervals per evaluative dimension) are available as supplementary material (available at https://osf.io/fvc4m/).

  19. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    Science.gov (United States)

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621

  20. Interference among the Processing of Facial Emotion, Face Race, and Face Gender.

    Science.gov (United States)

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender).

  1. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    Directory of Open Access Journals (Sweden)

    Kris Evers

    2014-01-01

    Full Text Available Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  2. No differences in emotion recognition strategies in children with autism spectrum disorder: evidence from hybrid faces.

    Science.gov (United States)

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  3. Impaired Integration of Emotional Faces and Affective Body Context in a Rare Case of Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Bentin, Shlomo

    2011-01-01

    In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG‘s agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally-congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally-incongruent bodies. The results are interpreted in the context of a general integration deficit in developmental visual agnosia, suggesting that impaired integration may extend from the level of the face to the level of the full person. PMID:21482423

  4. Attention shaping as a means to improve emotion perception deficits in outpatients with schizophrenia and impaired controls.

    Science.gov (United States)

    Combs, Dennis R; Chapman, Dustin; Waguspack, Jace; Basso, Michael R; Penn, David L

    2011-04-01

    Deficits in emotion perception are common in people with schizophrenia and current research has focused on improving these deficits. In our previous research, we demonstrated that directing attention to salient facial features via attention shaping can improve these deficits among inpatients. In this study, we examined the efficacy of an enhanced attention shaping program that contains 192 emotional expressions from which 25 are randomly presented for training. We extended our previous work by using repeated administrations of the shaping intervention and testing its effect in outpatients with schizophrenia and impaired controls. Fifteen participants with schizophrenia and fourteen college student controls with emotion perception deficits were randomly assigned to 1, 3 or 5 sessions of attention shaping. Participants completed 2 outcome measures of emotion perception, the FEIT and BLERT, not presented during the training, and underwent eye tracking at pre and post-tests. All conditions and groups improved, but the largest improvements on the BLERT and FEIT were found for persons assigned to the 5 session condition. Performance on the shaping program was positively correlated with the two outcome measures of emotion perception. There was less support for changes in visual scanning of faces as there was a relative reduction in total scanning time from pre-test to post-test. Results are interpreted in terms of the efficacy of attention shaping as a means to improve emotion perception deficits. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Emotional face recognition in adolescent suicide attempters and adolescents engaging in non-suicidal self-injury.

    Science.gov (United States)

    Seymour, Karen E; Jones, Richard N; Cushman, Grace K; Galvan, Thania; Puzia, Megan E; Kim, Kerri L; Spirito, Anthony; Dickstein, Daniel P

    2016-03-01

    Little is known about the bio-behavioral mechanisms underlying and differentiating suicide attempts from non-suicidal self-injury (NSSI) in adolescents. Adolescents who attempt suicide or engage in NSSI often report significant interpersonal and social difficulties. Emotional face recognition ability is a fundamental skill required for successful social interactions, and deficits in this ability may provide insight into the unique brain-behavior interactions underlying suicide attempts versus NSSI in adolescents. Therefore, we examined emotional face recognition ability among three mutually exclusive groups: (1) inpatient adolescents who attempted suicide (SA, n = 30); (2) inpatient adolescents engaged in NSSI (NSSI, n = 30); and (3) typically developing controls (TDC, n = 30) without psychiatric illness. Participants included adolescents aged 13-17 years, matched on age, gender and full-scale IQ. Emotional face recognition was evaluated using the diagnostic assessment of nonverbal accuracy (DANVA-2). Compared to TDC youth, adolescents with NSSI made more errors on child fearful and adult sad face recognition while controlling for psychopathology and medication status; emotional face recognition did not differ significantly between the NSSI and SA groups. Secondary analyses showed that, compared to inpatients without major depression, those with major depression made fewer errors on adult sad face recognition even when controlling for group status, and an additional effect involving recognition errors on adult happy faces emerged when controlling for group status. Overall, adolescents engaged in NSSI showed greater deficits in emotional face recognition than TDC youth, but not than inpatient adolescents who attempted suicide. Further results suggest the importance of psychopathology in emotional face recognition. Replication of these preliminary results and examination of the role of context-dependent emotional processing are needed moving forward.

  6. Increased amygdala responses to emotional faces after psilocybin for treatment-resistant depression.

    Science.gov (United States)

    Roseman, Leor; Demetriou, Lysia; Wall, Matthew B; Nutt, David J; Carhart-Harris, Robin L

    2017-12-27

    Recent evidence indicates that psilocybin with psychological support may be effective for treating depression. Some studies have found that patients with depression show heightened amygdala responses to fearful faces and there is reliable evidence that treatment with SSRIs attenuates amygdala responses (Ma, 2015). We hypothesised that amygdala responses to emotional faces would be altered post-treatment with psilocybin. In this open-label study, 20 individuals diagnosed with moderate to severe, treatment-resistant depression, underwent two separate dosing sessions with psilocybin. Psychological support was provided before, during and after these sessions and 19 completed fMRI scans one week prior to the first session and one day after the second and last. Neutral, fearful and happy faces were presented in the scanner and analyses focused on the amygdala. Group results revealed rapid and enduring improvements in depressive symptoms post psilocybin. Increased responses to fearful and happy faces were observed in the right amygdala post-treatment, and right amygdala increases to fearful versus neutral faces were predictive of clinical improvements at 1-week. Psilocybin with psychological support was associated with increased amygdala responses to emotional stimuli, an opposite effect to previous findings with SSRIs. This suggests fundamental differences in these treatments' therapeutic actions, with SSRIs mitigating negative emotions and psilocybin allowing patients to confront and work through them. Based on the present results, we propose that psilocybin with psychological support is a treatment approach that potentially revives emotional responsiveness in depression, enabling patients to reconnect with their emotions. ISRCTN, number ISRCTN14426797. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.

  7. Processing of Facial Emotion in Bipolar Depression and Euthymia.

    Science.gov (United States)

    Robinson, Lucy J; Gray, John M; Burt, Mike; Ferrier, I Nicol; Gallagher, Peter

    2015-10-01

    Previous studies of facial emotion processing in bipolar disorder (BD) have reported conflicting findings. In independently conducted studies, we investigated facial emotion labeling in euthymic and depressed BD patients using tasks with static and dynamically morphed images of different emotions displayed at different intensities. Study 1 included 38 euthymic BD patients and 28 controls. Participants completed two tasks: labeling of static images of basic facial emotions (anger, disgust, fear, happy, sad) shown at different expression intensities; the Eyes Test (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), which involves recognition of complex emotions using only the eye region of the face. Study 2 included 53 depressed BD patients and 47 controls. Participants completed two tasks: labeling of "dynamic" facial expressions of the same five basic emotions; the Emotional Hexagon test (Young, Perret, Calder, Sprengelmeyer, & Ekman, 2002). There were no significant group differences on any measures of emotion perception/labeling, compared to controls. A significant group by intensity interaction was observed in both emotion labeling tasks (euthymia and depression), although this effect did not survive the addition of measures of executive function/psychomotor speed as covariates. Only 2.6-15.8% of euthymic patients and 7.8-13.7% of depressed patients scored below the 10th percentile of the controls for total emotion recognition accuracy. There was no evidence of specific deficits in facial emotion labeling in euthymic or depressed BD patients. Methodological variations, including mood state, sample size, and the cognitive demands of the tasks, may contribute significantly to the variability in findings between studies.

  8. Altered Kinematics of Facial Emotion Expression and Emotion Recognition Deficits Are Unrelated in Parkinson's Disease.

    Science.gov (United States)

    Bologna, Matteo; Berardelli, Isabella; Paparella, Giulia; Marsili, Luca; Ricciardi, Lucia; Fabbrini, Giovanni; Berardelli, Alfredo

    2016-01-01

    Altered emotional processing, including reduced emotional facial expression and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques, and it is not known whether altered facial expression and recognition in PD are related. The aim was to investigate possible deficits in facial emotion expression and emotion recognition, and their relationship, if any, in patients with PD. Eighteen patients with PD and 16 healthy controls were enrolled in this study. Facial expressions of emotion were recorded using a 3D optoelectronic system and analyzed using the facial action coding system. Possible deficits in emotion recognition were assessed using the Ekman test. Participants were assessed in one experimental session. Possible relationships between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients were evaluated using Spearman's test and multiple regression analysis. The facial expression of all six basic emotions had slower velocity and lower amplitude in patients in comparison to healthy controls. However, facial expression kinematics and emotion recognition deficits were unrelated in patients (all Ps > 0.05). Finally, no relationship emerged between kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients (all Ps > 0.05). The results in this study provide further evidence of altered emotional processing in PD. The lack of any correlation between altered facial emotion expression kinematics and emotion recognition deficits in patients suggests that these abnormalities are mediated by separate pathophysiological mechanisms.

  9. Emotion elicitor or emotion messenger? Subliminal priming reveals two faces of facial expressions.

    Science.gov (United States)

    Ruys, Kirsten I; Stapel, Diederik A

    2008-06-01

    Facial emotional expressions can serve both as emotional stimuli and as communicative signals. The research reported here was conducted to illustrate how responses to both roles of facial emotional expressions unfold over time. As an emotion elicitor, a facial emotional expression (e.g., a disgusted face) activates a response that is similar to responses to other emotional stimuli of the same valence (e.g., a dirty, nonflushed toilet). As an emotion messenger, the same facial expression (e.g., a disgusted face) serves as a communicative signal by also activating the knowledge that the sender is experiencing a specific emotion (e.g., the sender feels disgusted). By varying the duration of exposure to disgusted, fearful, angry, and neutral faces in two subliminal-priming studies, we demonstrated that responses to faces as emotion elicitors occur prior to responses to faces as emotion messengers, and that both types of responses may unfold unconsciously.

  10. Emotional face recognition deficits and medication effects in pre-manifest through stage-II Huntington's disease.

    Science.gov (United States)

    Labuschagne, Izelle; Jones, Rebecca; Callaghan, Jenny; Whitehead, Daisy; Dumas, Eve M; Say, Miranda J; Hart, Ellen P; Justo, Damian; Coleman, Allison; Dar Santos, Rachelle C; Frost, Chris; Craufurd, David; Tabrizi, Sarah J; Stout, Julie C

    2013-05-15

    Facial emotion recognition impairments have been reported in Huntington's disease (HD). However, the nature of the impairments across the spectrum of HD remains unclear. We report on emotion recognition data from 344 participants comprising premanifest HD (PreHD) and early HD patients, and controls. In a test of recognition of facial emotions, we examined responses to six basic emotional expressions and neutral expressions. In addition, and within the early HD sample, we tested for differences on emotion recognition performance between those 'on' vs. 'off' neuroleptic or selective serotonin reuptake inhibitor (SSRI) medications. The PreHD groups showed significantly impaired recognition, compared to controls, of fearful, angry and surprised faces, whereas the early HD groups were significantly impaired across all emotions including neutral expressions. In early HD, neuroleptic use was associated with worse facial emotion recognition, whereas SSRI use was associated with better facial emotion recognition. The findings suggest that emotion recognition impairments exist across the HD spectrum, but are relatively more widespread in manifest HD than in the premanifest period. Commonly prescribed medications to treat HD-related symptoms also appear to affect emotion recognition. These findings have important implications for interpersonal communication and medication usage in HD. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  11. Affect labeling enhances exposure effectiveness for public speaking anxiety.

    Science.gov (United States)

    Niles, Andrea N; Craske, Michelle G; Lieberman, Matthew D; Hur, Christopher

    2015-05-01

    Exposure is an effective treatment for anxiety but many patients do not respond fully. Affect labeling (labeling emotional experience) attenuates emotional responding. The current project examined whether affect labeling enhances exposure effectiveness in participants with public speaking anxiety. Participants were randomized to exposure with or without affect labeling. Physiological arousal and self-reported fear were assessed before and after exposure and compared between groups. Consistent with hypotheses, participants assigned to Affect Labeling, especially those who used more labels during exposure, showed greater reduction in physiological activation than Control participants. No effect was found for self-report measures. Also, greater emotion regulation deficits at baseline predicted more benefit in physiological arousal from exposure combined with affect labeling than exposure alone. The current research provides evidence that behavioral strategies that target prefrontal-amygdala circuitry can improve treatment effectiveness for anxiety and these effects are particularly pronounced for patients with the greatest deficits in emotion regulation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Emotion regulation in Asperger's syndrome and high-functioning autism.

    Science.gov (United States)

    Samson, Andrea C; Huber, Oswald; Gross, James J

    2012-08-01

    It is generally thought that individuals with Asperger's syndrome and high-functioning autism (AS/HFA) have deficits in theory of mind. These deficits have been previously linked to problems with social cognition. However, we reasoned that AS/HFA individuals' theory of mind deficits also might lead to problems with emotion regulation. To assess emotional functioning in AS/HFA, 27 AS/HFA adults (16 women) and 27 age-, gender-, and education-matched typically developing (TD) participants completed a battery of measures of emotion experience, labeling, and regulation. With respect to emotion experience, individuals with AS/HFA reported higher levels of negative emotions, but similar levels of positive emotions, compared with TD individuals. With respect to emotion labeling, individuals with AS/HFA had greater difficulties identifying and describing their emotions, with approximately two-thirds exceeding the cutoff for alexithymia. With respect to emotion regulation, individuals with AS/HFA used reappraisal less frequently than TD individuals and reported lower levels of reappraisal self-efficacy. Although AS/HFA individuals used suppression more frequently than TD individuals, no difference in suppression self-efficacy was found. It is important to note that these differences in emotion regulation were evident even when controlling for emotion experience and labeling. Implications of these deficits are discussed, and future research directions are proposed.

  13. Emotional Labor, Face and Guan xi

    Institute of Scientific and Technical Information of China (English)

    Tianwenling

    2017-01-01

    Emotional Labor, Face and Guan xi are all relevant to performance, appearance, and emotional feelings, which are essential elements in the workplace. In other words, not only front-line workers but all employees in an organization are faced with the three

  14. Emotionally anesthetized: media violence induces neural changes during emotional face processing

    OpenAIRE

    Stockdale, Laura A.; Morrison, Robert G.; Kmiecik, Matthew J.; Garbarino, James; Silton, Rebecca L.

    2015-01-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others’ emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five particip...

  15. Task-irrelevant emotion facilitates face discrimination learning.

    Science.gov (United States)

    Lorenzino, Martina; Caudek, Corrado

    2015-03-01

    We understand poorly how the ability to discriminate faces from one another is shaped by visual experience. The purpose of the present study is to determine whether face discrimination learning can be facilitated by facial emotions. To answer this question, we used a task-irrelevant perceptual learning paradigm because it closely mimics the learning processes that, in daily life, occur without a conscious intention to learn and without an attentional focus on specific facial features. We measured face discrimination thresholds before and after training. During the training phase (4 days), participants performed a contrast discrimination task on face images. They were not informed that we introduced (task-irrelevant) subtle variations in the face images from trial to trial. For the Identity group, the task-irrelevant features were variations along a morphing continuum of facial identity. For the Emotion group, the task-irrelevant features were variations along an emotional expression morphing continuum. The Control group did not undergo contrast discrimination learning and only performed the pre-training and post-training tests, with the same temporal gap between them as the other two groups. Results indicate that face discrimination improved, but only for the Emotion group. Participants in the Emotion group, moreover, showed face discrimination improvements also for stimulus variations along the facial identity dimension, even if these (task-irrelevant) stimulus features had not been presented during training. The present results highlight the importance of emotions for face discrimination learning. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. The ties to unbind: Age-related differences in feature (un)binding in working memory for emotional faces

    Directory of Open Access Journals (Sweden)

    Didem ePehlivanoglu

    2014-04-01

    In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyperbinding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face's emotional expression (expression condition), or both identity and expression of the face (binding condition). Both age groups performed more slowly and with lower accuracy in the expression condition than in the binding condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory, over and beyond age-related differences observed in perceptual processing (0-Back) and attention/short-term memory (1-Back). Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task, and were further magnified in the 2-Back task, indicating independent age-related differences in attention/short-term memory and working memory. Pupil dilation data confirmed that the attention/short-term memory version of the task (1-Back) is more effortful for older adults than for younger adults.
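
    A design like the one above (a between-subjects age group crossed with a within-subjects N-Back condition) is typically analyzed with a mixed ANOVA. The sketch below shows one way to run such an analysis in Python; the input file and column names (subject, age_group, condition, accuracy) are hypothetical, not taken from the study.

      # Minimal sketch: mixed ANOVA with age group (between) and condition (within).
      import pandas as pd
      import pingouin as pg

      nback = pd.read_csv("nback_accuracy.csv")  # one row per subject x condition
      aov = pg.mixed_anova(data=nback, dv="accuracy",
                           within="condition", subject="subject",
                           between="age_group")
      print(aov)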

  17. Emotionally anesthetized: media violence induces neural changes during emotional face processing.

    Science.gov (United States)

    Stockdale, Laura A; Morrison, Robert G; Kmiecik, Matthew J; Garbarino, James; Silton, Rebecca L

    2015-10-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others' emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby require fewer cognitive resources to inhibit behavior. © The Author (2015). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  18. Emotion Recognition in Children With Down Syndrome: Influence of Emotion Label and Expression Intensity.

    Science.gov (United States)

    Cebula, Katie R; Wishart, Jennifer G; Willis, Diane S; Pitcairn, Tom K

    2017-03-01

    Some children with Down syndrome may experience difficulties in recognizing facial emotions, particularly fear, but it is not clear why, nor how such skills can best be facilitated. Using a photo-matching task, emotion recognition was tested in children with Down syndrome, children with nonspecific intellectual disability and cognitively matched, typically developing children (all groups N = 21) under four conditions: veridical vs. exaggerated emotions and emotion-labelling vs. generic task instructions. In all groups, exaggerating emotions facilitated recognition accuracy and speed, with emotion labelling facilitating recognition accuracy. Overall accuracy and speed did not differ in the children with Down syndrome, although recognition of fear was poorer than in the typically developing children and unrelated to emotion label use. Implications for interventions are considered.

  19. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    OpenAIRE

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People are able to simultaneously process multiple dimensions of facial properties. Facial processing models are based on the processing of facial properties. This paper examined the processing of facial emotion, face race and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interfered with face race in all the tasks. The interaction of face race and face gend...

  20. Exposure to childhood adversity and deficits in emotion recognition: results from a large, population-based sample.

    Science.gov (United States)

    Dunn, Erin C; Crawford, Katherine M; Soare, Thomas W; Button, Katherine S; Raffeld, Miriam R; Smith, Andrew D A C; Penton-Voak, Ian S; Munafò, Marcus R

    2018-03-07

    Emotion recognition skills are essential for social communication. Deficits in these skills have been implicated in mental disorders. Prior studies of clinical and high-risk samples have consistently shown that children exposed to adversity are more likely than their unexposed peers to have emotion recognition skills deficits. However, only one population-based study has examined this association. We analyzed data from children participating in the Avon Longitudinal Study of Parents and Children, a prospective birth cohort (n = 6,506). We examined the association between eight adversities, assessed repeatedly from birth to age 8 (caregiver physical or emotional abuse; sexual or physical abuse; maternal psychopathology; one adult in the household; family instability; financial stress; parent legal problems; neighborhood disadvantage) and the ability to recognize facial displays of emotion measured using the faces subtest of the Diagnostic Assessment of Non-Verbal Accuracy (DANVA) at age 8.5 years. In addition to examining the role of exposure (vs. nonexposure) to each type of adversity, we also evaluated the role of the timing, duration, and recency of each adversity using a Least Angle Regression variable selection procedure. Over three-quarters of the sample experienced at least one adversity. We found no evidence to support an association between emotion recognition deficits and previous exposure to adversity, either in terms of total lifetime exposure, timing, duration, or recency, or when stratifying by sex. Results from the largest population-based sample suggest that even extreme forms of adversity are unrelated to emotion recognition deficits as measured by the DANVA, suggesting the possible immutability of emotion recognition in the general population. These findings emphasize the importance of population-based studies to generate generalizable results. © 2018 Association for Child and Adolescent Mental Health.
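
    The variable-selection step described above uses Least Angle Regression (LARS) to evaluate which adversity indicators, and which timing/duration/recency codings, best predict DANVA emotion-recognition scores. The sketch below illustrates that step with scikit-learn's LARS path; the input file and column names are hypothetical, and the sketch omits the covariates and sex-stratified analyses reported in the study.

      # Minimal sketch: order in which adversity predictors enter a LARS model
      # for DANVA emotion-recognition scores.
      import pandas as pd
      from sklearn.linear_model import lars_path
      from sklearn.preprocessing import StandardScaler

      data = pd.read_csv("alspac_subset.csv")  # hypothetical file
      predictors = [c for c in data.columns if c.startswith("adversity_")]
      X = StandardScaler().fit_transform(data[predictors])
      y = data["danva_faces_score"].to_numpy()

      alphas, active, coefs = lars_path(X, y, method="lar")
      # 'active' lists predictor indices in the order LARS adds them to the model
      print("Entry order:", [predictors[i] for i in active])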

  1. The complex duration perception of emotional faces: Effects of face direction

    Directory of Open Access Journals (Sweden)

    Katrin Martina Kliegl

    2015-03-01

    The perceived duration of emotional face stimuli strongly depends on the expressed emotion. But emotional faces also differ regarding a number of other features like gaze, face direction, or sex. Usually, these features have been controlled by only using pictures of female models with straight gaze and face direction. Doi and Shinohara (2009) reported that an overestimation of angry faces could only be found when the model's gaze was oriented towards the observer. We aimed at replicating this effect for face direction. Moreover, we explored the effect of face direction on the duration perception of sad faces. Controlling for the sex of the face model and the participant, female and male participants rated the duration of neutral, angry and sad face stimuli of both sexes photographed from different perspectives in a bisection task. In line with current findings, we report a significant overestimation of angry compared to neutral face stimuli that was modulated by face direction. Moreover, the perceived duration of sad face stimuli did not differ from that of neutral faces and was not influenced by face direction. Furthermore, we found that faces of the opposite sex appeared to last longer than those of the same sex. This outcome is discussed with regard to stimulus parameters like the induced arousal, social relevance and an evolutionary context.

  2. Is emotion recognition the only problem in ADHD? effects of pharmacotherapy on face and emotion recognition in children with ADHD.

    Science.gov (United States)

    Demirci, Esra; Erdogan, Ayten

    2016-12-01

    The objectives of this study were to evaluate both face and emotion recognition, to detect differences among attention deficit and hyperactivity disorder (ADHD) subgroups, to identify effects of gender, and to assess the effects of methylphenidate and atomoxetine treatment on both face and emotion recognition in patients with ADHD. The study sample consisted of 41 male and 29 female patients, 8-15 years of age, who were diagnosed as having combined type ADHD (N = 26), hyperactive/impulsive type ADHD (N = 21) or inattentive type ADHD (N = 23) and had not previously used any medication for ADHD, and of 35 male and 25 female healthy individuals. Long-acting methylphenidate (OROS-MPH) was prescribed to 38 patients, whereas atomoxetine was prescribed to 32 patients. The Reading the Mind in the Eyes Test (RMET) and the Benton Face Recognition Test (BFRT) were administered to all participants before and after treatment. The patients with ADHD had a significantly lower number of correct answers on the child and adolescent RMET and on the BFRT than the healthy controls. Among the ADHD subtypes, the hyperactive/impulsive subtype had a lower number of correct answers on the RMET than the inattentive subtype, and a lower number of correct answers on the short and long forms of the BFRT than the combined and inattentive subtypes. Male and female patients with ADHD did not differ significantly with respect to the number of correct answers on the RMET and BFRT. The patients showed significant improvement on the RMET and BFRT after treatment with OROS-MPH or atomoxetine. Patients with ADHD have difficulties in face recognition as well as emotion recognition. Both OROS-MPH and atomoxetine affect emotion recognition. However, further studies of face and emotion recognition in ADHD are needed.

  3. Emotional Intelligence in Learners with Attention Deficit Disorder

    Science.gov (United States)

    Wootton, Carol Anne; Roets, H. E.

    2013-01-01

    This study was undertaken to analyse and evaluate the nature and quality of emotional intelligence in learners with Attention Deficit Disorder (ADD), and to investigate whether their emotional intelligence was enhanced, and whether the symptoms and behaviour of these learners improved, after exposure to a programme on emotional intelligence.…

  4. Enhanced amygdala reactivity to emotional faces in adults reporting childhood emotional maltreatment

    Science.gov (United States)

    van Tol, Marie-José; Demenescu, Liliana R.; van der Wee, Nic J. A.; Veltman, Dick J.; Aleman, André; van Buchem, Mark A.; Spinhoven, Philip; Penninx, Brenda W. J. H.; Elzinga, Bernet M.

    2013-01-01

    In the context of chronic childhood emotional maltreatment (CEM; emotional abuse and/or neglect), adequately responding to facial expressions is an important skill. Over time, however, this adaptive response may lead to a persistent vigilance for emotional facial expressions. The amygdala and the medial prefrontal cortex (mPFC) are key regions in face processing. However, the neurobiological correlates of face processing in adults reporting CEM are as yet unknown. We examined amygdala and mPFC reactivity to emotional faces (Angry, Fearful, Sad, Happy, Neutral) vs scrambled faces in healthy controls and unmedicated patients with depression and/or anxiety disorders reporting CEM before the age of 16 years (n = 60), and in controls and patients who reported no childhood abuse (n = 75). We found that CEM was associated with enhanced bilateral amygdala reactivity to emotional faces in general, and independent of psychiatric status. Furthermore, we found no support for differential mPFC functioning, suggesting that amygdala hyper-responsivity to emotional facial perception in adults reporting CEM may be independent from top–down influences of the mPFC. These findings may be key in understanding the increased emotional sensitivity and interpersonal difficulties that have been reported in individuals with a history of CEM. PMID:22258799

  5. Brain Structural Correlates of Emotion Recognition in Psychopaths.

    Directory of Open Access Journals (Sweden)

    Vanessa Pera-Guardiola

    Individuals with psychopathy present deficits in the recognition of facial emotional expressions. However, the nature and extent of these alterations are not fully understood. Furthermore, available data on the functional neural correlates of emotional face recognition deficits in adult psychopaths have provided mixed results. In this context, emotional face morphing tasks may be suitable for clarifying mild and emotion-specific impairments in psychopaths. Likewise, studies exploring corresponding anatomical correlates may be useful for disentangling available neurofunctional evidence based on the alleged neurodevelopmental roots of psychopathic traits. We used Voxel-Based Morphometry and a morphed emotional face expression recognition task to evaluate the relationship between regional gray matter (GM) volumes and facial emotion recognition deficits in male psychopaths. In comparison to male healthy controls, psychopaths showed deficits in the recognition of sad, happy and fearful emotional expressions. In subsequent brain imaging analyses, psychopaths with better recognition of facial emotional expressions showed higher volume in the prefrontal cortex (orbitofrontal, inferior frontal and dorsomedial prefrontal cortices), somatosensory cortex, anterior insula, cingulate cortex and the posterior lobe of the cerebellum. Amygdala and temporal lobe volumes contributed to better emotional face recognition in controls only. These findings provide evidence suggesting that variability in brain morphometry plays a role in accounting for psychopaths' impaired ability to recognize emotional face expressions, and may have implications for comprehensively characterizing the empathy and social cognition dysfunctions typically observed in this population of subjects.
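
    The voxel-based morphometry regression described above relates each participant's grey matter map to a behavioral score. A hedged sketch of that kind of second-level analysis, using nilearn, is shown below; the file names, score column, and smoothing choice are hypothetical, and a real analysis would also include covariates (e.g., age, total intracranial volume) and appropriate multiple-comparison correction.

      # Minimal sketch: regress voxelwise grey matter maps on emotion-recognition scores.
      import pandas as pd
      from nilearn.glm.second_level import SecondLevelModel

      gm_maps = [f"sub-{i:02d}_gm_modulated.nii.gz" for i in range(1, 20)]  # hypothetical files
      scores = pd.read_csv("emotion_recognition_scores.csv")["total_correct"]  # hypothetical

      design = pd.DataFrame({"recognition": scores.to_numpy(), "intercept": 1.0})
      model = SecondLevelModel(smoothing_fwhm=8.0).fit(gm_maps, design_matrix=design)
      z_map = model.compute_contrast("recognition", output_type="z_score")
      z_map.to_filename("gm_vs_emotion_recognition_zmap.nii.gz")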

  6. Different Neural Correlates of Emotion-Label Words and Emotion-Laden Words: An ERP Study

    OpenAIRE

    Zhang, Juan; Wu, Chenggang; Meng, Yaxuan; Yuan, Zhen

    2017-01-01

    It is well-documented that both emotion-label words (e.g., sadness, happiness) and emotion-laden words (e.g., death, wedding) can induce emotion activation. However, the neural correlates of emotion-label words and emotion-laden words recognition have not been examined. The present study aimed to compare the underlying neural responses when processing the two kinds of words by employing event-related potential (ERP) measurements. Fifteen Chinese native speakers were asked to perform a lexical...

  7. The Contribution of Deficits in Emotional Clarity to Stress Responses and Depression

    OpenAIRE

    Flynn, Megan; Rudolph, Karen D.

    2010-01-01

    This research investigated the contribution of deficits in emotional clarity to children’s socioemotional adjustment. Specifically, this study examined the proposal that deficits in emotional clarity are associated with maladaptive interpersonal stress responses, and that maladaptive interpersonal stress responses act as a mechanism linking deficits in emotional clarity to childhood depressive symptoms. Participants included 345 3rd graders (M age = 8.89, SD = .34) assessed at two waves, appr...

  8. Americans and Palestinians judge spontaneous facial expressions of emotion.

    Science.gov (United States)

    Kayyal, Mary H; Russell, James A

    2013-10-01

    The claim that certain emotions are universally recognized from facial expressions is based primarily on the study of expressions that were posed. The current study examined spontaneous facial expressions shown by aborigines in Papua New Guinea (Ekman, 1980): 17 faces claimed to convey one (or, in the case of blends, two) basic emotions and five faces claimed to show other universal feelings. For each face, participants rated the degree to which each of the 12 predicted emotions or feelings was conveyed. The modal choice for English-speaking Americans (n = 60), English-speaking Palestinians (n = 60), and Arabic-speaking Palestinians (n = 44) was the predicted label for only 4, 5, and 4, respectively, of the 17 faces for basic emotions, and for only 2, 2, and 2, respectively, of the 5 faces for other feelings. Observers endorsed the predicted emotion or feeling moderately often (65%, 55%, and 44%), but also denied it moderately often (35%, 45%, and 56%). They also endorsed more than one (or, for blends, two) label(s) for each face: on average, 2.3, 2.3, and 1.5 of the basic emotions and 2.6, 2.2, and 1.5 of the other feelings. There were both similarities and differences across culture and language, but the emotional meaning of a facial expression is not well captured by the predicted label(s) or, indeed, by any single label.

  9. Seeing emotion with your ears: emotional prosody implicitly guides visual attention to faces.

    Directory of Open Access Journals (Sweden)

    Simon Rigoulot

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.

  10. Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Science.gov (United States)

    Rigoulot, Simon; Pell, Marc D.

    2012-01-01

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions. PMID:22303454

  11. Emotion recognition in borderline personality disorder: effects of emotional information on negative bias.

    Science.gov (United States)

    Fenske, Sabrina; Lis, Stefanie; Liebke, Lisa; Niedtfeld, Inga; Kirsch, Peter; Mier, Daniela

    2015-01-01

    Borderline Personality Disorder (BPD) is characterized by severe deficits in social interactions, which might be linked to deficits in emotion recognition. Research on emotion recognition abilities in BPD has revealed heterogeneous results, ranging from deficits to heightened sensitivity. The most stable findings point to an impairment in the evaluation of neutral facial expressions as neutral, as well as to a negative bias in emotion recognition; that is, the tendency to attribute negative emotions to neutral expressions or, in a broader sense, to report a more negative emotion category than the one depicted. However, it remains unclear which contextual factors influence the occurrence of this negative bias. Previous studies suggest that priming by preceding emotional information and also constrained processing time might augment the emotion recognition deficit in BPD. To test these assumptions, 32 female BPD patients and 31 healthy females, matched for age and education, participated in an emotion recognition study in which every facial expression was preceded by either a positive, neutral or negative scene. Furthermore, time constraints for processing were varied by presenting the facial expressions with short (100 ms) or long (up to 3000 ms) durations in two separate blocks. BPD patients showed a significant deficit in emotion recognition for neutral and positive facial expressions, associated with a significant negative bias. In BPD patients, this emotion recognition deficit was differentially affected by preceding emotional information and time constraints, with a greater influence of emotional information during long face presentations and a greater influence of neutral information during short face presentations. Our results are in line with previous findings supporting the existence of a negative bias in emotion recognition in BPD patients, and provide further insights into biased social perceptions in BPD patients.
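
    The negative bias described above is commonly quantified as the proportion of neutral (or non-negative) expressions to which a participant assigns a negative emotion label. The sketch below shows one simple way to compute such an index from trial-level data; the input file, label set, and column names are hypothetical.

      # Minimal sketch: per-group "negative bias" rate for neutral faces.
      import pandas as pd

      trials = pd.read_csv("emotion_recognition_trials.csv")  # hypothetical trial-level data
      negative_labels = {"anger", "fear", "sadness", "disgust"}

      neutral = trials[trials["true_emotion"] == "neutral"].copy()
      neutral["neg_response"] = neutral["response"].isin(negative_labels)
      bias = (neutral.groupby(["group", "subject"])["neg_response"].mean()
                     .groupby("group").agg(["mean", "std"]))
      print(bias)  # mean and SD of the negative-bias rate per group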

  12. Emotion recognition training using composite faces generalises across identities but not all emotions.

    Science.gov (United States)

    Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S

    2017-08-01

    Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities. However, our results indicated only partial generalisation across emotions. These findings suggest effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.

  13. Differential emotion attribution to neutral faces of own and other races.

    Science.gov (United States)

    Hu, Chao S; Wang, Qiandong; Han, Tong; Weare, Ethan; Fu, Genyue

    2017-02-01

    Past research has demonstrated differential recognition of emotion on faces of different races. This paper reports the first study to explore differential emotion attribution to neutral faces of different races. Chinese and Caucasian adults viewed a series of Chinese and Caucasian neutral faces and judged their outward facial expression: neutral, positive, or negative. The results showed that both Chinese and Caucasian viewers perceived more Chinese faces than Caucasian faces as neutral. Nevertheless, Chinese viewers attributed positive emotion to Caucasian faces more than to Chinese faces, whereas Caucasian viewers attributed negative emotion to Caucasian faces more than to Chinese faces. Moreover, Chinese viewers attributed negative and neutral emotion to the faces of both races without significant difference in frequency, whereas Caucasian viewers mostly attributed neutral emotion to the faces. These differences between Chinese and Caucasian viewers may be due to differential visual experience, culture, racial stereotype, or expectation of the experiment. We also used eye tracking among the Chinese participants to explore the relationship between face-processing strategy and emotion attribution to neutral faces. The results showed that the interaction between emotion attribution and face race was significant on face-processing strategy, such as fixation proportion on eyes and saccade amplitude. Additionally, pupil size during processing Caucasian faces was larger than during processing Chinese faces.

  14. Serotonergic modulation of face-emotion recognition

    Directory of Open Access Journals (Sweden)

    C.M. Del-Ben

    2008-04-01

    Facial expressions of basic emotions have been widely used to investigate the neural substrates of emotion processing, but little is known about the exact meaning of subjective changes provoked by perceiving facial expressions. Our assumption was that fearful faces would be related to the processing of potential threats, whereas angry faces would be related to the processing of proximal threats. Experimental studies have suggested that serotonin modulates the brain processes underlying defensive responses to environmental threats, facilitating risk assessment behavior elicited by potential threats and inhibiting fight or flight responses to proximal threats. In order to test these predictions about the relationship between fearful and angry faces and defensive behaviors, we carried out a review of the literature about the effects of pharmacological probes that affect 5-HT-mediated neurotransmission on the perception of emotional faces. The hypothesis that angry faces would be processed as a proximal threat and that, as a consequence, their recognition would be impaired by an increase in 5-HT function was not supported by the results reviewed. In contrast, most of the studies that evaluated the behavioral effects of serotonin challenges showed that increased 5-HT neurotransmission facilitates the recognition of fearful faces, whereas its decrease impairs the same performance. These results agree with the hypothesis that fearful faces are processed as potential threats and that 5-HT enhances this brain processing.

  15. Emotion-processing deficit in alexithymia.

    Science.gov (United States)

    Roedema, T M; Simons, R F

    1999-05-01

    College undergraduates were identified as alexithymic or control, based on their scores on the Toronto Alexithymia Scale (TAS; Taylor, Ryan, & Bagby, 1985). All subjects were presented with standardized emotion-eliciting color slides for 6 s while facial muscle, heart rate, and skin conductance activity were recorded. Stimuli were presented a second time while subjects were asked to provide emotion self-reports using a paper-and-pencil version of the Self-Assessment Manikin (SAM; Lang, 1980) and to generate a list of words describing their emotional reaction to each slide. Consistent with the definition of alexithymia as a syndrome characterized, in part, by a deficit in the identification of emotion states, high-TAS subjects supplied fewer emotion-related words than did controls to describe their response to the slides. Alexithymics also indicated less variation along the arousal dimension of the SAM, produced fewer specific skin conductance responses, and showed less heart rate deceleration to the slides, regardless of category. No valence-related differences between alexithymic and control subjects were noted.

  16. Configuration perception and face memory, and face context effects in developmental prosopagnosia.

    Science.gov (United States)

    Huis in 't Veld, Elisabeth; Van den Stock, Jan; de Gelder, Beatrice

    2012-01-01

    This study addresses two central and controversial issues in developmental prosopagnosia (DP), configuration- versus feature-based face processing and the influence of affective information from either facial or bodily expressions on face recognition. A sample of 10 DPs and 10 controls were tested with a previously developed face and object recognition and memory battery (Facial Expressive Action Stimulus Test, FEAST), a task measuring the influence of emotional faces and bodies on face identity matching (Face-Body Compound task), and an emotionally expressive face memory task (Emotional Face Memory task, FaMe-E). We show that DPs were impaired in upright, but not inverted, face matching but they performed at the level of controls on part-to-whole matching. Second, DPs showed impaired memory for both neutral and emotional faces and scored within the normal range on the Face-Body Compound task. Third, configural perception but not feature-based processing was significantly associated with memory performance. Taken together the results indicate that DPs have a deficit in configural processing at the perception stage that may underlie the memory impairment.

  17. Face shape and face identity processing in behavioral variant fronto-temporal dementia: A specific deficit for familiarity and name recognition of famous faces.

    Science.gov (United States)

    De Winter, François-Laurent; Timmers, Dorien; de Gelder, Beatrice; Van Orshoven, Marc; Vieren, Marleen; Bouckaert, Miriam; Cypers, Gert; Caekebeke, Jo; Van de Vliet, Laura; Goffin, Karolien; Van Laere, Koen; Sunaert, Stefan; Vandenberghe, Rik; Vandenbulcke, Mathieu; Van den Stock, Jan

    2016-01-01

    Deficits in face processing have been described in the behavioral variant of fronto-temporal dementia (bvFTD), primarily regarding the recognition of facial expressions. Less is known about face shape and face identity processing. Here we used a hierarchical strategy targeting face shape and face identity recognition in bvFTD and matched healthy controls. Participants performed 3 psychophysical experiments targeting face shape detection (Experiment 1), unfamiliar face identity matching (Experiment 2), and familiarity categorization and famous face-name matching (Experiment 3). The results revealed group differences only in Experiment 3, with a deficit in the bvFTD group for both familiarity categorization and famous face-name matching. Voxel-based morphometry regression analyses in the bvFTD group revealed an association between grey matter volume of the left ventral anterior temporal lobe and familiarity recognition, while face-name matching correlated with grey matter volume of the bilateral ventral anterior temporal lobes. Subsequently, we quantified familiarity-specific and name-specific recognition deficits as the sum of the celebrities of which respectively only the name or only the familiarity was accurately recognized. Both indices were associated with grey matter volume of the bilateral anterior temporal cortices. These findings extend previous results by documenting the involvement of the left anterior temporal lobe (ATL) in familiarity detection and the right ATL in name recognition deficits in fronto-temporal lobar degeneration.

  18. New tests to measure individual differences in matching and labelling facial expressions of emotion, and their association with ability to recognise vocal emotions and facial identity.

    Science.gov (United States)

    Palermo, Romina; O'Connor, Kirsty B; Davis, Joshua M; Irons, Jessica; McKone, Elinor

    2013-01-01

    Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing "individual differences"--that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach's alphas of .77 and .76, respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity).
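
    For readers unfamiliar with the reliability statistic quoted above, Cronbach's alpha can be computed directly from a participants-by-items score matrix. The short function below is a generic illustration (the demo data are random, not the test data from this study).

      # Minimal sketch: Cronbach's alpha for a (participants x items) score matrix.
      import numpy as np

      def cronbach_alpha(items: np.ndarray) -> float:
          """items: 2-D array, shape (n_participants, n_items)."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_variances = items.var(axis=0, ddof=1).sum()
          total_variance = items.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1.0 - item_variances / total_variance)

      rng = np.random.default_rng(0)
      shared = rng.normal(size=(100, 1))                      # common "ability" factor
      demo = shared + rng.normal(scale=1.5, size=(100, 40))   # 40 noisy items
      print(f"alpha = {cronbach_alpha(demo):.2f}")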

  19. New Tests to Measure Individual Differences in Matching and Labelling Facial Expressions of Emotion, and Their Association with Ability to Recognise Vocal Emotions and Facial Identity

    Science.gov (United States)

    Palermo, Romina; O’Connor, Kirsty B.; Davis, Joshua M.; Irons, Jessica; McKone, Elinor

    2013-01-01

    Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing “individual differences” – that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach’s alphas of .77 and .76, respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity). PMID:23840821

  20. 5-HTTLPR differentially predicts brain network responses to emotional faces

    DEFF Research Database (Denmark)

    Fisher, Patrick M; Grady, Cheryl L; Madsen, Martin K

    2015-01-01

    The effects of the 5-HTTLPR polymorphism on neural responses to emotionally salient faces have been studied extensively, focusing on amygdala reactivity and amygdala-prefrontal interactions. Despite compelling evidence that emotional face paradigms engage a distributed network of brain regions involved in emotion, cognitive and visual processing, less is known about 5-HTTLPR effects on broader network responses. To address this, we evaluated 5-HTTLPR differences in the whole-brain response to an emotional faces paradigm including neutral, angry and fearful faces using functional magnetic resonance imaging... The response... to fearful faces was significantly greater in S' carriers compared to LA LA individuals. These findings provide novel evidence for emotion-specific 5-HTTLPR effects on the response of a distributed set of brain regions including areas responsive to emotionally salient stimuli and critical components...

  1. The Contribution of Deficits in Emotional Clarity to Stress Responses and Depression

    Science.gov (United States)

    Flynn, Megan; Rudolph, Karen D.

    2010-01-01

    This research investigated the contribution of deficits in emotional clarity to children's socioemotional adjustment. Specifically, this study examined the proposal that deficits in emotional clarity are associated with maladaptive interpersonal stress responses, and that maladaptive interpersonal stress responses act as a mechanism linking…

  2. Free-Labeling Facial Expressions and Emotional Situations in Children Aged 3-7 Years: Developmental Trajectory and a Face Inferiority Effect

    Science.gov (United States)

    Wang, Zhenhong; Lü, Wei; Zhang, Hui; Surina, Alyssa

    2014-01-01

    Chinese children (N = 185, aged 3-7 years) were assessed on their abilities to freely label facial expressions and emotional situations. Results indicated that the overall accuracy of free-labeling facial expressions increased relatively quickly in children aged 3-5 years, but slowed down in children aged 5-7 years. In contrast, the overall…

  3. The Effect of Self-Referential Expectation on Emotional Face Processing.

    Directory of Open Access Journals (Sweden)

    Mel McKendrick

    The role of self-relevance has been somewhat neglected in static face processing paradigms but may be important in understanding how emotional faces impact on attention, cognition and affect. The aim of the current study was to investigate the effect of self-relevant primes on processing emotional composite faces. Sentence primes created an expectation of the emotion of the face before sad, happy, neutral or composite face photos were viewed. Eye movements were recorded and subsequent responses measured the cognitive and affective impact of the emotion expressed. Results indicated that primes did not guide attention, but impacted on judgments of valence intensity and self-esteem ratings. Negative self-relevant primes led to the most negative self-esteem ratings, although the effect of the prime was qualified by salient facial features. Self-relevant expectations about the emotion of a face and subsequent attention to a face that is congruent with these expectations strengthened the affective impact of viewing the face.

  4. Emotion regulation in mothers and young children faced with trauma.

    Science.gov (United States)

    Pat-Horenczyk, Ruth; Cohen, S; Ziv, Y; Achituv, M; Asulin-Peretz, L; Blanchard, T R; Schiff, M; Brom, D

    2015-01-01

    The present study investigated maternal emotion regulation as mediating the association between maternal posttraumatic stress symptoms and children's emotional dysregulation in a community sample of 431 Israeli mothers and children exposed to trauma. Little is known about the specific pathways through which maternal posttraumatic symptoms and deficits in emotion regulation contribute to emotional dysregulation. Inspired by the intergenerational process of relational posttraumatic stress disorder (PTSD), in which posttraumatic distress is transmitted from mothers to children, we suggest an analogous concept of relational emotion regulation, by which maternal emotion regulation problems may contribute to child emotion regulation deficits. Child emotion regulation problems were measured using the Child Behavior Checklist-Dysregulation Profile (CBCL-DP; T.M. Achenbach & I. Rescorla, 2000), which is comprised of three subscales of the CBCL: Attention, Aggression, and Anxiety/Depression. Maternal PTSD symptoms were assessed by the Posttraumatic Diagnostic Scale (E.B. Foa, L. Cashman, L. Jaycox, & K. Perry, 1997) and maternal emotion regulation by the Difficulties in Emotion Regulation Scale (K.L. Gratz & L. Roemer, 2004). Results showed that the child's emotion regulation problems were associated with both maternal posttraumatic symptoms and maternal emotion dysregulation. Further, maternal emotion regulation mediated the association between maternal posttraumatic symptoms and the child's regulation deficits. These findings highlight the central role of mothers' emotion regulation skills in the aftermath of trauma as it relates to children's emotion regulation skills. The degree of mothers' regulatory skills in the context of posttraumatic stress symptoms reflects a key process through which the intergenerational transmission of trauma may occur. Study results have critical implications for planning and developing clinical interventions geared toward the treatment of

  5. Emotional Faces Capture Spatial Attention in 5-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Kit K. Elam

    2010-10-01

    Emotional facial expressions are important social cues that convey salient affective information. Infants, younger children, and adults all appear to orient spatial attention to emotional faces, with a particularly strong bias to fearful faces. Yet in young children it is unclear whether or not both happy and fearful faces capture attention. Given that the processing of emotional faces is believed by some to serve an evolutionarily adaptive purpose, attentional biases to both fearful and happy expressions would be expected in younger children. However, the extent to which this ability is present in young children and whether or not this ability is genetically mediated is untested. Therefore, the aims of the current study were to assess the spatial-attentional properties of emotional faces in young children, with a preliminary test of whether this effect was influenced by genetics. Five-year-old twin pairs performed a dot-probe task. The results suggest that children preferentially direct spatial attention to emotional faces, particularly right visual field faces. The results provide support for the notion that the direction of spatial attention to emotional faces serves an evolutionarily adaptive function and may be mediated by genetic mechanisms.
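
    In a dot-probe task like the one above, attentional bias toward emotional faces is usually summarized as the difference in reaction time between incongruent trials (probe opposite the emotional face) and congruent trials (probe behind the emotional face). The sketch below illustrates that computation; the input file, trial filters, and column names are hypothetical.

      # Minimal sketch: dot-probe attentional bias score per child.
      import pandas as pd

      dp = pd.read_csv("dot_probe_trials.csv")                   # hypothetical trial-level data
      valid = dp[(dp["correct"] == 1) & dp["rt_ms"].between(150, 1500)]

      rts = (valid.groupby(["subject", "congruency"])["rt_ms"].mean()
                  .unstack("congruency"))
      rts["bias_score"] = rts["incongruent"] - rts["congruent"]  # positive = attention captured
      print(rts["bias_score"].describe())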

  6. Face and emotion recognition deficits in Turner syndrome: a possible role for X-linked genes in amygdala development.

    Science.gov (United States)

    Lawrence, Kate; Kuntsi, Jonna; Coleman, Michael; Campbell, Ruth; Skuse, David

    2003-01-01

    Face recognition is thought to rely on configural visual processing. Where face recognition impairments have been identified, qualitatively delayed or anomalous configural processing has also been found. A group of women with Turner syndrome (TS) with monosomy for a single maternal X chromosome (45, Xm) showed an impairment in face recognition skills compared with normally developing women. However, normal configural face-processing abilities were apparent. The ability to recognize facial expressions of emotion, particularly fear, was also impaired in this TS subgroup. Face recognition and fear recognition accuracy were significantly correlated in the female control group but not in women with TS. The authors therefore suggest that anomalies in amygdala function may be a neurological feature of TS of this karyotype.

  7. Men appear more lateralized when noticing emotion in male faces.

    Science.gov (United States)

    Rahman, Qazi; Anchassi, Tarek

    2012-02-01

    Empirical tests of the "right hemisphere dominance" versus "valence" theories of emotion processing are confounded by known sex differences in lateralization. Moreover, information about the sex of the person posing an emotion might be processed differently by men and women because of an adaptive male bias to notice expressions of threat and vigilance in other male faces. The purpose of this study was to investigate whether sex of poser and emotion displayed influenced lateralization in men and women by analyzing "laterality quotient" scores on a test which depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression. We found that men (N = 50) were significantly more lateralized for emotions indicative of vigilance and threat (happy, sad, angry, and surprised) in male faces relative to female faces and compared to women (N = 44). These data indicate that sex differences in functional cerebral lateralization for facial emotion may be specific to the emotion presented and the sex of face presenting it. PsycINFO Database Record (c) 2012 APA, all rights reserved

  8. Altered Kinematics of Facial Emotion Expression and Emotion Recognition Deficits Are Unrelated in Parkinson's Disease

    OpenAIRE

    Bologna, Matteo; Berardelli, Isabella; Paparella, Giulia; Marsili, Luca; Ricciardi, Lucia; Fabbrini, Giovanni; Berardelli, Alfredo

    2016-01-01

    Background: Altered emotional processing, including reduced emotion facial expression and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques. It is not known whether altered facial expression and recognition in PD are related. Objective: To investigate possible deficits in facial emotion expression and emotion recognition and their...

  9. Short-term memory for emotional faces in dysphoria.

    Science.gov (United States)

    Noreen, Saima; Ridout, Nathan

    2010-07-01

    The study aimed to determine if the memory bias for negative faces previously demonstrated in depression and dysphoria generalises from long- to short-term memory. A total of 29 dysphoric (DP) and 22 non-dysphoric (ND) participants were presented with a series of faces and asked to identify the emotion portrayed (happiness, sadness, anger, or neutral affect). Following a delay, four faces were presented (the original plus three distractors) and participants were asked to identify the target face. Half of the trials assessed memory for facial emotion, and the remaining trials examined memory for facial identity. At encoding, no group differences were apparent. At memory testing, relative to ND participants, DP participants exhibited impaired memory for all types of facial emotion and for facial identity when the faces featured happiness, anger, or neutral affect, but not sadness. DP participants exhibited impaired identity memory for happy faces relative to angry, sad, and neutral, whereas ND participants exhibited enhanced facial identity memory when faces were angry. In general, memory for faces was not related to performance at encoding. However, in DP participants only, memory for sad faces was related to sadness recognition at encoding. The results suggest that the negative memory bias for faces in dysphoria does not generalise from long- to short-term memory.

  10. Neurophysiological evidence (ERPs) for hemispheric processing of facial expressions of emotions: Evidence from whole face and chimeric face stimuli.

    Science.gov (United States)

    Damaskinou, Nikoleta; Watling, Dawn

    2018-05-01

    This study was designed to investigate the patterns of electrophysiological responses of early emotional processing at frontocentral sites in adults and to explore whether adults' activation patterns show hemispheric lateralization for facial emotion processing. Thirty-five adults viewed full face and chimeric face stimuli. After viewing two faces, sequentially, participants were asked to decide which of the two faces was more emotive. The findings from the standard faces and the chimeric faces suggest that emotion processing is present during the early phases of face processing in the frontocentral sites. In particular, sad emotional faces are processed differently than neutral and happy (including happy chimeras) faces in these early phases of processing. Further, there were differences in the electrode amplitudes over the left and right hemisphere, particularly in the early temporal window. This research provides supporting evidence that the chimeric face test is a test of emotion processing that elicits right hemispheric processing.

  11. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.

    Science.gov (United States)

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 are more affected by the emotional value of music administered in the emotional go/no-go task, and that this bias is also apparent in responses to the non-target emotional face. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.

  12. Processing of emotional faces in social phobia

    Directory of Open Access Journals (Sweden)

    Nicole Kristjansen Rosenberg

    2011-02-01

    Full Text Available Previous research has found that individuals with social phobia differ from controls in their processing of emotional faces. For instance, people with social phobia show increased attention to briefly presented threatening faces. However, when exposure times are increased, the direction of this attentional bias is more unclear. Studies investigating eye movements have found both increased as well as decreased attention to threatening faces in socially anxious participants. The current study investigated eye movements to emotional faces in eight patients with social phobia and 34 controls. Three different tasks with different exposure durations were used, which allowed for an investigation of the time course of attention. At the early time interval, patients showed a complex pattern of both vigilance and avoidance of threatening faces. At the longest time interval, patients avoided the eyes of sad, disgust, and neutral faces more than controls, whereas there were no group differences for angry faces.

  13. Detecting and Categorizing Fleeting Emotions in Faces

    Science.gov (United States)

    Sweeny, Timothy D.; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A.

    2013-01-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d′ analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PMID:22866885
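
    The d′ analysis mentioned above comes from signal detection theory: sensitivity is the difference between the z-transformed hit and false-alarm rates. The Python sketch below is a generic illustration of that computation with a simple correction for extreme rates, not the authors' exact procedure.

        from scipy.stats import norm

        def d_prime(hits, misses, false_alarms, correct_rejections):
            """d' = z(hit rate) - z(false-alarm rate), with a small correction
            that keeps the estimated rates away from 0 and 1."""
            hit_rate = (hits + 0.5) / (hits + misses + 1)
            fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
            return norm.ppf(hit_rate) - norm.ppf(fa_rate)

        # e.g., calling an expression "angry" when it is angry (hits)
        # versus when it is actually happy (false alarms)
        print(round(d_prime(42, 8, 12, 38), 2))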

  14. A new look at emotion perception: Concepts speed and shape facial emotion recognition.

    Science.gov (United States)

    Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil

    2015-10-01

    Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels-which correspond to discrete emotion concepts-affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia-who have difficulty labeling their own emotions-struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology. (c) 2015 APA, all rights reserved).

  15. Specificity of emotion regulation deficits in social anxiety: an internet study.

    Science.gov (United States)

    Rusch, Silke; Westermann, Stefan; Lincoln, Tania M

    2012-09-01

    There is evidence for an association between social anxiety and emotion regulation difficulties. This study investigates whether emotion regulation difficulties are specific to two domains of social anxiety. An explorative study was conducted to examine the associations between emotion regulation facets and social anxiety in the normal population. N = 149 healthy volunteers participated in an internet-based survey. Emotion regulation deficits were measured by the Difficulties in Emotion Regulation Scale which consists of six subscales. Social anxiety was measured by the Social Phobia Scale and the Social Interaction Anxiety Scale. Hierarchical regression analyses showed that anxiety of interactive social situations is associated with non-acceptance of negative emotions, impulse control difficulties, and lack of functional emotion regulation strategies over and above the impact of age and general psychopathology. In contrast, anxiety of being observed by others was not specifically associated with emotion regulation strategies. The results support the hypothesis that specific emotion regulation deficits are relevant to specific aspects of social anxiety. Implications for further research and therapy are discussed. © 2011 The British Psychological Society.

  16. Emotional facial expressions reduce neural adaptation to face identity.

    Science.gov (United States)

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

    In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems sustained with emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record the evoked steady-state visual potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to the same when compared with different face identities. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.

  17. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias

    Directory of Open Access Journals (Sweden)

    Sara Invitto

    2017-08-01

    Full Text Available Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 are more affected by the emotional value of music administered in the emotional go/no-go task, and that this bias is also apparent in responses to the non-target emotional face. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.

  18. Detecting and categorizing fleeting emotions in faces.

    Science.gov (United States)

    Sweeny, Timothy D; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A

    2013-02-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d' analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  19. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    Science.gov (United States)

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

    The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter was assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. State-dependent alteration in face emotion recognition in depression.

    Science.gov (United States)

    Anderson, Ian M; Shippen, Clare; Juhasz, Gabriella; Chase, Diana; Thomas, Emma; Downey, Darragh; Toth, Zoltan G; Lloyd-Williams, Kathryn; Elliott, Rebecca; Deakin, J F William

    2011-04-01

    Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data after receiving a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.

  1. Emotion categorization does not depend on explicit face categorization

    NARCIS (Netherlands)

    Seirafi, M.; de Weerd, P.; de Gelder, B.

    2013-01-01

    Face perception and emotion recognition have been extensively studied in the past decade; however, the relation between them is still poorly understood. A traditional view is that successful emotional categorization requires categorization of the stimulus as a 'face', at least at the basic level.

  2. Digitizing the moving face: asymmetries of emotion and gender

    Directory of Open Access Journals (Sweden)

    Ashish Desai

    2009-04-01

    Full Text Available In a previous study with dextral males, Richardson and Bowers (1999) digitized real time video signals and found movement asymmetries over the left lower face for emotional, but not non-emotional expressions. These findings correspond to observations, based on subjective ratings of static pictures, that the left side of the face is more intensely expressive than the right (Sackeim, 1978). From a neuropsychological perspective, one possible interpretation of these findings is that emotional priming of the right hemisphere of the brain results in more muscular activity over the contralateral left than ipsilateral right side of the lower face. The purpose of the present study was to use computer-imaging methodology to determine whether there were gender differences in movement asymmetries across the face. We hypothesized that females would show less evidence of facial movement asymmetries during the expression of emotion. This hypothesis was based on findings of gender differences in the degree to which specific cognitive functions may be lateralized in the brain (i.e., females less lateralized than males). Forty-eight normal dextral college students (25 females, 23 males) were videotaped while they displayed voluntary emotional expressions. A quantitative measure of movement change (called entropy) was computed by subtracting the values of corresponding pixel intensities between adjacent frames and summing their differences. The upper and lower hemiface regions were examined separately due to differences in the cortical innervation of facial muscles in the upper (bilateral) versus lower face (contralateral). Repeated measures ANOVAs were used to analyze the amount of overall facial movement and facial asymmetries. Certain emotions were associated with significantly greater overall facial movement than others (… > fear > (angry = sad) > neutral). Both males and females showed this same pattern, with no gender differences in the total amount of facial

  3. Emotion Recognition of Weblog Sentences Based on an Ensemble Algorithm of Multi-label Classification and Word Emotions

    Science.gov (United States)

    Li, Ji; Ren, Fuji

    Weblogs have greatly changed the ways in which people communicate. Affective analysis of blog posts is found valuable for many applications such as text-to-speech synthesis or computer-assisted recommendation. Traditional emotion recognition in text based on single-label classification cannot satisfy the higher requirements of affective computing. In this paper, the automatic identification of sentence emotion in weblogs is modeled as a multi-label text categorization task. Experiments are carried out on 12273 blog sentences from the Chinese emotion corpus Ren_CECps with 8-dimension emotion annotation. An ensemble algorithm, RAKEL, is used to recognize dominant emotions from the writer's perspective. Our emotion feature using detailed intensity representation for word emotions outperforms the other main features such as the word frequency feature and the traditional lexicon-based feature. In order to deal with relatively complex sentences, we integrate grammatical characteristics of punctuation, disjunctive connectives, modification relations and negation into the features. This achieves 13.51% and 12.49% increases in Micro-averaged F1 and Macro-averaged F1, respectively, compared to the traditional lexicon-based feature. Results show that multi-dimensional emotion representation with grammatical features can efficiently classify sentence emotion in a multi-label problem.
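
    The pipeline described above treats each sentence as potentially carrying several emotions at once. The Python sketch below shows the general shape of such a multi-label setup, using simple binary relevance (one classifier per emotion) with scikit-learn as a stand-in for the paper's RAKEL ensemble and intensity-lexicon features; the sentences and labels are invented toy examples.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.multiclass import OneVsRestClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import MultiLabelBinarizer

        sentences = [
            "I finally passed the exam, what a relief!",
            "The train was late again and nobody apologized.",
            "I can't believe she said that to me...",
        ]
        emotions = [{"joy"}, {"anger"}, {"anger", "surprise"}]  # multi-label annotations

        mlb = MultiLabelBinarizer()
        Y = mlb.fit_transform(emotions)  # binary indicator matrix: sentences x emotions

        model = make_pipeline(
            TfidfVectorizer(ngram_range=(1, 2)),   # word-frequency features, not the paper's intensity lexicon
            OneVsRestClassifier(LogisticRegression(max_iter=1000)),
        )
        model.fit(sentences, Y)

        predicted = model.predict(["Late again... unbelievable."])
        print(mlb.inverse_transform(predicted))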

  4. Cognitive deficits in bipolar disorders: Implications for emotion.

    Science.gov (United States)

    Lima, Isabela M M; Peckham, Andrew D; Johnson, Sheri L

    2018-02-01

    discuss focuses on response inhibition, or the ability to suppress a prepotent response, which is considered a subtype of inhibition. Some tests measure multiple facets of executive function; for example the Trails B test likely requires working memory and cognitive flexibility (Sánchez-Cubillo et al., 2009). Aside from executive function, multiple other facets of cognition have been widely studied in bipolar disorder. Verbal and non-verbal memory are related to the ability to register, store and retrieve verbal or visual information (Lezak, 1995). Verbal fluency is measured as the number of verbal responses a person can generate to a given target, such as a specific semantic category (e.g., animals, furniture) or phonetic category (e.g., words that begin with letter F) (Diamond, 2013). Although cognitive tasks have been designed to evaluate these specific functions, it is important to note that most measures are highly inter-correlated and may assess multiple overlapping functions to some extent (for example, the Trails B test is often described as an "executive function" task, although this task likely involves both working memory and cognitive flexibility. Not surprisingly, then, some authors label the function of certain tests differently, and this is particularly evident in meta-analyses of cognition. As we describe findings in this paper, we will use the terms proposed by the authors but will also identify key tests used to define a cognitive construct. With this background in mind, we turn to a discussion of cognitive deficits, then of emotion-related traits. Our hope is that those concise summaries provide evidence for the importance of both domains, but also specificity regarding the facets of emotion and cognition that are most impaired in BD. This specificity then guides our consideration of models that integrate cognition and emotion. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Positive Emotional Engagement and Autism Risk

    Science.gov (United States)

    Lambert-Brown, Brittany L.; McDonald, Nicole M.; Mattson, Whitney I.; Martin, Katherine B.; Ibañez, Lisa V.; Stone, Wendy L.; Messinger, Daniel S.

    2015-01-01

    Positive emotional engagement develops in the context of face-to-face interactions during the first 6 months of life. Deficits in emotional engagement are characteristic of autism spectrum disorder (ASD) and may characterize the younger siblings of children with ASD (high-risk siblings). High-risk siblings are likely to exhibit a broad range of…

  6. Emotional facial expressions differentially influence predictions and performance for face recognition.

    Science.gov (United States)

    Nomi, Jason S; Rhodes, Matthew G; Cleary, Anne M

    2013-01-01

    This study examined how participants' predictions of future memory performance are influenced by emotional facial expressions. Participants made judgements of learning (JOLs) predicting the likelihood that they would correctly identify a face displaying a happy, angry, or neutral emotional expression in a future two-alternative forced-choice recognition test of identity (i.e., recognition that a person's face was seen before). JOLs were higher for studied faces with happy and angry emotional expressions than for neutral faces. However, neutral test faces with studied neutral expressions had significantly higher identity recognition rates than neutral test faces studied with happy or angry expressions. Thus, these data are the first to demonstrate that people believe happy and angry emotional expressions will lead to better identity recognition in the future relative to neutral expressions. This occurred despite the fact that neutral expressions elicited better identity recognition than happy and angry expressions. These findings contribute to the growing literature examining the interaction of cognition and emotion.

  7. State anxiety and emotional face recognition in healthy volunteers

    OpenAIRE

    Attwood, Angela S.; Easey, Kayleigh E.; Dalili, Michael N.; Skinner, Andrew L.; Woods, Andy; Crick, Lana; Ilett, Elizabeth; Penton-Voak, Ian S.; Munafò, Marcus R.

    2017-01-01

    High trait anxiety has been associated with detriments in emotional face processing. By contrast, relatively little is known about the effects of state anxiety on emotional face processing. We investigated the effects of state anxiety on recognition of emotional expressions (anger, sadness, surprise, disgust, fear and happiness) experimentally, using the 7.5% carbon dioxide (CO2) model to induce state anxiety, and in a large observational study. The experimental studies indicated reduced glob...

  8. Factual text and emotional pictures: overcoming a false dichotomy of cigarette warning labels.

    Science.gov (United States)

    Popova, Lucy; Owusu, Daniel; Jenson, Desmond; Neilands, Torsten B

    2017-04-20

    In reviewing the first set of pictorial warning labels in the USA, the courts equated textual labels with facts and information, and images with emotion. This study tested the differences in perceived informativeness and emotion between textual and pictorial cigarette warning labels. An online study was conducted with 1838 US adults who were non-smokers (n=764), transitioning smokers (quit smoking in the past 2 years or currently trying to quit, n=505) or current smokers (n=569). Each participant evaluated 9 out of 81 text and pictorial cigarette warning labels. Participants reported to what extent they perceived the label as informative and factual and the negative emotions they felt while looking at each label. We used linear mixed models to account for the nesting of multiple observations within each participant. There were no significant differences in perceived informativeness between textual (mean 6.15 on a 9-point scale) and pictorial labels (6.14, p=0.80, Cohen's d=0.003). Textual labels evoked slightly less emotion (4.21 on a 9-point scale) than pictorial labels (4.42), and perceived informativeness and emotion were strongly correlated (Pearson r=0.53), rather than labels dividing into those seen as purely factual and those seen as emotional and not factual. Pictorial labels are rated as informative and factual, textual labels evoke emotion, and emotionality and informativeness are strongly correlated. These findings serve as evidence for the Food and Drug Administration (FDA) to counteract the claim that pictorial warning labels, by definition, are not 'purely factual and uncontroversial'. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
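
    A linear mixed model of the kind described above treats each rating as nested within a participant, typically via a participant-level random intercept. The following is a minimal Python sketch with statsmodels, using hypothetical column names and invented toy data rather than the study's dataset.

        import pandas as pd
        import statsmodels.formula.api as smf

        # toy long-format data: one row per label rating per participant
        ratings = pd.DataFrame({
            "participant_id": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
            "label_type": ["text", "pictorial", "text",
                           "pictorial", "text", "pictorial",
                           "text", "pictorial", "text",
                           "pictorial", "text", "pictorial"],
            "informativeness": [6.0, 6.5, 5.5, 7.0, 6.0, 6.5,
                                5.0, 5.5, 6.0, 6.5, 6.0, 7.0],
        })

        # fixed effect of label type, random intercept per participant
        model = smf.mixedlm("informativeness ~ label_type",
                            data=ratings,
                            groups=ratings["participant_id"])
        print(model.fit().summary())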

  9. Adult age-differences in subjective impression of emotional faces are reflected in emotion-related attention and memory tasks

    Directory of Open Access Journals (Sweden)

    Joakim eSvard

    2014-05-01

    Full Text Available Although younger and older adults appear to attend to and remember emotional faces differently, less is known about age-related differences in the subjective emotional impression (arousal, potency, and valence) of emotional faces and how these differences, in turn, are reflected in age differences in various emotional tasks. In the current study, we used the same facial emotional stimuli (angry and happy faces) in four tasks: emotional rating, attention, categorical perception, and visual short-term memory (VSTM). The aim of this study was to investigate effects of age on the subjective emotional impression of angry and happy faces and to examine whether any age differences were mirrored in measures of emotional behavior (attention, categorical perception, and memory). In addition, regression analyses were used to further study impression-behavior associations. Forty younger adults (range 20-30 years) and thirty-nine older adults (range 65-75 years) participated in the experiment. The emotional rating task showed that older adults perceived less arousal, potency, and valence than younger adults and that the difference was more pronounced for angry than happy faces. Similarly, the results of the attention and memory tasks demonstrated interaction effects between emotion and age, and age differences on these measures were larger for angry than for happy faces. Regression analyses confirmed that in both age groups, higher potency ratings predicted both visual search and visual short-term memory efficiency. Future studies should consider the possibility that age differences in the subjective emotional impression of facial emotional stimuli may explain age differences in attention to and memory of such stimuli.

  10. Does comorbid anxiety counteract emotion recognition deficits in conduct disorder?

    Science.gov (United States)

    Short, Roxanna M L; Sonuga-Barke, Edmund J S; Adams, Wendy J; Fairchild, Graeme

    2016-08-01

    Previous research has reported altered emotion recognition in both conduct disorder (CD) and anxiety disorders (ADs) - but these effects appear to be of different kinds. Adolescents with CD often show a generalised pattern of deficits, while those with ADs show hypersensitivity to specific negative emotions. Although these conditions often cooccur, little is known regarding emotion recognition performance in comorbid CD+ADs. Here, we test the hypothesis that in the comorbid case, anxiety-related emotion hypersensitivity counteracts the emotion recognition deficits typically observed in CD. We compared facial emotion recognition across four groups of adolescents aged 12-18 years: those with CD alone (n = 28), ADs alone (n = 23), cooccurring CD+ADs (n = 20) and typically developing controls (n = 28). The emotion recognition task we used systematically manipulated the emotional intensity of facial expressions as well as fixation location (eye, nose or mouth region). Conduct disorder was associated with a generalised impairment in emotion recognition; however, this may have been modulated by group differences in IQ. AD was associated with increased sensitivity to low-intensity happiness, disgust and sadness. In general, the comorbid CD+ADs group performed similarly to typically developing controls. Although CD alone was associated with emotion recognition impairments, ADs and comorbid CD+ADs were associated with normal or enhanced emotion recognition performance. The presence of comorbid ADs appeared to counteract the effects of CD, suggesting a potentially protective role, although future research should examine the contribution of IQ and gender to these effects. © 2016 Association for Child and Adolescent Mental Health.

  11. Emotional Regulation and Executive Function Deficits in Unmedicated Chinese Children with Oppositional Defiant Disorder.

    Science.gov (United States)

    Jiang, Wenqing; Li, Yan; Du, Yasong; Fan, Juan

    2016-05-01

    This study aims to explore the feature of emotional regulation and executive functions in oppositional defiant disorder (ODD) children. The emotional regulation and executive functions of adolescents with ODD, as well as the relationship between the two factors were analyzed using tools including Adolescent Daily Emotional Regulation Questionnaire (ADERQ), Wisconsin Card Sorting Test (WCST) and Cambridge Neuropsychological Test Automated Battery (CANTAB), in comparison with attention deficit hyperactivity disorder (ADHD) children without behavioral problem and healthy children; the ADERQ assessed emotional regulation ability and others were used to assess executive function. Compared to normal children, the ODD group displayed significant differences in the scores of cognitive reappraisal, rumination, expressive suppression, and revealing of negative emotions, as well as in the score of cognitive reappraisal of positive emotions. WCST perseverative errors were well correlated with rumination of negative emotions (r=0.47). Logistic regression revealed that the minimum number of moves in the Stocking of Cambridge (SOC) test (one test in CANTAB) and negative emotion revealing, were strongly associated with ODD diagnosis. Children with ODD showed emotion dysregulation, with negative emotion dysregulation as the main feature. Emotion dysregulation and the lack of ability to plan lead to executive function deficits. The executive function deficits may guide us to understand the deep mechanism under ODD.

  12. Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa.

    Science.gov (United States)

    Vesker, Michael; Bahn, Daniela; Kauschke, Christina; Tschense, Monika; Degé, Franziska; Schwarzer, Gudrun

    2018-01-01

    In order to assess how the perception of audible speech and facial expressions influence one another for the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-years old), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar categorization task, but with faces acting as visual primes, and emotion words acting as auditory targets. The results of Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words versus positive words, and that 6-year-old children are particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Meanwhile, the results of Experiment 2 did not show any congruency effects for priming by facial expressions. Thus, audible emotion words seem to exert an influence on the emotional categorization of faces, while faces do not seem to influence the categorization of emotion words in a significant way.

  13. Oxytocin effects on emotional response to others' faces via serotonin system in autism: A pilot study.

    Science.gov (United States)

    Fukai, Mina; Hirosawa, Tetsu; Kikuchi, Mitsuru; Ouchi, Yasuomi; Takahashi, Tetsuya; Yoshimura, Yuko; Miyagishi, Yoshiaki; Kosaka, Hirotaka; Yokokura, Masamichi; Yoshikawa, Etsuji; Bunai, Tomoyasu; Minabe, Yoshio

    2017-09-30

    The oxytocin (OT)-related serotonergic system is thought to play an important role in the etiology and social symptoms of autism spectrum disorder (ASD). However, no evidence exists for the relation between the prosocial effect of chronic OT administration and the brain serotonergic system. Ten male subjects with ASD were administered OT for 8-10 weeks in an open-label, single-arm, non-randomized, uncontrolled manner. Before and during the OT treatment, positron emission tomography was used with the [11C]-3-amino-4-(2-[(dimethylamino)methyl]phenylthio)benzonitrile ([11C]DASB) radiotracer. Then binding of the serotonin transporter ([11C]DASB BPND) was estimated. The main outcome measures were changes in [11C]DASB BPND and changes in the emotional response to others' faces. No significant change was found in the emotional response to others' faces after the 8-10 week OT treatment. However, the increased serotonin transporter (SERT) level in the striatum after treatment was correlated significantly with increased negative emotional response to human faces. This study revealed a relation between changes in the serotonergic system and in prosociality after chronic OT administration. Additional studies must be conducted to verify the chronic OT effects on social behavior via the serotonergic system. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  14. Emotion recognition pattern in adolescent boys with attention-deficit/hyperactivity disorder.

    Science.gov (United States)

    Aspan, Nikoletta; Bozsik, Csilla; Gadoros, Julia; Nagy, Peter; Inantsy-Pap, Judit; Vida, Peter; Halasz, Jozsef

    2014-01-01

    Social and emotional deficits were recently considered as inherent features of individuals with attention-deficit hyperactivity disorder (ADHD), but only sporadic literature data exist on emotion recognition in adolescents with ADHD. The aim of the present study was to establish the emotion recognition profile of adolescent boys with ADHD in comparison with control adolescents. Forty-four adolescent boys (13-16 years) participated in the study after informed consent; 22 boys had a clinical diagnosis of ADHD, while data were also assessed from 22 adolescent control boys matched for age and Raven IQ. Parent- and self-reported behavioral characteristics were assessed by means of the Strengths and Difficulties Questionnaire. The recognition of six basic emotions was evaluated by the "Facial Expressions of Emotion-Stimuli and Tests." Compared to controls, adolescents with ADHD were more sensitive in the recognition of disgust, worse in the recognition of fear, and showed a tendency for impaired recognition of sadness. Hyperactivity measures showed an inverse correlation with fear recognition. Our data suggest that adolescent boys with ADHD have alterations in the recognition of specific emotions.

  15. Neural Substrates of Auditory Emotion Recognition Deficits in Schizophrenia.

    Science.gov (United States)

    Kantrowitz, Joshua T; Hoptman, Matthew J; Leitman, David I; Moreno-Ortega, Marta; Lehrfeld, Jonathan M; Dias, Elisa; Sehatpour, Pejman; Laukka, Petri; Silipo, Gail; Javitt, Daniel C

    2015-11-04

    Deficits in auditory emotion recognition (AER) are a core feature of schizophrenia and a key component of social cognitive impairment. AER deficits are tied behaviorally to impaired ability to interpret tonal ("prosodic") features of speech that normally convey emotion, such as modulations in base pitch (F0M) and pitch variability (F0SD). These modulations can be recreated using synthetic frequency modulated (FM) tones that mimic the prosodic contours of specific emotional stimuli. The present study investigates neural mechanisms underlying impaired AER using a combined event-related potential/resting-state functional connectivity (rsfMRI) approach in 84 schizophrenia/schizoaffective disorder patients and 66 healthy comparison subjects. Mismatch negativity (MMN) to FM tones was assessed in 43 patients/36 controls. rsfMRI between auditory cortex and medial temporal (insula) regions was assessed in 55 patients/51 controls. The relationship between AER, MMN to FM tones, and rsfMRI was assessed in the subset who performed all assessments (14 patients, 21 controls). As predicted, patients showed robust reductions in MMN across FM stimulus type (p = 0.005), particularly to modulations in F0M, along with impairments in AER and FM tone discrimination. MMN source analysis indicated dipoles in both auditory cortex and anterior insula, whereas rsfMRI analyses showed reduced auditory-insula connectivity. MMN to FM tones and functional connectivity together accounted for ∼50% of the variance in AER performance across individuals. These findings demonstrate that impaired preattentive processing of tonal information and reduced auditory-insula connectivity are critical determinants of social cognitive dysfunction in schizophrenia, and thus represent key targets for future research and clinical intervention. Schizophrenia patients show deficits in the ability to infer emotion based upon tone of voice [auditory emotion recognition (AER)] that drive impairments in social cognition

  16. Cognitive organization of emotion: differences between labels and descriptors of emotion in jealousy situations.

    Science.gov (United States)

    Hupka, R B; Eshett, C

    1988-06-01

    The purpose of this study was to ascertain whether the cognitive organization of labels of emotion differs from descriptions of affective states. This was done in the context of determining whether the attributions of labels of emotion and descriptions of affective responses in jealousy situations differed according to the status of the interloper, presence of an audience to the untoward behavior, and sex of the respondent. The subjects, 300 male and female junior college students, read vignettes which placed them at a party where their mates passionately kissed interlopers of varying status, and whose transgressions were, or were not, observed by others. The subjects were required to indicate the likelihood that they would experience anger, disgust, fear, jealousy, sadness, and surprise, and 49 cognitive and physiological descriptions of the affective states referred to by the aforementioned labels of emotion. Different findings were obtained with the labels and descriptors of affective states. This was interpreted as support for the systems theory of G.E. Schwartz. The descriptions, but not the labels, indicated that men were most upset when the interloper was a best friend and least concerned when he was a stranger. In contrast, women were most upset when the interloper was someone of equal or lower status than themselves and least upset when the interloper was their best friend.

  17. Emotional faces and the default mode network.

    Science.gov (United States)

    Sreenivas, S; Boehm, S G; Linden, D E J

    2012-01-11

    The default-mode network (DMN) of the human brain has become a central topic of cognitive neuroscience research. Although alterations in its resting state activity and in its recruitment during tasks have been reported for several mental and neurodegenerative disorders, its role in emotion processing has received relatively little attention. We investigated brain responses to different categories of emotional faces with functional magnetic resonance imaging (fMRI) and found deactivation in ventromedial prefrontal cortex (VMPFC), posterior cingulate gyrus (PC) and cuneus. This deactivation was modulated by emotional category and was less prominent for happy than for sad faces. These deactivated areas along the midline conformed to areas of the DMN. We also observed emotion-dependent deactivation of the left middle frontal gyrus, which is not a classical component of the DMN. Conversely, several areas in a fronto-parietal network commonly linked with attention were differentially activated by emotion categories. Functional connectivity patterns, as obtained by correlation of activation levels, also varied between emotions. VMPFC, PC or cuneus served as hubs between the DMN-type areas and the fronto-parietal network. These data support recent suggestions that the DMN is not a unitary system but differentiates according to task and even type of stimulus. The emotion-specific differential pattern of DMN deactivation may be explored further in patients with mood disorder, where the quest for biological markers of emotional biases is still ongoing. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  18. A neural network underlying intentional emotional facial expression in neurodegenerative disease

    Directory of Open Access Journals (Sweden)

    Kelly A. Gola

    2017-01-01

    Full Text Available Intentional facial expression of emotion is critical to healthy social interactions. Patients with neurodegenerative disease, particularly those with right temporal or prefrontal atrophy, show dramatic socioemotional impairment. This was an exploratory study examining the neural and behavioral correlates of intentional facial expression of emotion in neurodegenerative disease patients and healthy controls. One hundred and thirty-three participants (45 Alzheimer's disease, 16 behavioral variant frontotemporal dementia, 8 non-fluent primary progressive aphasia, 10 progressive supranuclear palsy, 11 right-temporal frontotemporal dementia, 9 semantic variant primary progressive aphasia patients, and 34 healthy controls) were video recorded while imitating static images of emotional faces and producing emotional expressions based on verbal command; the accuracy of their expression was rated by blinded raters. Participants also underwent face-to-face socioemotional testing, and informants described participants' typical socioemotional behavior. Patients' performance on emotion expression tasks was correlated with gray matter volume using voxel-based morphometry (VBM) across the entire sample. We found that intentional emotional imitation scores were related to fundamental socioemotional deficits; patients with known socioemotional deficits performed worse than controls on intentional emotion imitation; and intentional emotional expression predicted caregiver ratings of empathy and interpersonal warmth. Whole-brain VBMs revealed that a rightward cortical atrophy pattern homologous to the left-lateralized speech production network was associated with intentional emotional imitation deficits. Results point to a possible neural mechanism underlying complex socioemotional communication deficits in neurodegenerative disease patients.

  19. Recollection of Emotional Memories in Schizophrenia: Autonoetic awareness and specificity deficits

    Directory of Open Access Journals (Sweden)

    Aurore Neumann

    2006-03-01

    Full Text Available Episodic memory impairments seem to play a crucial role in schizophrenia. Most of the studies that have demonstrated such a deficit have used neutral material, leaving the recollection of emotional memories in schizophrenia unexplored. An overview is presented of a series of studies investigating the influence of emotion on episodic and autobiographical memory in schizophrenia. These experiments share a common experimental approach in which states of awareness accompanying recollection are considered. Results show that schizophrenia impairs conscious recollection in episodic and autobiographical memory tasks using emotional material. Schizophrenia is also associated with a reduction of the specificity with which autobiographical memories are recalled. An hypothesis in terms of a fundamental executive deficit underlying these impairments is proposed.

  20. Neuropsychology of facial expressions. The role of consciousness in processing emotional faces

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2012-04-01

    Full Text Available Neuropsychological studies have underlined the significant presence of distinct brain correlates deputed to analyze facial expressions of emotion. It has been observed that some cerebral circuits are considered specific for emotional face comprehension as a function of conscious vs. unconscious processing of emotional information. Moreover, the emotional content of faces (i.e. positive vs. negative; more or less arousing) may have an effect in activating specific cortical networks. Among others, recent studies have examined the contribution of the hemispheres in comprehending faces, as a function of the type of emotion (mainly related to the distinction positive vs. negative) and of specific tasks (comprehending vs. producing facial expressions). Specifically, an overview of ERP (event-related potential) analyses is proposed in order to understand how a face may be processed by an observer and how it can become a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces when conscious vs. unconscious processing is activated.

  1. Categorical Perception of emotional faces is not affected by aging

    Directory of Open Access Journals (Sweden)

    Mandy Rossignol

    2009-11-01

    Full Text Available Effects of normal aging on categorical perception (CP) of facial emotional expressions were investigated. One hundred healthy participants (20 to 70 years old; five age groups) had to identify morphed expressions ranging from neutrality to happiness, sadness and fear. We analysed percentages and latencies of correct recognition for nonmorphed emotional expressions, percentages and latencies of emotional recognition for morphed faces, the locus of the boundaries along the different continua, and the number of intrusions. The results showed that unmorphed happy and fearful faces were better processed than unmorphed sad and neutral faces. For morphed faces, CP was confirmed, as latencies increased as a function of the distance between the displayed morph and the original unmorphed photograph. The locus of categorical boundaries was not affected by age. Aging did not alter the accuracy of recognition for original pictures, nor the emotional recognition of morphed faces or the rate of intrusions. However, latencies of responses increased with age, for both unmorphed and morphed pictures. In conclusion, CP of facial expressions appears to be spared in aging.

  2. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    Science.gov (United States)

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top+happy bottom), incongruent composite configurations (e.g., angry top+happy bottom), and isolated features (e.g. happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part but ignored information in the other non-target part. Results indicate that the recognition of happy and anger expressions is neither strictly holistic nor analytic. Both routes were involved, but with a different role for analytic and holistic information depending on the emotion type, and different weights of local features between happy and anger expressions. Dissociable neural pathways were engaged depending on emotional face configurations. In particular, regions within the face processing network differed in their sensitivity to holistic expression information, which predominantly activated fusiform, inferior occipital areas and amygdala when internal features were congruent (i.e. template matching), whereas more local analysis of independent features preferentially engaged STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and pulvinar when seen in isolated parts. Collectively, these findings suggest that facial emotion recognition recruits separate, but interactive dorsal and ventral routes within the face processing networks, whose engagement may be shaped by

  3. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Science.gov (United States)

    Clayson, Peter E; Larson, Michael J

    2013-01-01

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.

  4. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Directory of Open Access Journals (Sweden)

    Peter E Clayson

    Full Text Available The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.

  5. On face antimagic labeling of double duplication of graphs

    Science.gov (United States)

    Shobana, L.; Kuppan, R.

    2018-04-01

    A labeling of a plane graph G is called d-antimagic if, for every number s, the set of s-sided face weights is W_s = {a_s, a_s + d, a_s + 2d, ..., a_s + (f_s - 1)d} for some integers a_s and d (a_s > 0, d ≥ 0), where f_s is the number of s-sided faces. We allow different sets W_s for different s. In this paper, we prove the existence of face antimagic labelings of types (1,0,0), (1,0,1), (1,1,0), (0,1,1) and (1,1,1) for the double duplication of all vertices by edges of a cycle graph Cn (n ≥ 3) and of a tree of order n.
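
    For orientation only, the face-weight condition above can be restated in standard notation (this is the usual textbook definition of a d-antimagic labeling, not text reproduced from the paper): for every s, the set of weights of the s-sided faces must form the arithmetic progression

        W_s = \{\, a_s,\ a_s + d,\ a_s + 2d,\ \ldots,\ a_s + (f_s - 1)\,d \,\}, \qquad a_s > 0,\ d \ge 0,

    where, for a labeling of type (1,1,1), the weight of a face F is commonly taken to be the sum of the labels of its boundary vertices, its boundary edges, and the face itself,

        w(F) = \sum_{v \in V(F)} \lambda(v) + \sum_{e \in E(F)} \lambda(e) + \lambda(F),

    and types such as (1,0,0) or (1,1,0) simply drop the edge and/or face terms.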

  6. What's good for the goose is not good for the gander: Age and gender differences in scanning emotion faces.

    Science.gov (United States)

    Sullivan, Susan; Campbell, Anna; Hutton, Sam B; Ruffman, Ted

    2017-05-01

    Research indicates that older adults' (≥60 years) emotion recognition is worse than that of young adults, that young and older men's emotion recognition is worse than that of young and older women (respectively), and that older adults look at mouths relative to eyes more than young adults do. Nevertheless, previous research has not compared older men's and women's looking at emotion faces, so the present study had two aims: (a) to examine whether the tendency to look at mouths is stronger amongst older men compared with older women and (b) to examine whether men's mouth looking correlates with better emotion recognition. We examined the emotion recognition abilities and spontaneous gaze patterns of young (n = 60) and older (n = 58) males and females as they labelled emotion faces. Older men spontaneously looked more to mouths than older women, and older men's looking at mouths correlated with their emotion recognition, whereas women's looking at eyes correlated with their emotion recognition. The findings are discussed in relation to a growing body of research suggesting both age and gender differences in response to emotional stimuli and the differential efficacy of mouth and eyes looking for men and women. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Degraded Impairment of Emotion Recognition in Parkinson's Disease Extends from Negative to Positive Emotions.

    Science.gov (United States)

    Lin, Chia-Yao; Tien, Yi-Min; Huang, Jong-Tsun; Tsai, Chon-Haw; Hsu, Li-Chuan

    2016-01-01

    Because of dopaminergic neurodegeneration, patients with Parkinson's disease (PD) show impairment in the recognition of negative facial expressions. In the present study, we aimed to determine whether PD patients with more advanced motor problems would show a much greater deficit in recognition of emotional facial expressions than a control group and whether impairment of emotion recognition would extend to positive emotions. Twenty-nine PD patients and 29 age-matched healthy controls were recruited. Participants were asked to discriminate emotions in Experiment  1 and identify gender in Experiment  2. In Experiment  1, PD patients demonstrated a recognition deficit for negative (sadness and anger) and positive faces. Further analysis showed that only PD patients with high motor dysfunction performed poorly in recognition of happy faces. In Experiment  2, PD patients showed an intact ability for gender identification, and the results eliminated possible abilities in the functions measured in Experiment  2 as alternative explanations for the results of Experiment  1. We concluded that patients' ability to recognize emotions deteriorated as the disease progressed. Recognition of negative emotions was impaired first, and then the impairment extended to positive emotions.

  8. Emotional faces influence evaluation of natural and transformed food.

    Science.gov (United States)

    Manippa, Valerio; Padulo, Caterina; Brancucci, Alfredo

    2018-07-01

    Previous evidence showed the presence of a straight relationship between feeding behavior and emotions. Despite that, no studies have focused on the influence of emotional faces on food processing. In our study, participants were presented with 72 couples of visual stimuli composed of a neutral, happy, or disgusted faces (5000 ms duration in Experiment 1, adaptation; 150 ms in Experiment 2, priming) followed by a food stimulus (1500 ms). Food stimuli were grouped in pleasant foods, further divided in natural and transformed, and unpleasant rotten foods. The task consisted in judging the food valence (as 'pleasant' or 'unpleasant') by keypress. Results showed a different pattern of response based on the transformation level of food. In general, the evaluation of natural foods was more rapid compared with transformed foods, maybe for their simplicity and healthier perception. In addition, transformed foods yielded incongruent responses with respect to the preceding emotional face, whereas natural foods yielded congruent responses with respect to it. These effects were independent of the duration of the emotional face (i.e., adaptation or priming paradigm) and may depend on pleasant food stimuli salience.

  9. Common impairments of emotional facial expression recognition in schizophrenia across French and Japanese cultures

    Directory of Open Access Journals (Sweden)

    Takashi eOkada

    2015-07-01

    Full Text Available To address whether the recognition of emotional facial expressions is impaired in schizophrenia across different cultures, patients with schizophrenia and age-matched normal controls in France and Japan were tested with a labeling task of emotional facial expressions and a matching task of unfamiliar faces. Schizophrenia patients in both France and Japan were less accurate in labeling fearful facial expressions. There was no correlation between the scores of facial emotion labeling and face matching. These results suggest that the impaired recognition of emotional facial expressions in schizophrenia is common across different cultures.

  10. How Does Sam Feel?: Children's Labelling and Drawing of Basic Emotions

    Science.gov (United States)

    Brechet, Claire; Baldy, Rene; Picard, Delphine

    2009-01-01

    This study compares the ability of children aged from 6 to 11 to freely produce emotional labels based on detailed scenarios (labelling task), and their ability to depict basic emotions in their human figure drawing (subsequent drawing task). This comparison assesses the relevance of the use of a human figure drawing task in order to test…

  11. When does subliminal affective image priming influence the ability of schizophrenic patients to perceive face emotions?

    Science.gov (United States)

    Vaina, Lucia Maria; Rana, Kunjan D; Cotos, Ionela; Li-Yang, Chen; Huang, Melissa A; Podea, Delia

    2014-12-24

    Deficits in face emotion perception are among the most pervasive impairments in schizophrenia and strongly affect interpersonal communication and social skills. Schizophrenic patients (PSZ) and healthy control subjects (HCS) performed two psychophysical tasks. One, the SAFFIMAP test, was designed to determine the impact of subliminally presented affective or neutral images on the accuracy of face-expression (angry or neutral) perception. In the second test, FEP, subjects saw pictures of facial expressions and were asked to rate them as angry, happy, or neutral. The following clinical scales were used to determine the acute symptoms in PSZ: Positive and Negative Syndrome (PANSS), Young Mania Rating (YMRS), Hamilton Depression (HAM-D), and Hamilton Anxiety (HAM-A). On the SAFFIMAP test, unlike the HCS group, the PSZ group tended to categorize the neutral expression of test faces as angry, and their response to the test-face expression was not influenced by the affective content of the primes. In PSZ, the PANSS-positive score was significantly correlated with correct perception of angry faces for aggressive or pleasant primes. YMRS scores were strongly correlated with PSZ's tendency to recognize angry face expressions when the prime was a pleasant or a neutral image. The HAM-D score was positively correlated with categorizing the test-faces as neutral, regardless of the affective content of the prime or of the test-face expression (angry or neutral). Despite its exploratory nature, this study provides the first evidence that conscious perception and categorization of facial emotions (neutral or angry) in PSZ is directly affected by their positive or negative symptoms of the disease as defined by their individual scores on the clinical diagnostic scales.

  12. Socio-emotional intervention in attention deficit hyperactive disorder

    Directory of Open Access Journals (Sweden)

    María Jesús Cardoso-Moreno

    2015-06-01

    Full Text Available Attention-deficit-hyperactivity disorder (ADHD) is a common neuro-behavioural disorder with onset in childhood. These children have impaired emotional self-control, self-regulation of drive and motivation. Numerous studies have reported cognitive disabilities in memory, executive functions, spatial abilities and language skills. The main objective of this work is to determine whether a socio-emotional intervention programme could improve executive functions in children with Attention Deficit Hyperactivity Disorder. The sample of this study consisted of 25 children (8 female and 17 male) aged between 8 and 12 years, diagnosed with ADHD and who were not taking any psychopharmacological treatment at the time of the study, and had not taken medication previously. Executive functioning was assessed through the Zoo Map test and Tower of Hanoi puzzle in pre-/post-test design. A socio-emotional intervention programme was implemented. The training consisted of 8 one-hour weekly sessions, on an individual basis. Results indicate that such a programme does lead to improved performance in the execution of tasks that evaluate executive functions. After the intervention, the children took less time to resolve the Zoo Map test. Results for the Hanoi Tower puzzle were also improved after intervention. The children needed a lower number of movements to complete the task.

  13. Socio-emotional intervention in attention deficit hyperactive disorder

    Directory of Open Access Journals (Sweden)

    María Jesús Cardoso-Moreno

    2015-12-01

    Full Text Available Attention-deficit-hyperactivity disorder (ADHD) is a common neuro-behavioural disorder with onset in childhood. These children have impaired emotional self-control, self-regulation of drive and motivation. Numerous studies have reported cognitive disabilities in memory, executive functions, spatial abilities and language skills. The main objective of this work is to determine whether a socio-emotional intervention programme could improve executive functions in children with Attention Deficit Hyperactivity Disorder. The sample of this study consisted of 25 children (8 female and 17 male) aged between 8 and 12 years, diagnosed with ADHD and who were not taking any psychopharmacological treatment at the time of the study, and had not taken medication previously. Executive functioning was assessed through the Zoo Map test and Tower of Hanoi puzzle in pre-/post-test design. A socio-emotional intervention programme was implemented. The training consisted of 8 one-hour weekly sessions, on an individual basis. Results indicate that such a programme does lead to improved performance in the execution of tasks that evaluate executive functions. After the intervention, the children took less time to resolve the Zoo Map test. Results for the Hanoi Tower puzzle were also improved after intervention. The children needed a lower number of movements to complete the task.

  14. Differentiating Emotions Across Contexts: Comparing Adults with and without Social Anxiety Disorder Using Random, Social Interaction, and Daily Experience Sampling

    Science.gov (United States)

    Kashdan, Todd B.; Farmer, Antonina S.

    2014-01-01

    The ability to recognize and label emotional experiences has been associated with well-being and adaptive functioning. This skill is particularly important in social situations, as emotions provide information about the state of relationships and help guide interpersonal decisions, such as whether to disclose personal information. Given the interpersonal difficulties linked to social anxiety disorder (SAD), deficient negative emotion differentiation may contribute to impairment in this population. We hypothesized that people with SAD would exhibit less negative emotion differentiation in daily life, and these differences would translate to impairment in social functioning. We recruited 43 people diagnosed with generalized SAD and 43 healthy adults to describe the emotions they experienced over 14 days. Participants received palmtop computers for responding to random prompts and describing naturalistic social interactions; to complete end-of-day diary entries, they used a secure online website. We calculated intraclass correlation coefficients to capture the degree of differentiation of negative and positive emotions for each context (random moments, face-to-face social interactions, and end-of-day reflections). Compared to healthy controls, the SAD group exhibited less negative (but not positive) emotion differentiation during random prompts, social interactions, and (at trend level) end-of-day assessments. These differences could not be explained by emotion intensity or variability over the 14 days, or to comorbid depression or anxiety disorders. Our findings suggest that people with generalized SAD have deficits in clarifying specific negative emotions felt at a given point of time. These deficits may contribute to difficulties with effective emotion regulation and healthy social relationship functioning. PMID:24512246

  15. Differentiating emotions across contexts: comparing adults with and without social anxiety disorder using random, social interaction, and daily experience sampling.

    Science.gov (United States)

    Kashdan, Todd B; Farmer, Antonina S

    2014-06-01

    The ability to recognize and label emotional experiences has been associated with well-being and adaptive functioning. This skill is particularly important in social situations, as emotions provide information about the state of relationships and help guide interpersonal decisions, such as whether to disclose personal information. Given the interpersonal difficulties linked to social anxiety disorder (SAD), deficient negative emotion differentiation may contribute to impairment in this population. We hypothesized that people with SAD would exhibit less negative emotion differentiation in daily life, and these differences would translate to impairment in social functioning. We recruited 43 people diagnosed with generalized SAD and 43 healthy adults to describe the emotions they experienced over 14 days. Participants received palmtop computers for responding to random prompts and describing naturalistic social interactions; to complete end-of-day diary entries, they used a secure online website. We calculated intraclass correlation coefficients to capture the degree of differentiation of negative and positive emotions for each context (random moments, face-to-face social interactions, and end-of-day reflections). Compared to healthy controls, the SAD group exhibited less negative (but not positive) emotion differentiation during random prompts, social interactions, and (at trend level) end-of-day assessments. These differences could not be explained by emotion intensity or variability over the 14 days, or to comorbid depression or anxiety disorders. Our findings suggest that people with generalized SAD have deficits in clarifying specific negative emotions felt at a given point of time. These deficits may contribute to difficulties with effective emotion regulation and healthy social relationship functioning.
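
    The intraclass correlation coefficients mentioned above index how tightly the different negative (or positive) emotion ratings move together within a person: the higher the consistency, the lower the differentiation. The sketch below shows one common way to compute such a consistency ICC from a single participant's prompts-by-emotions rating matrix; the toy ratings and the ICC(3,1) variant are assumptions for illustration, not the study's exact procedure.

        import numpy as np

        # Hypothetical data for one participant: rows = momentary prompts over 14 days,
        # columns = negative emotion items (e.g., anxious, sad, angry, ashamed), rated 1-5.
        ratings = np.array([
            [4, 3, 4, 3],
            [2, 2, 3, 2],
            [5, 4, 4, 4],
            [3, 3, 2, 3],
            [1, 2, 1, 2],
        ], dtype=float)

        n, k = ratings.shape
        grand = ratings.mean()
        ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()   # prompts
        ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()   # emotion items
        ss_error = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols

        ms_rows = ss_rows / (n - 1)
        ms_error = ss_error / ((n - 1) * (k - 1))

        # Consistency ICC(3,1): how strongly the emotion items track each other across prompts.
        icc = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

        # High ICC = emotions rise and fall together = LESS differentiation, so the index
        # is often reported reversed (e.g., 1 - ICC) or Fisher-z transformed.
        print(f"ICC = {icc:.2f}; differentiation (1 - ICC) = {1 - icc:.2f}")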

  16. A leftward bias however you look at it: Revisiting the emotional chimeric face task as a tool for measuring emotion lateralization.

    Science.gov (United States)

    R Innes, Bobby; Burt, D Michael; Birch, Yan K; Hausmann, Markus

    2015-12-28

    Left hemiface biases observed within the Emotional Chimeric Face Task (ECFT) support emotional face perception models whereby all expressions are preferentially processed by the right hemisphere. However, previous research using this task has not considered that the visible midline between hemifaces might engage atypical facial emotion processing strategies in upright or inverted conditions, nor controlled for left visual field (thus right hemispheric) visuospatial attention biases. This study used novel emotional chimeric faces (blended at the midline) to examine laterality biases for all basic emotions. Left hemiface biases were demonstrated across all emotional expressions and were reduced, but not reversed, for inverted faces. The ECFT bias in upright faces was significantly increased in participants with a large attention bias. These results support the theory that left hemiface biases reflect a genuine bias in emotional face processing, and this bias can interact with attention processes similarly localized in the right hemisphere.

  17. [Is emotional dysregulation a component of attention-deficit/hyperactivity disorder (ADHD)?].

    Science.gov (United States)

    Villemonteix, T; Purper-Ouakil, D; Romo, L

    2015-04-01

    Attention-deficit/hyperactivity disorder (ADHD) is the most common neurodevelopmental disorder in children and adolescents. It is characterized by age-inappropriate inattention/impulsiveness and/or hyperactivity symptoms. ADHD shows a high comorbidity with oppositional defiant disorder (ODD), a disorder that features symptoms of emotional lability. Due to this comorbidity, emotional lability was long considered a secondary consequence of ADHD, which could arise under the influence of environmental factors such as inefficient parenting practices, as part of an ODD diagnosis. In this model of heterotypic continuity, emotional lability was considered not to play any causal role regarding ADHD symptomatology. As opposed to this view, it is now well established that a large number of children with ADHD and without any comorbid disorder exhibit symptoms of emotional lability. Furthermore, recent studies have found that negative emotionality accounts for significant unique variance in ADHD symptom severity, along with motor-perceptual and executive function deficits. Barkley proposed that ADHD is characterized by deficits of executive functions, and that a deficiency in the executive control of emotions is a necessary component of ADHD. According to this theory, the extent to which an individual with ADHD displays a deficiency in behavioral inhibition is the extent to which he or she will automatically display an equivalent degree of deficiency in emotional inhibition. However, not all children with ADHD exhibit symptoms of emotional lability, and studies have found that the association between emotional lability and ADHD was not mediated by executive function or motivational deficits. Task-based and resting state neuroimaging studies have disclosed an altered effective connectivity between regions dedicated to emotional regulation in children with ADHD when compared to typically developing children, notably between the amygdala, the prefrontal cortex, the hippocampus and

  18. Altered Functional Subnetwork During Emotional Face Processing: A Potential Intermediate Phenotype for Schizophrenia.

    Science.gov (United States)

    Cao, Hengyi; Bertolino, Alessandro; Walter, Henrik; Schneider, Michael; Schäfer, Axel; Taurisano, Paolo; Blasi, Giuseppe; Haddad, Leila; Grimm, Oliver; Otto, Kristina; Dixson, Luanna; Erk, Susanne; Mohnke, Sebastian; Heinz, Andreas; Romanczuk-Seiferth, Nina; Mühleisen, Thomas W; Mattheisen, Manuel; Witt, Stephanie H; Cichon, Sven; Noethen, Markus; Rietschel, Marcella; Tost, Heike; Meyer-Lindenberg, Andreas

    2016-06-01

    Although deficits in emotional processing are prominent in schizophrenia, it has been difficult to identify neural mechanisms related to the genetic risk for this highly heritable illness. Prior studies have not found consistent regional activation or connectivity alterations in first-degree relatives compared with healthy controls, suggesting that a more comprehensive search for connectomic biomarkers is warranted. To identify a potential systems-level intermediate phenotype linked to emotion processing in schizophrenia and to examine the psychological association, task specificity, test-retest reliability, and clinical validity of the identified phenotype. The study was performed in university research hospitals from June 1, 2008, through December 31, 2013. We examined 58 unaffected first-degree relatives of patients with schizophrenia and 94 healthy controls with an emotional face-matching functional magnetic resonance imaging paradigm. Test-retest reliability was analyzed with an independent sample of 26 healthy participants. A clinical association study was performed in 31 patients with schizophrenia and 45 healthy controls. Data analysis was performed from January 1 to September 30, 2014. Conventional amygdala activity and seeded connectivity measures, graph-based global and local network connectivity measures, Spearman rank correlation, intraclass correlation, and gray matter volumes. Among the 152 volunteers included in the relative-control sample, 58 were unaffected first-degree relatives of patients with schizophrenia (mean [SD] age, 33.29 [12.56]; 38 were women), and 94 were healthy controls without a first-degree relative with mental illness (mean [SD] age, 32.69 [10.09] years; 55 were women). A graph-theoretical connectivity approach identified significantly decreased connectivity in a subnetwork that primarily included the limbic cortex, visual cortex, and subcortex during emotional face processing (cluster-level P corrected for familywise error =
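
    The graph-based global and local connectivity measures referred to in this record are generally computed on a thresholded region-by-region connectivity matrix. The sketch below only illustrates that generic approach with networkx; the matrix, threshold, and measures are assumptions for demonstration, not the authors' pipeline.

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(0)

        # Hypothetical functional connectivity matrix: correlations between the fMRI
        # time series of 10 regions during the emotional face-matching task.
        n_regions = 10
        conn = rng.uniform(-0.2, 0.8, size=(n_regions, n_regions))
        conn = (conn + conn.T) / 2          # symmetrize
        np.fill_diagonal(conn, 0)

        # Keep only the stronger connections and build an unweighted graph.
        graph = nx.from_numpy_array((conn > 0.3).astype(int))

        # Global connectivity: efficiency of information transfer across the whole network.
        print("global efficiency:", round(nx.global_efficiency(graph), 2))

        # Local connectivity: number of suprathreshold links attached to each region.
        print("node degrees:", dict(graph.degree()))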

  19. Degraded Impairment of Emotion Recognition in Parkinson’s Disease Extends from Negative to Positive Emotions

    Directory of Open Access Journals (Sweden)

    Chia-Yao Lin

    2016-01-01

    Full Text Available Because of dopaminergic neurodegeneration, patients with Parkinson’s disease (PD) show impairment in the recognition of negative facial expressions. In the present study, we aimed to determine whether PD patients with more advanced motor problems would show a much greater deficit in recognition of emotional facial expressions than a control group and whether impairment of emotion recognition would extend to positive emotions. Twenty-nine PD patients and 29 age-matched healthy controls were recruited. Participants were asked to discriminate emotions in Experiment 1 and identify gender in Experiment 2. In Experiment 1, PD patients demonstrated a recognition deficit for negative (sadness and anger) and positive faces. Further analysis showed that only PD patients with high motor dysfunction performed poorly in recognition of happy faces. In Experiment 2, PD patients showed an intact ability for gender identification, and the results eliminated possible abilities in the functions measured in Experiment 2 as alternative explanations for the results of Experiment 1. We concluded that patients’ ability to recognize emotions deteriorated as the disease progressed. Recognition of negative emotions was impaired first, and then the impairment extended to positive emotions.

  20. Are neutral faces of children really emotionally neutral?

    OpenAIRE

    小松, 佐穂子; 箱田, 裕司; Komatsu, Sahoko; Hakoda, Yuji

    2012-01-01

    In this study, we investigated whether people recognize emotions from neutral faces of children (11 to 13 years old). We took facial images of 53 male and 54 female Japanese children who had been asked to keep a neutral facial expression. Then, we conducted an experiment in which 43 participants (19 to 34 years old) rated the strength of four emotions (happiness, surprise, sadness, and anger) for the facial images, using a 7- point scale. We found that (a) they rated both male and female face...

  1. Interactions between facial emotion and identity in face processing: evidence based on redundancy gains.

    Science.gov (United States)

    Yankouskaya, Alla; Booth, David A; Humphreys, Glyn

    2012-11-01

    Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.
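
    The violation of "independent processing" predictions reported above is usually tested with Miller's (1982) race model inequality, which bounds the cumulative RT distribution for redundant targets by the sum of the two single-target distributions. The snippet below is only a schematic of that check; the RT values and condition names are invented for illustration, not taken from the study.

        import numpy as np

        # Hypothetical correct RTs (ms) for single-target and redundant-target trials.
        rt_emotion_only  = np.array([520, 545, 560, 580, 610, 640])
        rt_identity_only = np.array([515, 540, 565, 590, 620, 650])
        rt_redundant     = np.array([470, 490, 505, 520, 555, 585])

        def ecdf(sample, t):
            """Empirical cumulative distribution: P(RT <= t)."""
            return np.mean(sample <= t)

        # Race model inequality: for every t,
        #   P(RT <= t | redundant) <= P(RT <= t | emotion) + P(RT <= t | identity).
        # Time points where the redundant CDF exceeds that bound indicate coactivation,
        # i.e. integrated rather than independent processing of the two dimensions.
        violations = [t for t in range(450, 700, 10)
                      if ecdf(rt_redundant, t) > ecdf(rt_emotion_only, t) + ecdf(rt_identity_only, t)]
        print("race model violated at t =", violations)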

  2. Congruence of happy and sad emotion in music and faces modifies cortical audiovisual activation.

    Science.gov (United States)

    Jeong, Jeong-Won; Diwadkar, Vaibhav A; Chugani, Carla D; Sinsoongsud, Piti; Muzik, Otto; Behen, Michael E; Chugani, Harry T; Chugani, Diane C

    2011-02-14

    The powerful emotion inducing properties of music are well-known, yet music may convey differing emotional responses depending on environmental factors. We hypothesized that neural mechanisms involved in listening to music may differ when presented together with visual stimuli that conveyed the same emotion as the music when compared to visual stimuli with incongruent emotional content. We designed this study to determine the effect of auditory (happy and sad instrumental music) and visual stimuli (happy and sad faces) congruent or incongruent for emotional content on audiovisual processing using fMRI blood oxygenation level-dependent (BOLD) signal contrast. The experiment was conducted in the context of a conventional block-design experiment. A block consisted of three emotional ON periods, music alone (happy or sad music), face alone (happy or sad faces), and music combined with faces where the music excerpt was played while presenting either congruent emotional faces or incongruent emotional faces. We found activity in the superior temporal gyrus (STG) and fusiform gyrus (FG) to be differentially modulated by music and faces depending on the congruence of emotional content. There was a greater BOLD response in STG when the emotion signaled by the music and faces was congruent. Furthermore, the magnitude of these changes differed for happy congruence and sad congruence, i.e., the activation of STG when happy music was presented with happy faces was greater than the activation seen when sad music was presented with sad faces. In contrast, incongruent stimuli diminished the BOLD response in STG and elicited greater signal change in bilateral FG. Behavioral testing supplemented these findings by showing that subject ratings of emotion in faces were influenced by emotion in music. When presented with happy music, happy faces were rated as more happy (p=0.051) and sad faces were rated as less sad (p=0.030). When presented with sad music, happy faces were rated as less

  3. Behavioural and neurophysiological evidence for face identity and face emotion processing in animals

    Science.gov (United States)

    Tate, Andrew J; Fischer, Hanno; Leigh, Andrea E; Kendrick, Keith M

    2006-01-01

    Visual cues from faces provide important social information relating to individual identity, sexual attraction and emotional state. Behavioural and neurophysiological studies on both monkeys and sheep have shown that specialized skills and neural systems for processing these complex cues to guide behaviour have evolved in a number of mammals and are not present exclusively in humans. Indeed, there are remarkable similarities in the ways that faces are processed by the brain in humans and other mammalian species. While human studies with brain imaging and gross neurophysiological recording approaches have revealed global aspects of the face-processing network, they cannot investigate how information is encoded by specific neural networks. Single neuron electrophysiological recording approaches in both monkeys and sheep have, however, provided some insights into the neural encoding principles involved and, particularly, the presence of a remarkable degree of high-level encoding even at the level of a specific face. Recent developments that allow simultaneous recordings to be made from many hundreds of individual neurons are also beginning to reveal evidence for global aspects of a population-based code. This review will summarize what we have learned so far from these animal-based studies about the way the mammalian brain processes the faces and the emotions they can communicate, as well as associated capacities such as how identity and emotion cues are dissociated and how face imagery might be generated. It will also try to highlight what questions and advances in knowledge still challenge us in order to provide a complete understanding of just how brain networks perform this complex and important social recognition task. PMID:17118930

  4. Influence of Emotional Facial Expressions on 3-5-Year-Olds' Face Recognition

    Science.gov (United States)

    Freitag, Claudia; Schwarzer, Gudrun

    2011-01-01

    Three experiments examined 3- and 5-year-olds' recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression…

  5. Emotional reaction facilitates the brain and behavioral impact of graphic cigarette warning labels in smokers

    Science.gov (United States)

    Wang, An-Li; Lowen, Steven B; Romer, Daniel; Giorno, Mario; Langleben, Daniel D

    2015-01-01

    Background Warning labels on cigarette packages are an important venue for information about the hazards of smoking. The 2009 US Family Smoking Prevention and Tobacco Control Act mandated replacing the current text-only labels with graphic warning labels. However, labels proposed by the Food and Drug Administration (FDA) were challenged in court by the tobacco companies, who argued successfully that the proposed labels needlessly encroached on their right to free speech, in part because they included images of high emotional salience that indiscriminately frightened rather than informed consumers. Methods We used functional MRI to examine the effects of graphic warning labels' emotional salience on smokers' brain activity and cognition. Twenty-four smokers viewed a random sequence of blocks of graphic warning labels that have been rated high or low on an ‘emotional reaction’ scale in previous research. Results We found that labels rated high on emotional reaction were better remembered, associated with reduction in the urge to smoke, and produced greater brain response in the amygdala, hippocampi, inferior frontal gyri and the insulae. Conclusions Recognition memory and craving are, respectively, correlates of effectiveness of addiction related public health communications and interventions, and amygdala activation facilitates the encoding of emotional memories. Thus, our results suggest that emotional reaction to graphic warning labels contributes to their public health impact and may be an integral part of the neural mechanisms underlying their effectiveness. Given the urgency of the debate about the constitutional risks and public health benefits of graphic warning labels, these preliminary findings warrant consideration while longitudinal clinical studies are underway PMID:25564288

  6. Children's understanding of facial expression of emotion: II. Drawing of emotion-faces.

    Science.gov (United States)

    Missaghi-Lakshman, M; Whissell, C

    1991-06-01

    67 children from Grades 2, 4, and 7 drew faces representing the emotional expressions of fear, anger, surprise, disgust, happiness, and sadness. The children themselves and 29 adults later decoded the drawings in an emotion-recognition task. Children were the more accurate decoders, and their accuracy and the accuracy of adults increased significantly for judgments of 7th-grade drawings. The emotions happy and sad were most accurately decoded. There were no significant differences associated with sex. In their drawings, children utilized a symbol system that seems to be based on a highlighting or exaggeration of features of the innately governed facial expression of emotion.

  7. The impact of emotional faces on social motivation in schizophrenia.

    Science.gov (United States)

    Radke, Sina; Pfersmann, Vera; Derntl, Birgit

    2015-10-01

    Impairments in emotion recognition and psychosocial functioning are a robust phenomenon in schizophrenia and may affect motivational behavior, particularly during socio-emotional interactions. To characterize potential deficits and their interplay, we assessed social motivation covering various facets, such as implicit and explicit approach-avoidance tendencies to facial expressions, in 27 patients with schizophrenia (SZP) and 27 matched healthy controls (HC). Moreover, emotion recognition abilities as well as self-reported behavioral activation and inhibition were evaluated. Compared to HC, SZP exhibited less pronounced approach-avoidance ratings to happy and angry expressions along with prolonged reactions during automatic approach-avoidance. Although deficits in emotion recognition were replicated, these were not associated with alterations in social motivation. Together with additional connections between psychopathology and several approach-avoidance processes, these results identify motivational impairments in SZP and suggest a complex relationship between different aspects of social motivation. In the context of specialized interventions aimed at improving social cognitive abilities in SZP, the link between such dynamic measures, motivational profiles and functional outcomes warrants further investigations, which can provide important leverage points for treatment. Crucially, our findings present first insights into the assessment and identification of target features of social motivation.

  8. Emotional dysregulation is a primary symptom in adult Attention-Deficit/Hyperactivity Disorder (ADHD).

    Science.gov (United States)

    Hirsch, Oliver; Chavanon, MiraLynn; Riechmann, Elke; Christiansen, Hanna

    2018-05-01

    Clinical observations suggest that adults have more diverse deficits than children with Attention Deficit/Hyperactivity Disorder (ADHD). These seem to entail difficulties with emotionality, self-concept and emotion regulation in particular, along with the cardinal symptoms of inattention, impulsivity, and hyperactivity for adult patients. Here, we probed a model that explicitly distinguished positive and negative affect, problems with self-concept and emotion regulation skills as distinct but correlating factors with the symptom domains of inattention, hyperactivity, and impulsivity. Participants were 213 newly diagnosed adults with ADHD (62.9% male, mean age 33.5 years). Symptoms were assessed via self-report on the Conners' Adult ADHD Rating Scales, a modified version of the Positive and Negative Affect Scale and the Emotion Regulation Skill Questionnaire. A confirmatory factor analysis with the R package lavaan, using a robust Maximum Likelihood estimator (MLR) for non-normal data, was conducted to test our new non-hierarchical 7-factor model. All calculated model-fit statistics revealed good model-fit (χ 2 /df ratio = 2.03, robust RMSEA = .07). The SRMR in our model reached .089, indicating an acceptable model fit. Factor loadings on the postulated factors had salient loadings ≥ .31 except for one item on the hyperactivity factor. Latent factor associations were especially salient between emotional dysregulation and problems with self-concept, and also partially with impulsivity/emotional lability. The three models of ADHD and emotion regulation as suggested by Shaw et al. (2014) could not be disentangled in this study, though the overall results support the model with shared neurocognitive deficits. Further, we did not separately analyze ADHD with or without comorbid disorders. As our sample of clinical cases with ADHD is highly comorbid (47.9%), other disorders than ADHD might account for the emotion regulation deficits, though a sensitivity
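
    For orientation, the approximate-fit index quoted above has a simple closed form; under one common parameterization (a generic reminder, not a formula taken from the article),

        \mathrm{RMSEA} = \sqrt{ \frac{\max(\chi^2 - df,\, 0)}{df\,(N - 1)} },

    so a χ²/df ratio of about 2.03 with N = 213 gives roughly sqrt(1.03 / 212) ≈ .07, in line with the robust RMSEA of .07 reported above.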

  9. Human sex differences in emotional processing of own-race and other-race faces.

    Science.gov (United States)

    Ran, Guangming; Chen, Xu; Pan, Yangu

    2014-06-18

    There is evidence that women and men show differences in the perception of affective facial expressions. However, none of the previous studies directly investigated sex differences in emotional processing of own-race and other-race faces. The current study addressed this issue using high time resolution event-related potential techniques. In total, data from 25 participants (13 women and 12 men) were analyzed. It was found that women showed increased N170 amplitudes to negative White faces compared with negative Chinese faces over the right hemisphere electrodes. This result suggests that women show enhanced sensitivity to other-race faces showing negative emotions (fear or disgust), which may contribute toward evolution. However, the current data showed that men had increased N170 amplitudes to happy Chinese versus happy White faces over the left hemisphere electrodes, indicating that men show enhanced sensitivity to own-race faces showing positive emotions (happiness). In this respect, men might use past pleasant emotional experiences to boost recognition of own-race faces.

  10. Emotion Regulation and Heterogeneity in Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Musser, Erica D.; Galloway-Long, Hilary S.; Frick, Paul J.; Nigg, Joel T.

    2013-01-01

    Objective: How best to capture heterogeneity in attention-deficit/hyperactivity disorder (ADHD) using biomarkers has been elusive. This study evaluated whether emotion reactivity and regulation provide a means to achieve this. Method: Participants were classified into three groups: children with ADHD plus low prosocial behavior (hypothesized to be…

  11. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces.

    Science.gov (United States)

    Guan, Lili; Zhao, Yufang; Wang, Yige; Chen, Yujie; Yang, Juan

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another's face; self-face also elicits an enhanced P3 amplitude compared to another's face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  12. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    Directory of Open Access Journals (Sweden)

    Lili Guan

    2017-08-01

    Full Text Available The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  13. Face-body integration of intense emotional expressions of victory and defeat.

    Directory of Open Access Journals (Sweden)

    Lili Wang

    Full Text Available Human facial expressions can be recognized rapidly and effortlessly. However, for intense emotions from real life, positive and negative facial expressions are difficult to discriminate and the judgment of facial expressions is biased towards simultaneously perceived body expressions. This study employed event-related potentials (ERPs) to investigate the neural dynamics involved in the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds, with either matched or mismatched emotional expressions in faces and bodies. Behavioral results showed that congruent emotional information of face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent compared to congruent stimuli. Also, a main effect of body valence on the P1 was observed, with enhanced amplitudes for the stimuli with losing compared to winning bodies. The main effect of body expression was also observed in N170 and N2, with winning bodies producing larger N170/N2 amplitudes. In the later stage, a significant interaction of congruence by body valence was found on the P3 component. Winning bodies elicited larger P3 amplitudes than losing bodies did when face and body conveyed congruent emotional signals. Beyond the knowledge based on prototypical facial and body expressions, the results of this study help us understand the complexity of emotion evaluation and categorization outside the laboratory.

  14. Facial emotion recognition in paranoid schizophrenia and autism spectrum disorder.

    Science.gov (United States)

    Sachse, Michael; Schlitt, Sabine; Hainz, Daniela; Ciaramidaro, Angela; Walter, Henrik; Poustka, Fritz; Bölte, Sven; Freitag, Christine M

    2014-11-01

    Schizophrenia (SZ) and autism spectrum disorder (ASD) share deficits in emotion processing. In order to identify convergent and divergent mechanisms, we investigated facial emotion recognition in SZ, high-functioning ASD (HFASD), and typically developed controls (TD). Different degrees of task difficulty and emotion complexity (face, eyes; basic emotions, complex emotions) were used. Two Benton tests were implemented in order to elicit potentially confounding visuo-perceptual functioning and facial processing. Nineteen participants with paranoid SZ, 22 with HFASD and 20 TD were included, aged between 14 and 33 years. Individuals with SZ were comparable to TD in all obtained emotion recognition measures, but showed reduced basic visuo-perceptual abilities. The HFASD group was impaired in the recognition of basic and complex emotions compared to both, SZ and TD. When facial identity recognition was adjusted for, group differences remained for the recognition of complex emotions only. Our results suggest that there is a SZ subgroup with predominantly paranoid symptoms that does not show problems in face processing and emotion recognition, but visuo-perceptual impairments. They also confirm the notion of a general facial and emotion recognition deficit in HFASD. No shared emotion recognition deficit was found for paranoid SZ and HFASD, emphasizing the differential cognitive underpinnings of both disorders. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Detection of Emotional Faces: Salient Physical Features Guide Effective Visual Search

    Science.gov (United States)

    Calvo, Manuel G.; Nummenmaa, Lauri

    2008-01-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent,…

  16. Recognition memory for low- and high-frequency-filtered emotional faces: Low spatial frequencies drive emotional memory enhancement, whereas high spatial frequencies drive the emotion-induced recognition bias.

    Science.gov (United States)

    Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk

    2017-07-01

    This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement-that is, better long-term memory for emotional than for neutral stimuli-and the emotion-induced recognition bias-that is, a more liberal response criterion for emotional than for neutral stimuli. Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role for the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus features account. The double dissociation in the results favors the latter account-that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.
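
    The high- and low-spatial-frequency (HSF/LSF) stimuli described above are commonly produced by low-pass filtering an image and taking the residual as the high-pass version. The sketch below shows that generic manipulation only; the synthetic image, cutoff, and re-centering step are assumptions, not the stimulus-preparation procedure of the study.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        # Hypothetical grayscale face image with values in [0, 1]; in practice this
        # would be loaded from a stimulus file rather than generated randomly.
        rng = np.random.default_rng(0)
        face = rng.random((256, 256))

        # Low-spatial-frequency (LSF) version: Gaussian blur removes fine detail;
        # sigma (in pixels) sets the effective cutoff frequency.
        lsf_face = gaussian_filter(face, sigma=8)

        # High-spatial-frequency (HSF) version: subtract the blur to keep only fine
        # detail, then add back the mean luminance so the image stays displayable.
        hsf_face = face - lsf_face + face.mean()

        print(lsf_face.shape, round(float(hsf_face.mean()), 3))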

  17. Emotional reaction facilitates the brain and behavioural impact of graphic cigarette warning labels in smokers.

    Science.gov (United States)

    Wang, An-Li; Lowen, Steven B; Romer, Daniel; Giorno, Mario; Langleben, Daniel D

    2015-05-01

    Warning labels on cigarette packages are an important venue for information about the hazards of smoking. The 2009 US Family Smoking Prevention and Tobacco Control Act mandated replacing the current text-only labels with graphic warning labels. However, labels proposed by the Food and Drug Administration (FDA) were challenged in court by the tobacco companies, who argued successfully that the proposed labels needlessly encroached on their right to free speech, in part because they included images of high emotional salience that indiscriminately frightened rather than informed consumers. We used functional MRI to examine the effects of graphic warning labels' emotional salience on smokers' brain activity and cognition. Twenty-four smokers viewed a random sequence of blocks of graphic warning labels that have been rated high or low on an 'emotional reaction' scale in previous research. We found that labels rated high on emotional reaction were better remembered, associated with reduction in the urge to smoke, and produced greater brain response in the amygdala, hippocampi, inferior frontal gyri and the insulae. Recognition memory and craving are, respectively, correlates of effectiveness of addiction-related public health communications and interventions, and amygdala activation facilitates the encoding of emotional memories. Thus, our results suggest that emotional reaction to graphic warning labels contributes to their public health impact and may be an integral part of the neural mechanisms underlying their effectiveness. Given the urgency of the debate about the constitutional risks and public health benefits of graphic warning labels, these preliminary findings warrant consideration while longitudinal clinical studies are underway. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  18. The electrophysiological effects of the serotonin 1A receptor agonist buspirone in emotional face processing.

    Science.gov (United States)

    Bernasconi, Fosco; Kometer, Michael; Pokorny, Thomas; Seifritz, Erich; Vollenweider, Franz X

    2015-04-01

    Emotional face processing is critically modulated by the serotonergic system, and serotonin (5-HT) receptor agonists impair emotional face processing. However, the specific contribution of the 5-HT1A receptor remains poorly understood. Here we investigated the spatiotemporal brain mechanisms underpinning the modulation of emotional face processing induced by buspirone, a partial 5-HT1A receptor agonist. In a psychophysical emotional face discrimination task, we observed that discrimination of fearful versus neutral faces was reduced, but not of happy versus neutral faces. Electrical neuroimaging analyses were applied to visual evoked potentials elicited by emotional face images, after placebo and buspirone administration. Buspirone modulated response strength (i.e., global field power) in the interval 230-248 ms after stimulus onset. Distributed source estimation over this time interval revealed that buspirone decreased the neural activity in the right dorsolateral prefrontal cortex that was evoked by fearful faces. These results indicate temporal and valence-specific effects of buspirone on the neuronal correlates of emotional face processing. Furthermore, the reduced neural activity in the dorsolateral prefrontal cortex in response to fearful faces suggests a reduced attention to fearful faces. Collectively, these findings provide new insights into the role of 5-HT1A receptors in emotional face processing and have implications for affective disorders that are characterized by an increased attention to negative stimuli. Copyright © 2015 Elsevier B.V. and ECNP. All rights reserved.
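
    Global field power, the response-strength measure named above, is simply the standard deviation across all electrodes at each time point of the evoked response. The following sketch illustrates the computation on synthetic data; the array shape, sampling rate, and window conversion are assumptions for demonstration only.

        import numpy as np

        # Hypothetical average evoked response: 64 electrodes x 1000 time samples,
        # sampled at 1000 Hz from -100 ms to +900 ms relative to face onset.
        rng = np.random.default_rng(0)
        erp = rng.normal(size=(64, 1000))

        # Global field power: spatial standard deviation across electrodes at every
        # time point, i.e. field strength irrespective of scalp topography.
        gfp = erp.std(axis=0)

        # A 230-248 ms post-stimulus window corresponds to samples 330-348 here.
        print("mean GFP in 230-248 ms window:", round(float(gfp[330:349].mean()), 3))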

  19. Prevalence of face recognition deficits in middle childhood.

    Science.gov (United States)

    Bennetts, Rachel J; Murray, Ebony; Boyce, Tian; Bate, Sarah

    2017-02-01

    Approximately 2-2.5% of the adult population is believed to show severe difficulties with face recognition, in the absence of any neurological injury-a condition known as developmental prosopagnosia (DP). However, to date no research has attempted to estimate the prevalence of face recognition deficits in children, possibly because there are very few child-friendly, well-validated tests of face recognition. In the current study, we examined face and object recognition in a group of primary school children (aged 5-11 years), to establish whether our tests were suitable for children and to provide an estimate of face recognition difficulties in children. In Experiment 1 (n = 184), children completed a pre-existing test of child face memory, the Cambridge Face Memory Test-Kids (CFMT-K), and a bicycle test with the same format. In Experiment 2 (n = 413), children completed three-alternative forced-choice matching tasks with faces and bicycles. All tests showed good psychometric properties. The face and bicycle tests were well matched for difficulty and showed a similar developmental trajectory. Neither the memory nor the matching tests were suitable to detect impairments in the youngest groups of children, but both tests appear suitable to screen for face recognition problems in middle childhood. In the current sample, 1.2-5.2% of children showed difficulties with face recognition; 1.2-4% showed face-specific difficulties-that is, poor face recognition with typical object recognition abilities. This is somewhat higher than previous adult estimates: It is possible that face matching tests overestimate the prevalence of face recognition difficulties in children; alternatively, some children may "outgrow" face recognition difficulties.

  20. Parsing cognitive and emotional empathy deficits for negative and positive stimuli in frontotemporal dementia.

    Science.gov (United States)

    Oliver, Lindsay D; Mitchell, Derek G V; Dziobek, Isabel; MacKinley, Julia; Coleman, Kristy; Rankin, Katherine P; Finger, Elizabeth C

    2015-01-01

    Behavioural variant frontotemporal dementia (bvFTD) is a debilitating neurodegenerative disorder characterized by frontal and temporal lobe atrophy primarily affecting social cognition and emotion, including loss of empathy. Many consider empathy to be a multidimensional construct, including cognitive empathy (the ability to adopt and understand another's perspective) and emotional empathy (the capacity to share another's emotional experience). Cognitive and emotional empathy deficits have been associated with bvFTD; however, little is known regarding the performance of patients with bvFTD on behavioural measures of emotional empathy, and whether empathic responses differ for negative versus positive stimuli. 24 patients with bvFTD and 24 healthy controls completed the Multifaceted Empathy Test (MET; Dziobek et al., 2008), a performance-based task that taps both cognitive and emotional facets of empathy, and allows for the discrimination of responses to negative versus positive realistic images. MET scores were also compared with caregiver ratings of patient behaviour on the Interpersonal Reactivity Index, which assesses patients' everyday demonstrations of perspective taking and empathic concern. Patients with bvFTD were less accurate than controls at inferring mental states for negative and positive stimuli. They also demonstrated lower levels of shared emotional experience, more positive emotional reactions, and diminished arousal to negative social stimuli relative to controls. Patients showed reduced emotional reactions to negative non-social stimuli as well. Lastly, the MET and IRI measures of emotional empathy were found to be significantly correlated within the bvFTD group. The results suggest that patients with bvFTD show a global deficit in cognitive empathy, and deficient emotional empathy for negative, but not positive, experiences. Further, a generalized emotional processing impairment for negative stimuli was observed, which could contribute to the

  1. A note on age differences in mood-congruent vs. mood-incongruent emotion processing in faces.

    Science.gov (United States)

    Voelkle, Manuel C; Ebner, Natalie C; Lindenberger, Ulman; Riediger, Michaela

    2014-01-01

    This article addresses four interrelated research questions: (1) Does experienced mood affect emotion perception in faces, and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20-31 years; middle-aged: 44-55 years; older adults: 70-81 years) were asked to provide multidimensional emotion ratings of a total of 1026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle et al., 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when being in a positive mood and less likely to do so when being in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  2. Asymmetric Engagement of Amygdala and Its Gamma Connectivity in Early Emotional Face Processing

    Science.gov (United States)

    Liu, Tai-Ying; Chen, Yong-Sheng; Hsieh, Jen-Chuen; Chen, Li-Fen

    2015-01-01

    The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (with duration around 20–30 ms), whereas activity in the left amygdala (with duration around 50–60 ms) sustained longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions and the left amygdala might be involved in decoding or evaluating expressive faces in the early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, reflecting its evolutionary significance for survival. These findings demonstrate the asymmetric engagement of bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry. PMID:25629899

  3. Lateralized hybrid faces: evidence of a valence-specific bias in the processing of implicit emotions.

    Science.gov (United States)

    Prete, Giulia; Laeng, Bruno; Tommasi, Luca

    2014-01-01

    It is well known that hemispheric asymmetries exist for both the analysis of low-level visual information (such as spatial frequency) and high-level visual information (such as emotional expressions). In this study, we assessed which of the above factors underlies perceptual laterality effects with "hybrid faces": a type of stimulus that allows testing for unaware processing of emotional expressions, in which the emotion is displayed in the low-frequency information while an image of the same face with a neutral expression is superimposed on it. Despite hybrid faces being perceived as neutral, the emotional information modulates observers' social judgements. In the present study, participants were asked to assess the friendliness of hybrid faces displayed tachistoscopically, either centrally or laterally to fixation. We found a clear influence of the hidden emotions even with lateral presentations. Happy faces were rated as more friendly and angry faces as less friendly with respect to neutral faces. In general, hybrid faces were evaluated as less friendly when they were presented in the left visual field/right hemisphere than in the right visual field/left hemisphere. The results extend the validity of the valence hypothesis to the specific domain of unaware (subcortical) emotion processing.

  4. Six week open-label reboxetine treatment in children and adolescents with attention deficit hyperactivity disorder

    Directory of Open Access Journals (Sweden)

    Arabgol F

    2007-10-01

    Full Text Available Background: Attention Deficit Hyperactivity Disorder (ADHD) is a common psychiatric disorder among children and adolescents. This disorder causes difficulties in academic, behavioral, emotional, social and family performance. Stimulants show robust efficacy and a good safety profile in children with this disorder, but a significant percentage of children with ADHD do not respond adequately to stimulants or cannot tolerate the associated adverse effects. Such difficulties highlight the need for alternative safe and effective medications in the treatment of this disorder. This open-label study assessed the effectiveness of reboxetine, a selective norepinephrine reuptake inhibitor, in children and adolescents with attention deficit hyperactivity disorder (ADHD). Methods: Fifteen child and adolescent outpatients, aged 7 to 16 years (mean ± SD = 9.72 ± 2.71), diagnosed with ADHD were enrolled in a six-week open-label study with reboxetine 4-6 mg/d. The principal outcome measure was the teacher and parent Attention Deficit Hyperactivity Disorder Rating Scale (ADHD Rating Scale). Patients were assessed by a child psychiatrist at baseline and at 2, 4 and 6 weeks after the medication was started. A side-effects questionnaire was used to detect adverse effects of reboxetine. Repeated-measures analysis of variance (ANOVA) was used to compare teacher and parent ADHD Rating Scale scores during the intervention. Results: Twelve of 15 (80%) participants completed the treatment protocol. A significant decrease in ADHD symptoms on the teacher (p=0.04) and parent (p=0.003) ADHD rating scales was noted. Adverse effects were mild to moderate in severity. The most common adverse effects were drowsiness/sedation and decreased appetite. Conclusion: The results of the current study suggest the effectiveness of reboxetine in the treatment of ADHD in children and adolescents. Double-blind, placebo-controlled studies with larger sample sizes and longer duration of intervention are indicated to rigorously

  5. The Impact of Top-Down Prediction on Emotional Face Processing in Social Anxiety

    Directory of Open Access Journals (Sweden)

    Guangming Ran

    2017-07-01

    Full Text Available There is evidence that people with social anxiety show abnormal processing of emotional faces. To investigate the impact of top-down prediction on emotional face processing in social anxiety, brain responses of participants with high and low social anxiety (LSA) were recorded using high temporal resolution event-related potential techniques while they performed a variation of an emotional face task. Behaviorally, we found an effect of prediction, with higher accuracy for predictable than unpredictable faces. Furthermore, we found that participants with high social anxiety (HSA), but not those with LSA, recognized angry faces more accurately than happy faces. For the P100 and P200 components, HSA participants showed enhanced brain activity for angry faces compared to happy faces, suggesting hypervigilance to angry faces. Importantly, HSA participants exhibited larger N170 amplitudes at right hemisphere electrodes than LSA participants when they observed unpredictable angry faces, but not when the angry faces were predictable. This probably reflects top-down prediction mitigating a deficiency in building a holistic face representation in HSA participants.

  6. Are Max-Specified Infant Facial Expressions during Face-to-Face Interaction Consistent with Differential Emotions Theory?

    Science.gov (United States)

    Matias, Reinaldo; Cohn, Jeffrey F.

    1993-01-01

    Examined infant facial expressions at two, four, and six months of age during face-to-face play and a still-face interaction with their mothers. Contrary to differential emotions theory, at no age did proportions or durations of discrete and blended negative expressions differ; they also showed different patterns of developmental change. (MM)

  7. Psilocybin modulates functional connectivity of the amygdala during emotional face discrimination.

    Science.gov (United States)

    Grimm, O; Kraehenmann, R; Preller, K H; Seifritz, E; Vollenweider, F X

    2018-04-24

    Recent studies suggest that the antidepressant effects of the psychedelic 5-HT2A receptor agonist psilocybin are mediated through its modulatory properties on prefrontal and limbic brain regions including the amygdala. To further investigate the effects of psilocybin on emotion processing networks, we studied for the first time psilocybin's acute effects on amygdala seed-to-voxel connectivity in an event-related face discrimination task in 18 healthy volunteers who received psilocybin and placebo in a double-blind balanced cross-over design. The amygdala has been implicated as a salience detector especially involved in the immediate response to emotional face content. We used beta-series amygdala seed-to-voxel connectivity during an emotional face discrimination task to elucidate the connectivity pattern of the amygdala over the entire brain. When we compared psilocybin to placebo, an increase in reaction time for all three categories of affective stimuli was found. Psilocybin decreased the connectivity between the amygdala and the striatum during angry face discrimination. During happy face discrimination, the connectivity between the amygdala and the frontal pole was decreased. No effect was seen during discrimination of fearful faces. Thus, we show psilocybin's effect as a modulator of major connectivity hubs of the amygdala. Psilocybin decreases the connectivity between important nodes linked to emotion processing such as the frontal pole and the striatum. Future studies are needed to clarify whether connectivity changes predict therapeutic effects in psychiatric patients. Copyright © 2018 Elsevier B.V. and ECNP. All rights reserved.

  8. Mixed emotions: Sensitivity to facial variance in a crowd of faces.

    Science.gov (United States)

    Haberman, Jason; Lee, Pegan; Whitney, David

    2015-01-01

    The visual system automatically represents summary information from crowds of faces, such as the average expression. This is a useful heuristic insofar as it provides critical information about the state of the world, not simply information about the state of one individual. However, the average alone is not sufficient for making decisions about how to respond to a crowd. The variance or heterogeneity of the crowd (the mixture of emotions) conveys information about the reliability of the average, which is essential for determining whether the average can be trusted. Despite its importance, the representation of variance within a crowd of faces has yet to be examined. This is addressed here in three experiments. In the first experiment, observers viewed a sample set of faces that varied in emotion, and then adjusted a subsequent set to match the variance of the sample set. To isolate variance as the summary statistic of interest, the average emotion of both sets was random. Results suggested that observers had information regarding crowd variance. The second experiment verified that this was indeed a uniquely high-level phenomenon, as observers were unable to derive the variance of an inverted set of faces as precisely as that of an upright set of faces. The third experiment replicated and extended the first two experiments using the method of constant stimuli. Together, these results show that the visual system is sensitive to emergent information about the emotional heterogeneity, or ambivalence, in crowds of faces.

  9. Effects of acute psychosocial stress on neural activity to emotional and neutral faces in a face recognition memory paradigm.

    Science.gov (United States)

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2014-12-01

    Previous studies have shown that acute psychosocial stress impairs recognition of declarative memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala, which modulates memory processes in the hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoking male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner, which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants but did not impact recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus, which was due to a stress-induced increase of neural activity to fearful and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with a stress-induced privileged processing of emotional stimuli.

  10. Visual Afterimages of Emotional Faces in High Functioning Autism

    Science.gov (United States)

    Rutherford, M. D.; Troubridge, Erin K.; Walsh, Jennifer

    2012-01-01

    Fixating an emotional facial expression can create afterimages, such that subsequent faces are seen as having the opposite expression of that fixated. Visual afterimages have been used to map the relationships among emotion categories, and this method was used here to compare ASD and matched control participants. Participants adapted to a facial…

  11. Basic and complex emotion recognition in children with autism: cross-cultural findings.

    Science.gov (United States)

    Fridenson-Hayo, Shimrit; Berggren, Steve; Lassalle, Amandine; Tal, Shahar; Pigat, Delia; Bölte, Sven; Baron-Cohen, Simon; Golan, Ofer

    2016-01-01

    Children with autism spectrum conditions (ASC) have emotion recognition (ER) deficits when tested in different expression modalities (face, voice, body). However, these findings usually focus on basic emotions, using one or two expression modalities. In addition, cultural similarities and differences in emotion recognition patterns in children with ASC have not been explored before. The current study examined the similarities and differences in the recognition of basic and complex emotions by children with ASC and typically developing (TD) controls across three cultures: Israel, Britain, and Sweden. Fifty-five children with high-functioning ASC, aged 5-9, were compared to 58 TD children. At each site, groups were matched on age, sex, and IQ. Children were tested using four tasks, examining recognition of basic and complex emotions from voice recordings, videos of facial and bodily expressions, and emotional video scenarios including all modalities in context. Compared to their TD peers, children with ASC showed emotion recognition deficits in both basic and complex emotions on all three modalities and their integration in context. Complex emotions were harder to recognize than basic emotions for the entire sample. Cross-cultural agreement was found for all major findings, with minor deviations on the face and body tasks. Our findings highlight the multimodal nature of ER deficits in ASC, which exist for basic as well as complex emotions and are relatively stable cross-culturally. Cross-cultural research has the potential to reveal both autism-specific universal deficits and the role that specific cultures play in the way empathy operates in different countries.

  12. Virtual faces expressing emotions: an initial concomitant and construct validity study.

    Science.gov (United States)

    Joyal, Christian C; Jacob, Laurence; Cigna, Marie-Hélène; Guay, Jean-Pierre; Renaud, Patrice

    2014-01-01

    Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities for both fundamental and clinical research. For instance, virtual faces allow real-time human-computer feedback between physiological measures and the virtual agent. The goal of this study was to provide an initial assessment of the concomitant and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomaticus major and corrugator supercilii muscles), and regional gaze fixation latencies (eye and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Emotions expressed by each set of stimuli were similarly recognized by both men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times on eye regions in male and female participants. Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-Computer Interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

  13. Cognitive Biases for Emotional Faces in High- and Low-Trait Depressive Participants

    Directory of Open Access Journals (Sweden)

    Yi-Hsing Hsieh

    2004-10-01

    Full Text Available This study examined the association between trait depression and information-processing biases. Thirty participants were divided into high- and low-trait depressive groups based on the median of their depressive subscale scores according to the Basic Personality Inventory. Information-processing biases were measured using a deployment-of-attention task (DOAT) and a recognition memory task (RMT). For the DOAT, participants saw one emotional face paired with a neutral face of the same person, and then were forced to choose on which face the color patch had first occurred. The percentage of participants' choices favoring the happy, angry, or sad faces represented the selective attentional bias score for each emotion, respectively. For the RMT, participants rated different types of emotional faces and subsequently discriminated old faces from new faces. The memory strength for each type of face was calculated from hit and false-positive rates, based on signal detection theory. Compared with the low-trait depressive group, the high-trait depressive group showed a negative cognitive style. This was an enhanced recognition memory for sad faces and a weakened inhibition of attending to sad faces, suggesting that those with high depressive trait may be vulnerable to interpersonal withdrawal.
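    The memory-strength measure described above follows standard signal detection theory. A minimal sketch of the computation is given below; the hit and false-alarm counts are hypothetical, and the log-linear correction for extreme rates is a common convention rather than anything specified by the study.

    ```python
    # Minimal signal-detection sketch for the recognition memory task described above:
    # memory strength d' = z(hit rate) - z(false-alarm rate).
    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        """Equal-variance d' with a log-linear correction to avoid rates of 0 or 1."""
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Hypothetical counts for one participant's responses to sad faces
    print(d_prime(hits=18, misses=2, false_alarms=4, correct_rejections=16))
    ```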

  14. Memory for faces and voices varies as a function of sex and expressed emotion.

    Science.gov (United States)

    S Cortes, Diana; Laukka, Petri; Lindahl, Christina; Fischer, Håkan

    2017-01-01

    We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.

  15. Memory for faces and voices varies as a function of sex and expressed emotion.

    Directory of Open Access Journals (Sweden)

    Diana S Cortes

    Full Text Available We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.

  16. A note on age differences in mood-congruent versus mood-incongruent emotion processing in faces

    Directory of Open Access Journals (Sweden)

    Manuel C. Voelkle

    2014-06-01

    Full Text Available This article addresses four interrelated research questions: (1) Does experienced mood affect emotion perception in faces and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20–31 years; middle-aged: 44–55 years; older adults: 70–81 years) were asked to provide multidimensional emotion ratings of a total of 1,026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle, Ebner, Lindenberger, & Riediger, 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when being in a positive mood and less likely to do so when being in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  17. Facing the Problem: Impaired Emotion Recognition During Multimodal Social Information Processing in Borderline Personality Disorder.

    Science.gov (United States)

    Niedtfeld, Inga; Defiebre, Nadine; Regenbogen, Christina; Mier, Daniela; Fenske, Sabrina; Kirsch, Peter; Lis, Stefanie; Schmahl, Christian

    2017-04-01

    Previous research has revealed alterations and deficits in facial emotion recognition in patients with borderline personality disorder (BPD). During interpersonal communication in daily life, social signals such as speech content, variation in prosody, and facial expression need to be considered simultaneously. We hypothesized that deficits in higher level integration of social stimuli contribute to difficulties in emotion recognition in BPD, and heightened arousal might explain this effect. Thirty-one patients with BPD and thirty-one healthy controls were asked to identify emotions in short video clips, which were designed to represent different combinations of the three communication channels: facial expression, speech content, and prosody. Skin conductance was recorded as a measure of sympathetic arousal, while controlling for state dissociation. Patients with BPD showed lower mean accuracy scores than healthy control subjects in all conditions comprising emotional facial expressions. This was true for the condition with facial expression only, and for the combination of all three communication channels. Electrodermal responses were enhanced in BPD only in response to auditory stimuli. In line with the major body of facial emotion recognition studies, we conclude that deficits in the interpretation of facial expressions lead to the difficulties observed in multimodal emotion processing in BPD.

  18. Time for a Change: College Students' Preference for Technology-Mediated Versus Face-to-Face Help for Emotional Distress.

    Science.gov (United States)

    Lungu, Anita; Sun, Michael

    2016-12-01

    Even with recent advances in psychological treatments and mobile technology, online computerized therapy is not yet popular. College students, with ubiquitous access to technology, experiencing high distress, and often nontreatment seekers, could be an important area for online treatment dissemination. Finding ways to reach out to college students by offering psychological interventions through technology, devices, and applications they often use might increase their engagement in treatment. This study evaluates college students' reported willingness to seek help for emotional distress through novel delivery mediums, to play computer games for learning emotional coping skills, and to disclose personal information online. We also evaluated the role of ethnicity and level of emotional distress in help-seeking patterns. A survey exploring our domains of interest and the Mental Health Inventory (MHI, used as an index of mental health) were completed by 572 students (mean age 18.7 years, predominantly Asian American, female, and freshmen in college). More participants expressed preference for online versus face-to-face professional help. We found no relationship between MHI and help-seeking preference. A third of participants were likely to disclose at least as much information online as face-to-face. Ownership of mobile technology was pervasive. Asian Americans were more likely to be nontreatment seekers than Caucasians. Most participants were interested in serious games for emotional distress. Our results suggest that college students are very open to creative ways of receiving emotional help, such as playing games and seeking emotional help online, suggesting a need for online evidence-based treatments.

  19. Perception of emotional prosody in adults with attention deficit hyperactivity disorder.

    Science.gov (United States)

    Kis, B; Guberina, N; Kraemer, M; Niklewski, F; Dziobek, I; Wiltfang, J; Abdel-Hamid, M

    2017-06-01

    Attention deficit hyperactivity disorder (ADHD) is associated with social conflicts. The purpose of this study was to explore domains of social cognition in adult patients with ADHD. The assessment of social cognition was based on established neuropsychological tests: the Tübinger Affect Battery (TAB) for prosody and the Cambridge Behaviour Scale (CBS) for empathy. The performance of adults with ADHD (N = 28) was compared with the performance of a control group (N = 29) matched according to basic demographic variables. Treatment-naïve adults with ADHD showed deficits in emotional prosody (P = 0.02) and in the ability to empathize (P 0.2). No gender differences concerning social cognitive skills were detected. ADHD is associated with social cognition impairments involving both emotional prosody and empathy. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Manifold Adaptive Label Propagation for Face Clustering.

    Science.gov (United States)

    Pei, Xiaobing; Lyu, Zehua; Chen, Changqing; Chen, Chuanbo

    2015-08-01

    In this paper, a novel label propagation (LP) method is presented, called manifold adaptive label propagation (MALP), which extends the original LP framework by integrating a sparse representation constraint into its regularization term. Like most LP methods, MALP first builds graph edges from the given data and assigns weights to those edges. The goal is to find the graph weight matrix adaptively. The key advantage of the approach is that MALP simultaneously learns the graph weight matrix and predicts the labels of the unlabeled data. The paper also derives an efficient algorithm to solve the proposed problem, and presents a kernel-space extension and a robust version of MALP. The proposed method has been applied to the problem of semi-supervised face clustering using the well-known ORL, Yale, Extended YaleB, and PIE datasets. The experimental evaluations show the effectiveness of the method.
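    The abstract does not spell out the MALP update rules, so as a point of reference the sketch below implements only the standard graph-based label propagation framework that MALP extends: a fixed Gaussian-weighted graph and the usual iterative spreading step, not the adaptive, sparsity-regularized weighting the paper proposes. All parameter names are illustrative.

    ```python
    import numpy as np

    def label_propagation(X, y, n_iter=100, alpha=0.9, sigma=1.0):
        """Standard graph-based label propagation (not the adaptive MALP variant).

        X : (n, d) feature matrix; y : (n,) integer labels, with -1 for unlabeled points.
        """
        n = X.shape[0]
        # Gaussian affinity graph with zeroed diagonal
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        W = np.exp(-d2 / (2 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        # Symmetrically normalised graph S = D^{-1/2} W D^{-1/2}
        d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1) + 1e-12)
        S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
        # One-hot label matrix Y (rows of zeros for unlabeled points)
        classes = np.unique(y[y >= 0])
        Y = np.zeros((n, classes.size))
        for j, c in enumerate(classes):
            Y[y == c, j] = 1.0
        # Iterate F <- alpha * S F + (1 - alpha) * Y, then read off predicted labels
        F = Y.copy()
        for _ in range(n_iter):
            F = alpha * (S @ F) + (1 - alpha) * Y
        return classes[F.argmax(axis=1)]
    ```

    In MALP, the fixed affinity matrix W above would instead be learned jointly with the label matrix F under a sparse representation constraint, which is the adaptive part the abstract highlights.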

  1. VIRTUAL AVATAR FOR EMOTION RECOGNITION IN PATIENTS WITH SCHIZOPHRENIA: A PILOT STUDY

    Directory of Open Access Journals (Sweden)

    Samuel Marcos Pablos

    2016-08-01

    Full Text Available Persons who suffer from schizophrenia have difficulties recognizing emotions in others' facial expressions, which affects their capabilities for social interaction and hinders their social integration. Photographic images have traditionally been used to explore emotion recognition impairments in schizophrenia patients, but such images lack the dynamism that is inherent to face-to-face social interactions. To overcome these limitations, the present work explores the use of an animated virtual face. The avatar has the appearance of a highly realistic human face and is able to express different emotions dynamically, introducing some advantages over photograph-based approaches. We present the results of a pilot study assessing the validity of the interface as a tool for clinical psychiatrists. Twenty subjects with long-standing schizophrenia and 20 control subjects were invited to recognize a set of facial emotions shown by a virtual avatar and by static images. The objective of the study is to explore the possibilities of using a realistic-looking avatar for the assessment of emotion recognition deficits in patients who suffer from schizophrenia. Our results suggest that the proposed avatar may be a suitable tool for the diagnosis and treatment of deficits in the facial recognition of emotions.

  2. Word and face recognition deficits following posterior cerebral artery stroke

    DEFF Research Database (Denmark)

    Kuhn, Christina D.; Asperud Thomsen, Johanne; Delfi, Tzvetelina

    2016-01-01

    Recent findings have challenged the existence of category-specific brain areas for perceptual processing of words and faces, suggesting the existence of a common network supporting the recognition of both. We examined the performance of patients with focal lesions in posterior cortical areas to investigate whether deficits in recognition of words and faces systematically co-occur, as would be expected if both functions rely on a common cerebral network. Seven right-handed patients with unilateral brain damage following stroke in areas supplied by the posterior cerebral artery were included (four with right hemisphere damage, three with left, tested at least 1 year post stroke). We examined word and face recognition using a delayed match-to-sample paradigm with four different categories of stimuli: cropped faces, full faces, words, and cars. Reading speed and word length effects...

  3. Fronto-Limbic Brain Dysfunction during the Regulation of Emotion in Schizophrenia.

    Directory of Open Access Journals (Sweden)

    Shaun M Eack

    Full Text Available Schizophrenia is characterized by significant and widespread impairments in the regulation of emotion. Evidence is only recently emerging regarding the neural basis of these emotion regulation impairments, and few studies have focused on the regulation of emotion during effortful cognitive processing. To examine the neural correlates of deficits in effortful emotion regulation, schizophrenia outpatients (N = 20) and age- and gender-matched healthy volunteers (N = 20) completed an emotional faces n-back task to assess the voluntary attentional control subprocess of emotion regulation during functional magnetic resonance imaging. Behavioral measures of emotional intelligence and emotion perception were administered to examine brain-behavior relationships with emotion processing outcomes. Results indicated that patients with schizophrenia demonstrated significantly greater activation in the bilateral striatum, ventromedial prefrontal, and right orbitofrontal cortices during the effortful regulation of positive emotional stimuli, and reduced activity in these same regions when regulating negative emotional information. The opposite pattern of results was observed in healthy individuals. Greater fronto-striatal response to positive emotional distractors was significantly associated with deficits in facial emotion recognition. These findings indicate that abnormalities in striatal and prefrontal cortical systems may be related to deficits in the effortful emotion regulatory process of attentional control in schizophrenia, and may significantly contribute to emotion processing deficits in the disorder.

  4. A new method for face detection in colour images for emotional bio-robots

    Institute of Scientific and Technical Information of China (English)

    HAPESHI; Kevin

    2010-01-01

    Emotional bio-robots have become a hot research topic in the last two decades. Although there has been some progress in the research, design and development of various emotional bio-robots, few of them can be used in practical applications. The study of emotional bio-robots demands multi-disciplinary cooperation, involving computer science, artificial intelligence, 3D computation, engineering system modelling, analysis and simulation, bionics engineering, automatic control, image processing and pattern recognition. Among these, face detection belongs to image processing and pattern recognition. An emotional robot must be able to recognize various objects; in particular, it is very important for a bio-robot to be able to recognize human faces in an image. In this paper, a face detection method is proposed for identifying human faces in colour images using a human skin model and an eye detection method. First, the method detects skin regions in the input colour image after normalizing its luminance. Then, all face candidates are identified using an eye detection method. Compared with existing algorithms, this method relies only on the colour and geometrical data of the human face rather than on training datasets. Experimental results show that the method is effective and fast, and that it can be applied to the development of an emotional bio-robot after further improvements to its speed and accuracy.
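    The abstract gives only the outline of the method (luminance normalization, skin-region detection from a colour model, then eye-based verification of face candidates). The sketch below illustrates just the first stage with a commonly used YCrCb chrominance threshold; the specific threshold values and the OpenCV-based implementation are assumptions for illustration, not the authors' algorithm.

    ```python
    import cv2
    import numpy as np

    def detect_skin_regions(bgr_image):
        """Rough skin-region mask via YCrCb thresholding after luminance equalisation.

        The Cr/Cb bounds below are commonly used illustrative values, not the
        thresholds from the paper.
        """
        ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
        # Normalise luminance by equalising the Y channel only
        y, cr, cb = cv2.split(ycrcb)
        y = cv2.equalizeHist(y)
        ycrcb = cv2.merge([y, cr, cb])
        # Threshold the chrominance channels to a typical skin range
        lower = np.array([0, 133, 77], dtype=np.uint8)
        upper = np.array([255, 173, 127], dtype=np.uint8)
        mask = cv2.inRange(ycrcb, lower, upper)
        # Clean up the mask with a morphological opening
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        return mask

    # Usage: mask = detect_skin_regions(cv2.imread("frame.png"))
    ```

    Connected components of such a mask would then serve as the face candidates that the paper's eye detection stage verifies.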

  5. [Recognition of facial emotions and theory of mind in schizophrenia: could the theory of mind deficit be due to the non-recognition of facial emotions?].

    Science.gov (United States)

    Besche-Richard, C; Bourrin-Tisseron, A; Olivier, M; Cuervo-Lombard, C-V; Limosin, F

    2012-06-01

    Deficits in the recognition of facial emotions and in the attribution of mental states are now well documented in schizophrenic patients. However, the link between these two complex cognitive functions, especially in schizophrenia, is not clearly understood. In this study, we tested the link between the recognition of facial emotions and mentalizing capacities, notably the attribution of beliefs, in healthy and schizophrenic participants. We hypothesized that performance in the recognition of facial emotions, rather than working memory or executive functioning, would be the best predictor of the capacity to attribute beliefs. Twenty clinically stabilized schizophrenic participants diagnosed according to DSM-IV-TR (mean age: 35.9 years, S.D. 9.07; mean education level: 11.15 years, S.D. 2.58), receiving neuroleptic or antipsychotic medication, participated in the study. They were matched on age (mean age: 36.3 years, S.D. 10.9) and educational level (mean educational level: 12.10, S.D. 2.25) with 30 healthy participants. All participants were evaluated with a pool of tasks testing the recognition of facial emotions (the Baron-Cohen faces), the attribution of beliefs (two first-order and two second-order stories), working memory (the WAIS-III digit span and the Corsi test) and executive functioning (Trail Making Test A and B, Wisconsin Card Sorting Test brief version). Comparing schizophrenic and healthy participants, our results confirmed a difference between performance in the recognition of facial emotions and in the attribution of beliefs. A simple linear regression showed that the recognition of facial emotions, rather than working memory or executive functioning, was the best predictor of performance on the theory of mind stories. Our results confirmed, in a sample of schizophrenic patients, the deficits in the recognition of facial emotions and in the

  6. Emotion Recognition in Face and Body Motion in Bulimia Nervosa.

    Science.gov (United States)

    Dapelo, Marcela Marin; Surguladze, Simon; Morris, Robin; Tchanturia, Kate

    2017-11-01

    Social cognition has been studied extensively in anorexia nervosa (AN), but there are few studies in bulimia nervosa (BN). This study investigated the ability of people with BN to recognise emotions in ambiguous facial expressions and in body movement. Participants were 26 women with BN, who were compared with 35 with AN, and 42 healthy controls. Participants completed an emotion recognition task by using faces portraying blended emotions, along with a body emotion recognition task by using videos of point-light walkers. The results indicated that BN participants exhibited difficulties recognising disgust in less-ambiguous facial expressions, and a tendency to interpret non-angry faces as anger, compared with healthy controls. These difficulties were similar to those found in AN. There were no significant differences amongst the groups in body motion emotion recognition. The findings suggest that difficulties with disgust and anger recognition in facial expressions may be shared transdiagnostically in people with eating disorders. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  7. Investigating emotional top down modulation of ambiguous faces by single pulse TMS on early visual cortices

    Directory of Open Access Journals (Sweden)

    Zachary Adam Yaple

    2016-06-01

    Full Text Available Top-down processing is a mechanism in which memory, context and expectation are used to perceive stimuli. In this study we investigated how emotional content, induced by music mood, influences the perception of happy and sad emoticons. Using single-pulse TMS we stimulated the right occipital face area (rOFA), primary visual cortex (V1) and vertex while subjects performed a face-detection task and listened to happy and sad music. At baseline, incongruent audio-visual pairings decreased performance, demonstrating a dependence on emotion when perceiving ambiguous faces. However, performance on face identification decreased during rOFA stimulation regardless of emotional content. No effects were found for V1 or vertex (Cz) stimulation. These results suggest that while rOFA is important for processing faces regardless of emotion, V1 stimulation had no effect. Our findings suggest that early visual cortex activity may not integrate emotional auditory information with visual information during top-down emotional modulation of faces.

  8. Functional Brain Activation to Emotional and non-Emotional Faces in Healthy Children: Evidence for Developmentally Undifferentiated Amygdala Function During the School Age Period

    Science.gov (United States)

    Pagliaccio, David; Luby, Joan L.; Gaffrey, Michael S.; Belden, Andrew C.; Botteron, Kelly N.; Harms, Michael P.; Barch, Deanna M.

    2013-01-01

    The amygdala is a key region in emotion processing. In particular, fMRI studies have demonstrated that the amygdala is active during the viewing of emotional faces. Previous research has consistently found greater amygdala responses to fearful faces as compared to neutral faces in adults, convergent with a focus in the animal literature on the amygdala's role in fear processing. Studies have found that the amygdala also responds differentially to other facial emotion types in adults. Yet, the literature regarding when this differential amygdala responsivity develops is limited and mixed. Thus, the goal of the current study was to examine amygdala responses to emotional and neutral faces in a relatively large sample of healthy school-age children (N = 52). While the amygdala was active in response to emotional and neutral faces, the results do not support the hypothesis that the amygdala responds differentially to emotional faces in 7- to 12-year-old children. Nonetheless, amygdala activity was correlated with the severity of subclinical depression symptoms and emotional regulation skills. Additionally, sex differences were observed in frontal, temporal, and visual regions, as well as effects of pubertal development in visual regions. These findings suggest important differences in amygdala reactivity in childhood. PMID:23636982

  9. Emotional Faces in Context: Age Differences in Recognition Accuracy and Scanning Patterns

    Science.gov (United States)

    Noh, Soo Rim; Isaacowitz, Derek M.

    2014-01-01

    While age-related declines in facial expression recognition are well documented, previous research relied mostly on isolated faces devoid of context. We investigated the effects of context on age differences in recognition of facial emotions and in visual scanning patterns of emotional faces. While their eye movements were monitored, younger and older participants viewed facial expressions (i.e., anger, disgust) in contexts that were emotionally congruent, incongruent, or neutral to the facial expression to be identified. Both age groups had highest recognition rates of facial expressions in the congruent context, followed by the neutral context, and recognition rates in the incongruent context were worst. These context effects were more pronounced for older adults. Compared to younger adults, older adults exhibited a greater benefit from congruent contextual information, regardless of facial expression. Context also influenced the pattern of visual scanning characteristics of emotional faces in a similar manner across age groups. In addition, older adults initially attended more to context overall. Our data highlight the importance of considering the role of context in understanding emotion recognition in adulthood. PMID:23163713

  10. Effects of facial emotion recognition remediation on visual scanning of novel face stimuli.

    Science.gov (United States)

    Marsh, Pamela J; Luckett, Gemma; Russell, Tamara; Coltheart, Max; Green, Melissa J

    2012-11-01

    Previous research shows that emotion recognition in schizophrenia can be improved with targeted remediation that draws attention to important facial features (eyes, nose, mouth). Moreover, the effects of training have been shown to last for up to one month after training. The aim of this study was to investigate whether improved emotion recognition of novel faces is associated with concomitant changes in visual scanning of these same novel facial expressions. Thirty-nine participants with schizophrenia received emotion recognition training using Ekman's Micro-Expression Training Tool (METT), with emotion recognition and visual scanpath (VSP) recordings to face stimuli collected simultaneously. Baseline ratings of interpersonal and cognitive functioning were also collected from all participants. Post-METT training, participants showed changes in foveal attention to the features of facial expressions of emotion not used in METT training, which were generally consistent with the information about important features from the METT. In particular, there were changes in how participants looked at the features of surprised, disgusted, fearful, happy, and neutral facial expressions, demonstrating that improved emotion recognition is paralleled by changes in the way participants with schizophrenia viewed novel facial expressions of emotion. However, there were overall decreases in foveal attention to sad and neutral faces, indicating that more intensive instruction might be needed for these faces during training. Most importantly, the evidence shows that participant gender may affect training outcomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Retrieval-based Face Annotation by Weak Label Regularized Local Coordinate Coding.

    Science.gov (United States)

    Wang, Dayong; Hoi, Steven C H; He, Ying; Zhu, Jianke; Mei, Tao; Luo, Jiebo

    2013-08-02

    Retrieval-based face annotation is a promising paradigm of mining massive web facial images for automated face annotation. This paper addresses a critical problem of such a paradigm, i.e., how to effectively perform annotation by exploiting the similar facial images and their weak labels, which are often noisy and incomplete. In particular, we propose an effective Weak Label Regularized Local Coordinate Coding (WLRLCC) technique, which exploits the principle of local coordinate coding in learning sparse features, and employs the idea of graph-based weak label regularization to enhance the weak labels of the similar facial images. We present an efficient optimization algorithm to solve the WLRLCC task. We conduct extensive empirical studies on two large-scale web facial image databases: (i) a Western celebrity database with a total of 6,025 persons and 714,454 web facial images, and (ii) an Asian celebrity database with 1,200 persons and 126,070 web facial images. The encouraging results validate the efficacy of the proposed WLRLCC algorithm. To further improve the efficiency and scalability, we also propose a PCA-based approximation scheme and an offline approximation scheme (AWLRLCC), which generally maintain comparable results but significantly reduce the time cost. Finally, we show that WLRLCC can also tackle two existing face annotation tasks with promising performance.
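    The WLRLCC optimization itself is not reproduced in the abstract; the sketch below only illustrates the retrieval-based annotation idea it builds on: encode a query face over its retrieved neighbours with a locality-penalized least-squares fit, then aggregate the neighbours' weak (noisy, incomplete) labels using the resulting coefficients. All function and variable names, and the simple locality/ridge penalties, are illustrative assumptions rather than the paper's formulation.

    ```python
    import numpy as np

    def annotate_by_local_coding(query, neighbor_feats, neighbor_labels, lam=0.1):
        """Retrieval-based annotation sketch: locality-penalized coding of the query
        over retrieved neighbours, then label aggregation by the coefficients.

        query           : (d,) query face descriptor
        neighbor_feats  : (k, d) descriptors of retrieved similar faces
        neighbor_labels : (k, m) weak (noisy, incomplete) 0/1 name-label matrix
        """
        # Locality penalty: neighbours far from the query get larger regularisation
        dists = np.linalg.norm(neighbor_feats - query, axis=1)
        locality = np.diag(dists / (dists.max() + 1e-12))
        # Solve min_w ||query - B^T w||^2 + lam * ||locality @ w||^2 (+ tiny ridge)
        B = neighbor_feats
        A = B @ B.T + lam * (locality.T @ locality) + 1e-6 * np.eye(B.shape[0])
        w = np.linalg.solve(A, B @ query)
        w = np.clip(w, 0.0, None)
        w = w / (w.sum() + 1e-12)
        # Aggregate weak labels by the coding coefficients
        scores = w @ neighbor_labels
        return scores  # higher scores = more plausible name labels for the query
    ```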

  12. Emotion recognition impairment in traumatic brain injury compared with schizophrenia spectrum: similar deficits with different origins.

    Science.gov (United States)

    Mancuso, Mauro; Magnani, Nadia; Cantagallo, Anna; Rossi, Giulia; Capitani, Donatella; Galletti, Vania; Cardamone, Giuseppe; Robertson, Ian Hamilton

    2015-02-01

    The aim of our study was to identify the common and separate mechanisms that might underpin emotion recognition impairment in patients with traumatic brain injury (TBI) and schizophrenia (Sz) compared with healthy controls (HCs). We recruited 21 Sz outpatients, 24 severe TBI outpatients, and 38 HCs, and we used eye-tracking to compare facial emotion processing performance. Both Sz and TBI patients were significantly poorer at recognizing facial emotions compared with HCs. Sz patients showed a different way of exploring the Pictures of Facial Affect stimuli and were significantly worse at recognizing neutral expressions. Selective or sustained attention deficits in TBI may reduce efficient emotion recognition, whereas in Sz, there is a more strategic deficit underlying the observed problem. There would seem to be scope for adjustment of effective rehabilitative training focused on emotion recognition.

  13. Schizophrenia and Category-Selectivity in the Brain: Normal for Faces but Abnormal for Houses

    Directory of Open Access Journals (Sweden)

    Lisa Kronbichler

    2018-02-01

    Full Text Available Face processing is regularly found to be impaired in schizophrenia (SZ), thus suggesting that social malfunctioning might be caused by dysfunctional face processing. Most studies have focused on emotional face processing, whereas non-emotional face processing has received less attention. While current reports on abnormal face processing in SZ are mixed, examinations of non-emotional face processing compared to adequate control stimuli may clarify whether SZ is characterized by a face-processing deficit. Patients with SZ (n = 28) and healthy controls (n = 30) engaged in an fMRI scan in which images of non-emotional faces and houses were presented. A simple inverted-picture detection task ensured the participants' attention. Region of interest (ROI) analyses were conducted on face-sensitive regions including the fusiform face area, the occipital face area, and the superior temporal sulcus. Scene-sensitivity was assessed in the parahippocampal place area (PPA) and served as a control condition. Patients did not show aberrant face-related neural processes in face-sensitive regions. This finding was also evident when analyses were done on individually defined ROIs or on in-house-localizer ROIs. Patients revealed a decreased specificity toward house stimuli, as reflected in a decreased neural response to houses in the PPA. Again, this result was supported by supplementary analyses. Neural activation to neutral faces was not found to be impaired in SZ, therefore speaking against an overall face-processing deficit. Aberrant activation in the scene-sensitive PPA is also found in assessments of memory processes in SZ. It is up to future studies to show how impairments in the PPA relate to functional outcome in SZ.

  14. Emotional expectations influence neural sensitivity to fearful faces in humans:An event-related potential study

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The present study tested whether neural sensitivity to salient emotional facial expressions was influenced by emotional expectations induced by a cue that validly predicted the expression of a subsequently presented target face. Event-related potentials (ERPs) elicited by fearful and neutral faces were recorded while participants performed a gender discrimination task under cued (‘expected’) and uncued (‘unexpected’) conditions. The behavioral results revealed that accuracy was lower for fearful compared with neutral faces in the unexpected condition, while accuracy was similar for fearful and neutral faces in the expected condition. ERP data revealed increased amplitudes in the P2 component and 200–250 ms interval for unexpected fearful versus neutral faces. By contrast, ERP responses were similar for fearful and neutral faces in the expected condition. These findings indicate that human neural sensitivity to fearful faces is modulated by emotional expectations. Although the neural system is sensitive to unpredictable emotionally salient stimuli, sensitivity to salient stimuli is reduced when these stimuli are predictable.

  15. A new selective developmental deficit: Impaired object recognition with normal face recognition.

    Science.gov (United States)

    Germine, Laura; Cashdollar, Nathan; Düzel, Emrah; Duchaine, Bradley

    2011-05-01

    Studies of developmental deficits in face recognition, or developmental prosopagnosia, have shown that individuals who have not suffered brain damage can show face recognition impairments coupled with normal object recognition (Duchaine and Nakayama, 2005; Duchaine et al., 2006; Nunn et al., 2001). However, no developmental cases with the opposite dissociation (normal face recognition with impaired object recognition) have been reported. The existence of a case of non-face developmental visual agnosia would indicate that the development of normal face recognition mechanisms does not rely on the development of normal object recognition mechanisms. To see whether a developmental variant of non-face visual object agnosia exists, we conducted a series of web-based object and face recognition tests to screen for individuals showing object recognition memory impairments but not face recognition impairments. Through this screening process, we identified AW, an otherwise normal 19-year-old female, who was then tested in the lab on face and object recognition tests. AW's performance was impaired in within-class visual recognition memory across six different visual categories (guns, horses, scenes, tools, doors, and cars). In contrast, she scored normally on seven tests of face recognition, tests of memory for two other object categories (houses and glasses), and tests of recall memory for visual shapes. Testing confirmed that her impairment was not related to a general deficit in lower-level perception, object perception, basic-level recognition, or memory. AW's results provide the first neuropsychological evidence that recognition memory for non-face visual object categories can be selectively impaired in individuals without brain damage or other memory impairment. These results indicate that the development of recognition memory for faces does not depend on intact object recognition memory and provide further evidence for category-specific dissociations in visual

  16. Faces and bodies: perception and mimicry of emotionally congruent and incongruent facial and bodily expressions

    Directory of Open Access Journals (Sweden)

    Mariska eKret

    2013-02-01

    Full Text Available Traditional emotion theories stress the importance of the face in the expression of emotions, but bodily expressions are becoming increasingly important. Here we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emotionally congruent and incongruent face-body compounds. Participants' fixations were measured and their pupil size recorded with eye-tracking equipment, and their facial reactions measured with electromyography (EMG). The behavioral results support our prediction that the recognition of a facial expression is improved in the context of a matching posture and, importantly, also vice versa. From their facial expressions, it appeared that observers reacted with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus activity) to happy facial expressions. What we predicted and found was that angry and fearful cues from the face or the body attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion to facial expressions and that the congruency between the emotional signals from the face and body improves the recognition of the emotion.

  17. Putting the face in context: Body expressions impact facial emotion processing in human infants

    Directory of Open Access Journals (Sweden)

    Purva Rajhans

    2016-06-01

    Full Text Available Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs. We primed infants with body postures (fearful, happy that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.

  18. Veiled emotions: the effect of covered faces on emotion perception and attitudes

    NARCIS (Netherlands)

    Fischer, A.H.; Gillebaart, M.; Rotteveel, M.; Becker, D.; Vliek, M.

    2012-01-01

    The present study explores the effects of the relative absence of expressive cues and of contextual cues on the perception of emotions, and the effect of that perception on attitudes. The visibility of expressive cues was manipulated by showing films displaying female targets whose faces were either fully visible, covered

  19. Facial emotion linked cooperation in patients with paranoid schizophrenia: a test on the Interpersonal Communication Model.

    Science.gov (United States)

    Tse, Wai S; Yan Lu; Bond, Alyson J; Chan, Raymond Ck; Tam, Danny W H

    2011-09-01

    Patients with schizophrenia consistently show deficits in facial affect perception and social behaviours. It would, however, be premature to conclude that these deficits in facial affect perception cause poor social behaviours. The present research aims to study how facial affects influence ingratiation, cooperation and punishment behaviours of the patients. Forty outpatients with paranoid schizophrenia, 26 matched depressed patients and 46 healthy volunteers were recruited. After measurement of clinical symptoms and depression, their facial emotion recognition, neurocognitive functioning and facial-affect-dependent cooperative behaviour were measured using a modified version of the Mixed-Motive Game. The depressed control group showed demographic characteristics, depression levels and neurocognitive functioning similar to the schizophrenic group. Patients with schizophrenia committed significantly more errors in neutral face identification than the other two groups. They were significantly more punitive on the Mixed-Motive Game in the neutral face condition. Neutral face misidentification was a unique emotion-processing deficit in the schizophrenic group. Their increase in punitive behaviours in the neutral face condition might confuse their family members and trigger more expressed emotion from them, thus increasing the risk of relapse. Family members might display more happy faces to promote positive relationships with patients.

  20. Music-Elicited Emotion Identification Using Optical Flow Analysis of Human Face

    Science.gov (United States)

    Kniaz, V. V.; Smirnova, Z. N.

    2015-05-01

    Human emotion identification from image sequences is in high demand nowadays. Possible applications range from the automatic smile-shutter function of consumer-grade digital cameras to Biofied Building technologies, which enable communication between a building's spaces and its residents. The highly perceptual nature of human emotions makes their classification and identification complex. The main difficulty arises from the subjective quality of the emotional classification of events that elicit human emotions. A variety of methods for the formal classification of emotions have been developed in music psychology. This work focuses on the identification of human emotions evoked by musical pieces using human face tracking and optical flow analysis. A facial feature tracking algorithm used for estimating facial feature speed and position is presented. Facial features were extracted from each image sequence using human face tracking with local binary pattern (LBP) features. Accurate relative speeds of the facial features were estimated using optical flow analysis. The obtained relative positions and speeds were used as the output facial emotion vector. The algorithm was tested using original software and recorded image sequences. The proposed technique provides robust identification of human emotions elicited by musical pieces. The estimated models could be used for human emotion identification from image sequences in fields such as emotion-based background music selection or mood-dependent radio.
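
    A minimal sketch of the general tracking-plus-optical-flow approach this record describes, assuming OpenCV (opencv-python) and NumPy are available. The Haar face detector, the corner-based feature selection, and the placeholder video path are illustrative stand-ins for the authors' LBP-based tracker and recordings, not their actual implementation.

        # Sketch: track facial feature points across frames and estimate their
        # relative speeds with sparse Lucas-Kanade optical flow.
        # "video.mp4" and the Haar cascade are placeholder inputs.
        import cv2
        import numpy as np

        cap = cv2.VideoCapture("video.mp4")
        face_det = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        ok, prev = cap.read()
        prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

        # Restrict feature selection to the first detected face region.
        x, y, w, h = face_det.detectMultiScale(prev_gray, 1.3, 5)[0]
        mask = np.zeros_like(prev_gray)
        mask[y:y + h, x:x + w] = 255
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                      qualityLevel=0.01, minDistance=7, mask=mask)

        speeds = []  # mean feature speed per frame (pixels/frame)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
            good_new, good_old = new_pts[status == 1], pts[status == 1]
            speeds.append(float(np.mean(np.linalg.norm(good_new - good_old, axis=1))))
            prev_gray, pts = gray, good_new.reshape(-1, 1, 2)

        # Positions and speeds like these would be concatenated into the
        # "facial emotion vector" fed to a downstream emotion model.
        print(speeds[:10])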

  1. Adolescents' emotional competence is associated with parents' neural sensitivity to emotions.

    Science.gov (United States)

    Telzer, Eva H; Qu, Yang; Goldenberg, Diane; Fuligni, Andrew J; Galván, Adriana; Lieberman, Matthew D

    2014-01-01

    An essential component of youths' successful development is learning to appropriately respond to emotions, including the ability to recognize, identify, and describe one's feelings. Such emotional competence is thought to arise through the parent-child relationship. Yet, the mechanisms by which parents transmit emotional competence to their children are difficult to measure because they are often implicit, idiosyncratic, and not easily articulated by parents or children. In the current study, we used a multifaceted approach that went beyond self-report measures and examined whether parental neural sensitivity to emotions predicted their child's emotional competence. Twenty-two adolescent-parent dyads completed an fMRI scan during which they labeled the emotional expressions of negatively valenced faces. Results indicate that parents who recruited the amygdala, VLPFC, and brain regions involved in mentalizing (i.e., inferring others' emotional states) had adolescent children with greater emotional competence. These results held after controlling for parents' self-reports of emotional expressivity and adolescents' self-reports of the warmth and support of their parent relationships. In addition, adolescents recruited neural regions involved in mentalizing during affect labeling, which significantly mediated the association between parental neural sensitivity and adolescents' emotional competence, suggesting that youth are modeling or referencing their parents' emotional profiles, thereby contributing to better emotional competence.
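
    The mediation result reported above can be illustrated with a simple regression-based indirect-effect calculation. The sketch below assumes statsmodels and uses simulated stand-ins for parental neural sensitivity (x), adolescent mentalizing activity (m), and adolescent emotional competence (y); it is a generic two-regression mediation estimate, not the analysis pipeline used in the study.

        # Sketch: indirect (mediated) effect of x on y through m via two OLS fits.
        # Simulated data; a real analysis would add bootstrapped confidence intervals.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        x = rng.normal(size=22)                                   # predictor
        m = 0.6 * x + rng.normal(scale=0.5, size=22)              # mediator
        y = 0.5 * m + 0.1 * x + rng.normal(scale=0.5, size=22)    # outcome

        a = sm.OLS(m, sm.add_constant(x)).fit().params[1]          # path x -> m
        fit_y = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()
        b, c_prime = fit_y.params[2], fit_y.params[1]              # m -> y, and direct x -> y

        print("indirect effect a*b:", round(a * b, 3),
              "| direct effect c':", round(c_prime, 3))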

  2. Adolescents’ emotional competence is associated with parents’ neural sensitivity to emotions

    Directory of Open Access Journals (Sweden)

    Eva H Telzer

    2014-07-01

    Full Text Available An essential component of youths’ successful development is learning to appropriately respond to emotions, including the ability to recognize, identify, and describe one’s feelings. Such emotional competence is thought to arise through the parent-child relationship. Yet, the mechanisms by which parents transmit emotional competence to their children are difficult to measure because they are often implicit, idiosyncratic, and not easily articulated by parents or children. In the current study, we used a multifaceted approach that went beyond self-report measures and examined whether parental neural sensitivity to emotions predicted their child’s emotional competence. Twenty-two adolescent-parent dyads completed an fMRI scan during which they labeled the emotional expressions of negatively valenced faces. Results indicate that parents who recruited the amygdala, VLPFC, and brain regions involved in mentalizing (i.e., inferring others’ emotional states) had adolescent children with greater emotional competence. These results held after controlling for parents’ self-reports of emotional expressivity and adolescents’ self-reports of the warmth and support of their parent relationships. In addition, adolescents recruited neural regions involved in mentalizing during affect labeling, which significantly mediated the association between parental neural sensitivity and adolescents’ emotional competence, suggesting that youth are modeling or referencing their parents’ emotional profiles, thereby contributing to better emotional competence.

  3. Affective Prosody Labeling in Youths with Bipolar Disorder or Severe Mood Dysregulation

    Science.gov (United States)

    Deveney, Christen M.; Brotman, Melissa A.; Decker, Ann Marie; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Background: Accurate identification of nonverbal emotional cues is essential to successful social interactions, yet most research is limited to emotional face expression labeling. Little research focuses on the processing of emotional prosody, or tone of verbal speech, in clinical populations. Methods: Using the Diagnostic Analysis of Nonverbal…

  4. Sleep promotes consolidation of emotional memory in healthy children but not in children with attention-deficit hyperactivity disorder.

    Directory of Open Access Journals (Sweden)

    Alexander Prehn-Kristensen

    Full Text Available Fronto-limbic brain activity during sleep is believed to support the consolidation of emotional memories in healthy adults. Attention deficit-hyperactivity disorder (ADHD) is accompanied by emotional deficits coincidently caused by dysfunctional interplay of fronto-limbic circuits. This study aimed to examine the role of sleep in the consolidation of emotional memory in ADHD in the context of healthy development. 16 children with ADHD, 16 healthy children, and 20 healthy adults participated in this study. Participants completed an emotional picture recognition paradigm in sleep and wake control conditions. Each condition had an immediate (baseline) and delayed (target) retrieval session. The emotional memory bias was baseline-corrected, and groups were compared in terms of sleep-dependent memory consolidation (sleep vs. wake). We observed an increased sleep-dependent emotional memory bias in healthy children compared to children with ADHD and healthy adults. Frontal oscillatory EEG activity (slow oscillations, theta) during sleep correlated negatively with emotional memory performance in children with ADHD. When combining data of healthy children and adults, correlation coefficients were positive and differed from those in children with ADHD. Since children displayed a higher frontal EEG activity than adults these data indicate a decline in sleep-related consolidation of emotional memory in healthy development. In addition, it is suggested that deficits in sleep-related selection between emotional and non-emotional memories in ADHD exacerbate emotional problems during daytime as they are often reported in ADHD.

  5. Sleep Promotes Consolidation of Emotional Memory in Healthy Children but Not in Children with Attention-Deficit Hyperactivity Disorder

    Science.gov (United States)

    Prehn-Kristensen, Alexander; Munz, Manuel; Molzow, Ina; Wilhelm, Ines; Wiesner, Christian D.; Baving, Lioba

    2013-01-01

    Fronto-limbic brain activity during sleep is believed to support the consolidation of emotional memories in healthy adults. Attention deficit-hyperactivity disorder (ADHD) is accompanied by emotional deficits coincidently caused by dysfunctional interplay of fronto-limbic circuits. This study aimed to examine the role of sleep in the consolidation of emotional memory in ADHD in the context of healthy development. 16 children with ADHD, 16 healthy children, and 20 healthy adults participated in this study. Participants completed an emotional picture recognition paradigm in sleep and wake control conditions. Each condition had an immediate (baseline) and delayed (target) retrieval session. The emotional memory bias was baseline–corrected, and groups were compared in terms of sleep-dependent memory consolidation (sleep vs. wake). We observed an increased sleep-dependent emotional memory bias in healthy children compared to children with ADHD and healthy adults. Frontal oscillatory EEG activity (slow oscillations, theta) during sleep correlated negatively with emotional memory performance in children with ADHD. When combining data of healthy children and adults, correlation coefficients were positive and differed from those in children with ADHD. Since children displayed a higher frontal EEG activity than adults these data indicate a decline in sleep-related consolidation of emotional memory in healthy development. In addition, it is suggested that deficits in sleep-related selection between emotional and non-emotional memories in ADHD exacerbate emotional problems during daytime as they are often reported in ADHD. PMID:23734235

  6. Sleep promotes consolidation of emotional memory in healthy children but not in children with attention-deficit hyperactivity disorder.

    Science.gov (United States)

    Prehn-Kristensen, Alexander; Munz, Manuel; Molzow, Ina; Wilhelm, Ines; Wiesner, Christian D; Baving, Lioba

    2013-01-01

    Fronto-limbic brain activity during sleep is believed to support the consolidation of emotional memories in healthy adults. Attention deficit-hyperactivity disorder (ADHD) is accompanied by emotional deficits coincidently caused by dysfunctional interplay of fronto-limbic circuits. This study aimed to examine the role of sleep in the consolidation of emotional memory in ADHD in the context of healthy development. 16 children with ADHD, 16 healthy children, and 20 healthy adults participated in this study. Participants completed an emotional picture recognition paradigm in sleep and wake control conditions. Each condition had an immediate (baseline) and delayed (target) retrieval session. The emotional memory bias was baseline-corrected, and groups were compared in terms of sleep-dependent memory consolidation (sleep vs. wake). We observed an increased sleep-dependent emotional memory bias in healthy children compared to children with ADHD and healthy adults. Frontal oscillatory EEG activity (slow oscillations, theta) during sleep correlated negatively with emotional memory performance in children with ADHD. When combining data of healthy children and adults, correlation coefficients were positive and differed from those in children with ADHD. Since children displayed a higher frontal EEG activity than adults these data indicate a decline in sleep-related consolidation of emotional memory in healthy development. In addition, it is suggested that deficits in sleep-related selection between emotional and non-emotional memories in ADHD exacerbate emotional problems during daytime as they are often reported in ADHD.

  7. Age-related differences in event-related potentials for early visual processing of emotional faces.

    Science.gov (United States)

    Hilimire, Matthew R; Mienaltowski, Andrew; Blanchard-Fields, Fredda; Corballis, Paul M

    2014-07-01

    With advancing age, processing resources are shifted away from negative emotional stimuli and toward positive ones. Here, we explored this 'positivity effect' using event-related potentials (ERPs). Participants identified the presence or absence of a visual probe that appeared over photographs of emotional faces. The ERPs elicited by the onsets of angry, sad, happy and neutral faces were recorded. We examined the frontocentral emotional positivity (FcEP), which is defined as a positive deflection in the waveforms elicited by emotional expressions relative to neutral faces early on in the time course of the ERP. The FcEP is thought to reflect enhanced early processing of emotional expressions. The results show that within the first 130 ms young adults show an FcEP to negative emotional expressions, whereas older adults show an FcEP to positive emotional expressions. These findings provide additional evidence that the age-related positivity effect in emotion processing can be traced to automatic processes that are evident very early in the processing of emotional facial expressions. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  8. [Abnormal processing characteristics to basic emotional faces in the early phase in children with autism spectrum disorder].

    Science.gov (United States)

    Lin, Qiong-Xi; Wu, Gui-Hua; Zhang, Ling; Wang, Zeng-Jian; Pan, Ning; Xu, Cai-Juan; Jing, Jin; Jin, Yu

    2018-02-01

    To explore the early-phase recognition ability and abnormal processing characteristics for basic emotional faces in children with autism spectrum disorder (ASD). Photos of Chinese static faces with four basic emotions (fearful, happy, angry and sad) were used as stimuli. Twenty-five ASD children and twenty-two age- and gender-matched typically developing children (normal controls) were asked to match the emotional faces with words. Event-related potential (ERP) data were recorded concurrently. In normal controls, N170 latencies for all emotions combined and for fearful faces were shorter in the left temporal region than in the right one (P<0.05), but this pattern was not observed in ASD children. Further, N170 latencies in the left temporal region of ASD children were longer than those of normal controls for all emotions combined and for fearful and happy faces (P<0.05), and their N170 latencies in the right temporal region tended to be longer than those of normal controls for angry and fearful faces. The holistic perception of emotional faces in the early cognitive processing phase is slower in ASD children than in normal controls. The lateralized response in the early phase of recognizing emotional faces may be aberrant in children with ASD.

  9. Increased deficits in emotion recognition and regulation in children and adolescents with exogenous obesity.

    Science.gov (United States)

    Percinel, Ipek; Ozbaran, Burcu; Kose, Sezen; Simsek, Damla Goksen; Darcan, Sukran

    2018-03-01

    In this study we aimed to evaluate emotion recognition and emotion regulation skills of children with exogenous obesity between the ages of 11 and 18 years and compare them with healthy controls. The Schedule for Affective Disorders and Schizophrenia for School Aged Children was used for psychiatric evaluations. Emotion recognition skills were evaluated using Faces Test and Reading the Mind in the Eyes Test. The Difficulties in Emotions Regulation Scale was used for evaluating skills of emotion regulation. Children with obesity had lower scores on Faces Test and Reading the Mind in the Eyes Test, and experienced greater difficulty in emotional regulation skills. Improved understanding of emotional recognition and emotion regulation in young people with obesity may improve their social adaptation and help in the treatment of their disorder. To the best of our knowledge, this is the first study to evaluate both emotional recognition and emotion regulation functions in obese children and obese adolescents between 11 and 18 years of age.

  10. Image-based Analysis of Emotional Facial Expressions in Full Face Transplants.

    Science.gov (United States)

    Bedeloglu, Merve; Topcu, Çagdas; Akgul, Arzu; Döger, Ela Naz; Sever, Refik; Ozkan, Ozlenen; Ozkan, Omer; Uysal, Hilmi; Polat, Ovunc; Çolak, Omer Halil

    2018-01-20

    This study aimed to determine, from photographs, the degree of development in the emotional expressions of full face transplant patients, so that a rehabilitation process can be planned accordingly in later work. As envisaged, in full face transplant cases the determination of expressions can be confused or cannot be achieved to the same degree as in the healthy control group. In order to perform image-based analysis, a control group consisting of 9 healthy males, together with 2 full-face transplant patients, participated in the study. Appearance-based Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP) methods were adopted for recognizing the neutral expression and 6 emotional expressions: angry, scared, happy, hate, confused and sad. Feature extraction was carried out using both methods and a serial combination of the two. For the performed expressions, features extracted from the most distinctive zones of the facial area, the eye and mouth regions, were used to classify the emotions, and the combination of these region features was used to improve classifier performance. Control subjects' and transplant patients' ability to perform emotional expressions was determined with a K-nearest neighbor (KNN) classifier with region-specific and method-specific decision stages, and the results were compared with the healthy group. It was observed that transplant patients do not reflect some emotional expressions, and there were confusions among expressions.
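
    The record above names Gabor wavelets, local binary patterns (LBP), and a K-nearest-neighbor (KNN) classifier. Below is a minimal sketch of that kind of region-based feature-plus-KNN pipeline using scikit-image and scikit-learn; the hypothetical loader load_region_crops(), the parameter values, and the train/test split are illustrative assumptions, not the authors' configuration.

        # Sketch: LBP + Gabor features from eye/mouth crops, classified with KNN.
        # load_region_crops() is a hypothetical helper returning grayscale
        # (eye, mouth) patch pairs and their emotion labels.
        import numpy as np
        from skimage.feature import local_binary_pattern
        from skimage.filters import gabor
        from sklearn.model_selection import train_test_split
        from sklearn.neighbors import KNeighborsClassifier

        def lbp_histogram(patch, p=8, r=1):
            """Uniform LBP histogram of one grayscale patch."""
            codes = local_binary_pattern(patch, P=p, R=r, method="uniform")
            hist, _ = np.histogram(codes, bins=p + 2, range=(0, p + 2), density=True)
            return hist

        def gabor_energy(patch, freqs=(0.1, 0.2, 0.3)):
            """Mean Gabor response magnitude at a few spatial frequencies."""
            return np.array([np.mean(np.hypot(*gabor(patch, frequency=f)))
                             for f in freqs])

        def describe(eye_patch, mouth_patch):
            # Serial combination of appearance features from both salient regions.
            return np.concatenate([lbp_histogram(eye_patch), gabor_energy(eye_patch),
                                   lbp_histogram(mouth_patch), gabor_energy(mouth_patch)])

        patches, labels = load_region_crops()   # hypothetical data loader
        X = np.stack([describe(eye, mouth) for eye, mouth in patches])
        X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3,
                                                  random_state=0)
        knn = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)
        print("accuracy:", knn.score(X_te, y_te))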

  11. Emotion recognition and oxytocin in patients with schizophrenia

    Science.gov (United States)

    Averbeck, B. B.; Bobin, T.; Evans, S.; Shergill, S. S.

    2012-01-01

    Background Studies have suggested that patients with schizophrenia are impaired at recognizing emotions. Recently, it has been shown that the neuropeptide oxytocin can have beneficial effects on social behaviors. Method To examine emotion recognition deficits in patients and see whether oxytocin could improve these deficits, we carried out two experiments. In the first experiment we recruited 30 patients with schizophrenia and 29 age- and IQ-matched control subjects, and gave them an emotion recognition task. Following this, we carried out a second experiment in which we recruited 21 patients with schizophrenia for a double-blind, placebo-controlled cross-over study of the effects of oxytocin on the same emotion recognition task. Results In the first experiment we found that patients with schizophrenia had a deficit relative to controls in recognizing emotions. In the second experiment we found that administration of oxytocin improved the ability of patients to recognize emotions. The improvement was consistent and occurred for most emotions, and was present whether patients were identifying morphed or non-morphed faces. Conclusions These data add to a growing literature showing beneficial effects of oxytocin on social–behavioral tasks, as well as clinical symptoms. PMID:21835090

  12. Neural correlates of top-down processing in emotion perception: an ERP study of emotional faces in white noise versus noise-alone stimuli.

    Science.gov (United States)

    Lee, Kyu-Yong; Lee, Tae-Ho; Yoon, So-Jeong; Cho, Yang Seok; Choi, June-Seek; Kim, Hyun Taek

    2010-06-14

    In the present study, we investigated the neural correlates underlying the perception of emotion in response to facial stimuli in order to elucidate the extent to which emotional perception is affected by top-down processes. Subjects performed a forced, two-choice emotion discrimination task towards ambiguous visual stimuli consisting of emotional faces embedded in different levels of visual white noise, including white noise-alone stimuli. ERP recordings and behavioral responses were analyzed according to the four response categories: hit, miss, false alarm and correct rejection. We observed enlarged EPN and LPP amplitudes when subjects reported seeing fearful faces, and a typical emotional EPN response in the white noise-alone conditions when fearful faces were not presented. These two ERP components, whose modulation is characteristic of emotional processing, reflected the type of emotion each individual subjectively perceived. The results suggest that top-down modulations might be indispensable for emotional perception, which consists of two distinct stages of stimulus processing in the brain. (c) 2010 Elsevier B.V. All rights reserved.

  13. Social and emotional relevance in face processing: Happy faces of future interaction partners enhance the LPP

    Directory of Open Access Journals (Sweden)

    Florian eBublatzky

    2014-07-01

    Full Text Available Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERPs) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. Social relevance was manipulated by presenting pictures of two specific face actors as future interaction partners (meet condition), whereas two other face actors remained non-relevant. As a further control condition, all stimuli were presented without specific task instructions (passive viewing condition). A within-subject design (Facial Expression x Relevance x Task) was implemented, in which randomly ordered face stimuli of four actors (2 women) from the KDEF were presented for 1 s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of instructed social relevance. Whereas the meet condition was accompanied by unspecific effects regardless of relevance (P1, EPN), viewing potential interaction partners was associated with increased LPP amplitudes. The LPP was specifically enhanced for happy facial expressions of the future interaction partners. This underscores that social relevance can impact face processing already at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories.

  14. Putting the face in context: Body expressions impact facial emotion processing in human infants.

    Science.gov (United States)

    Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias

    2016-06-01

    Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  15. No strong evidence for lateralisation of word reading and face recognition deficits following posterior brain injury

    DEFF Research Database (Denmark)

    Gerlach, Christian; Marstrand, Lisbet; Starrfelt, Randi

    2014-01-01

    Face recognition and word reading are thought to be mediated by relatively independent cognitive systems lateralized to the right and left hemisphere respectively. In this case, we should expect a higher incidence of face recognition problems in patients with right hemisphere injury and a higher......-construction, motion perception), we found that both patient groups performed significantly worse than a matched control group. In particular we found a significant number of face recognition deficits in patients with left hemisphere injury and a significant number of patients with word reading deficits following...... right hemisphere injury. This suggests that face recognition and word reading may be mediated by more bilaterally distributed neural systems than is commonly assumed....

  16. Assessment of incongruent emotions in face and voice

    NARCIS (Netherlands)

    Takagi, S.; Tabei, K.-I.; Huis in 't Veld, E.M.J.; de Gelder, B.

    2013-01-01

    Information derived from facial and vocal nonverbal expressions plays an important role in social communication in the real and virtual worlds. In the present study, we investigated cultural differences between Japanese and Dutch participants in the multisensory perception of emotion. We used a face

  17. Gender differences in the recognition of emotional faces: are men less efficient?

    Directory of Open Access Journals (Sweden)

    Ana Ruiz-Ibáñez

    2017-06-01

    Full Text Available As research on the recollection of stimuli with emotional valence indicates, emotions influence memory. Many studies of face and emotional facial expression recognition have focused on age-related (young vs. old) and gender-related (men vs. women) differences. Nevertheless, these studies have produced contradictory results, so gender involvement needs to be examined in greater depth. The main objective of our research is to analyze differences in the recognition of faces with emotional expressions between two groups of university students aged 18-30, the first composed of men and the second of women. The results showed statistically significant differences in corrected face recognition (hit rate - false alarm rate): women demonstrated better recognition than men. However, other analyzed variables, such as time or efficiency, did not provide conclusive results. Furthermore, a significant negative correlation between the time used and efficiency on the task was found in the male group. This information reinforces not only the hypothesis of a gender difference in face recognition in favor of women, but also those hypotheses that suggest different cognitive processing of facial stimuli in the two sexes. Finally, we argue for the need for further research on variables such as age or sociocultural level.
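
    For reference, the corrected recognition score mentioned in this record is simply the hit rate minus the false alarm rate. A tiny worked example with invented counts:

        # Corrected recognition = hit rate - false alarm rate (illustrative numbers).
        hits, misses = 38, 12                  # responses to old (studied) faces
        false_alarms, correct_rej = 9, 41      # responses to new (unstudied) faces

        hit_rate = hits / (hits + misses)                      # 0.76
        fa_rate = false_alarms / (false_alarms + correct_rej)  # 0.18
        print(hit_rate - fa_rate)                              # 0.58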

  18. Face memory and face recognition in children and adolescents with attention deficit hyperactivity disorder: A systematic review.

    Science.gov (United States)

    Romani, Maria; Vigliante, Miriam; Faedda, Noemi; Rossetti, Serena; Pezzuti, Lina; Guidetti, Vincenzo; Cardona, Francesco

    2018-06-01

    This review focuses on facial recognition abilities in children and adolescents with attention deficit hyperactivity disorder (ADHD). A systematic review, using PRISMA guidelines, was conducted to identify original articles published prior to May 2017 pertaining to memory, face recognition, affect recognition, facial expression recognition and recall of faces in children and adolescents with ADHD. The qualitative synthesis based on different studies shows a particular research focus on facial affect recognition, without similar attention being paid to the structural encoding involved in facial recognition. In this review, we further investigate facial recognition abilities in children and adolescents with ADHD, providing a synthesis of the results observed in the literature, examining the face recognition tasks used to assess face processing abilities in ADHD, and identifying aspects not yet explored. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. The familial basis of facial emotion recognition deficits in adolescents with conduct disorder and their unaffected relatives.

    Science.gov (United States)

    Sully, K; Sonuga-Barke, E J S; Fairchild, G

    2015-07-01

    There is accumulating evidence of impairments in facial emotion recognition in adolescents with conduct disorder (CD). However, the majority of studies in this area have only been able to demonstrate an association, rather than a causal link, between emotion recognition deficits and CD. To move closer towards understanding the causal pathways linking emotion recognition problems with CD, we studied emotion recognition in the unaffected first-degree relatives of CD probands, as well as those with a diagnosis of CD. Using a family-based design, we investigated facial emotion recognition in probands with CD (n = 43), their unaffected relatives (n = 21), and healthy controls (n = 38). We used the Emotion Hexagon task, an alternative forced-choice task using morphed facial expressions depicting the six primary emotions, to assess facial emotion recognition accuracy. Relative to controls, the CD group showed significantly impaired recognition of anger, fear, happiness, sadness and surprise. These findings suggest that emotion recognition deficits are present in adolescents who are at increased familial risk for developing antisocial behaviour, as well as those who have already developed CD. Consequently, impaired emotion recognition appears to be a viable familial risk marker or candidate endophenotype for CD.

  20. A face to remember: emotional expression modulates prefrontal activity during memory formation.

    Science.gov (United States)

    Sergerie, Karine; Lepage, Martin; Armony, Jorge L

    2005-01-15

    Emotion can exert a modulatory role on episodic memory. Several studies have shown that negative stimuli (e.g., words, pictures) are better remembered than neutral ones. Although facial expressions are powerful emotional stimuli and have been shown to influence perception and attention processes, little is known about their effect on memory. We used functional magnetic resonance imaging (fMRI) in humans to investigate the effects of expression (happy, neutral, and fearful) on prefrontal cortex (PFC) activity during the encoding of faces, using a subsequent memory effect paradigm. Our results show that activity in right PFC predicted memory for faces, regardless of expression, while a homotopic region in the left hemisphere was associated with successful encoding only for faces with an emotional expression. These findings are consistent with the proposed role of right dorsolateral PFC in successful encoding of nonverbal material, but also suggest that left DLPFC may be a site where integration of memory and emotional processes occurs. This study sheds new light on the current controversy regarding the hemispheric lateralization of PFC in memory encoding.

  1. Sad benefit in face working memory: an emotional bias of melancholic depression.

    Science.gov (United States)

    Linden, Stefanie C; Jackson, Margaret C; Subramanian, Leena; Healy, David; Linden, David E J

    2011-12-01

    Emotion biases feature prominently in cognitive theories of depression and are a focus of psychological interventions. However, there is presently no stable neurocognitive marker of altered emotion-cognition interactions in depression. One reason may be the heterogeneity of major depressive disorder. Our aim in the present study was to find an emotional bias that differentiates patients with melancholic depression from controls, and patients with melancholic from those with non-melancholic depression. We used a working memory paradigm for emotional faces, where two faces with angry, happy, neutral, sad or fearful expression had to be retained over one second. Twenty patients with melancholic depression, 20 age-, education- and gender-matched control participants and 20 patients with non-melancholic depression participated in the study. We analysed performance on the working memory task using signal detection measures. We found an interaction between group and emotion on working memory performance that was driven by the higher performance for sad faces compared to other categories in the melancholic group. We computed a measure of "sad benefit", which distinguished melancholic and non-melancholic patients with good sensitivity and specificity. However, replication studies and formal discriminant analysis will be needed in order to assess whether emotion bias in working memory may become a useful diagnostic tool to distinguish these two syndromes. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Passive and motivated perception of emotional faces: qualitative and quantitative changes in the face processing network.

    Directory of Open Access Journals (Sweden)

    Laurie R Skelly

    Full Text Available Emotionally expressive faces are processed by a distributed network of interacting sub-cortical and cortical brain regions. The components of this network have been identified and described in large part by the stimulus properties to which they are sensitive, but as face processing research matures interest has broadened to also probe dynamic interactions between these regions and top-down influences such as task demand and context. While some research has tested the robustness of affective face processing by restricting available attentional resources, it is not known whether face network processing can be augmented by increased motivation to attend to affective face stimuli. Short videos of people expressing emotions were presented to healthy participants during functional magnetic resonance imaging. Motivation to attend to the videos was manipulated by providing an incentive for improved recall performance. During the motivated condition, there was greater coherence among nodes of the face processing network, more widespread correlation between signal intensity and performance, and selective signal increases in a task-relevant subset of face processing regions, including the posterior superior temporal sulcus and right amygdala. In addition, an unexpected task-related laterality effect was seen in the amygdala. These findings provide strong evidence that motivation augments co-activity among nodes of the face processing network and the impact of neural activity on performance. These within-subject effects highlight the necessity to consider motivation when interpreting neural function in special populations, and to further explore the effect of task demands on face processing in healthy brains.

  3. Memory deficits for facial identity in patients with amnestic mild cognitive impairment (MCI).

    Science.gov (United States)

    Savaskan, Egemen; Summermatter, Daniel; Schroeder, Clemens; Schächinger, Hartmut

    2018-01-01

    Faces are among the most relevant social stimuli, revealing the identity and current emotional state of the person encountered. Deficits in facial recognition may be an early sign of cognitive decline leading to social deficits. The main objective of the present study is to investigate whether individuals with amnestic mild cognitive impairment show recognition deficits for facial identity. Thirty-seven individuals with amnestic mild cognitive impairment, multiple-domain (15 female; age: 75±8 yrs.) and forty-one healthy volunteers (24 female; age 71±6 yrs.) participated. All participants completed a human portrait memory test presenting unfamiliar faces with happy and angry emotional expressions. Five and thirty minutes later, old and new neutral faces were presented, and discrimination sensitivity (d') and response bias (C) were assessed as signal detection parameters of cued facial identity recognition. Memory performance was lower in amnestic mild cognitive impairment than in control subjects, mainly because of a response bias shifted towards an increased false alarm rate (favoring false OLD ascriptions of NEW items). In both groups, memory performance declined between the early and later testing session, and was always better for faces encoded with happy than with angry expressions. Facial identity memory is impaired in patients with amnestic mild cognitive impairment. Liberalization of the response bias may reflect a socially motivated compensatory mechanism maintaining an almost identical recognition hit rate for OLD faces in individuals with amnestic mild cognitive impairment.
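
    A minimal sketch of the signal detection parameters named in this record, discrimination sensitivity (d') and response bias (C), assuming SciPy is available. The trial counts and the log-linear correction are illustrative choices, not values or conventions taken from the study.

        # Signal detection measures for old/new recognition: d' and criterion C.
        from scipy.stats import norm

        def dprime_and_c(hits, misses, false_alarms, correct_rejections):
            # Log-linear correction avoids infinite z-scores at rates of 0 or 1
            # (one common convention; others exist).
            hit_rate = (hits + 0.5) / (hits + misses + 1.0)
            fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
            z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
            d_prime = z_hit - z_fa                 # discrimination sensitivity
            criterion = -0.5 * (z_hit + z_fa)      # response bias; negative = liberal
            return d_prime, criterion

        # A liberal bias (many false "OLD" responses to new faces) pushes C negative
        # while the hit rate can remain relatively preserved.
        print(dprime_and_c(hits=40, misses=10, false_alarms=20, correct_rejections=30))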

  4. An emotional Stroop task with faces and words. A comparison of young and older adults.

    Science.gov (United States)

    Agustí, Ana I; Satorres, Encarnación; Pitarque, Alfonso; Meléndez, Juan C

    2017-08-01

    Given the contradictions in previous studies on the changes in attentional responses produced in aging, an emotional Stroop task was used to compare young and older adults' responses to words or faces with an emotional valence. The words happy or sad were superimposed on faces expressing happiness or sadness. The emotion expressed by the word and the face could agree or not (cued and uncued trials, respectively). 85 young and 66 healthy older adults had to identify both faces and words separately, and the interference between the two types of stimuli was examined. An interference effect was observed for both types of stimuli in both groups. There was more interference on positive faces and words than on negative stimuli. Older adults had more difficulty than younger adults in focusing on positive uncued trials, whereas there was no difference between samples on negative uncued trials. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Oxytocin and social pretreatment have similar effects on processing of negative emotional faces in healthy adult males

    Directory of Open Access Journals (Sweden)

    Anna eKis

    2013-08-01

    Full Text Available Oxytocin has been shown to affect several aspects of human social cognition, including facial emotion processing. There is also evidence that social stimuli (such as eye contact) can effectively modulate endogenous oxytocin levels. In the present study we directly tested whether intranasal oxytocin administration and pretreatment with social stimuli have similar effects on face processing at the behavioural level. Subjects (N=52 healthy adult males) were presented with a set of faces with expressions of different valence (negative, neutral, positive) following different types of pretreatment (oxytocin, OT, or placebo, PL; and social interaction, Soc, or no social interaction, NSoc; N=13 in each group) and were asked to rate all faces for perceived emotion and trustworthiness. On the next day, subjects’ recognition memory was tested on a set of neutral faces, and additionally they had to again rate each face for trustworthiness and emotion. Subjects in both the OT and the Soc pretreatment groups (as compared to the PL and NSoc groups) gave higher emotion and trustworthiness scores for faces with negative emotional expressions. Moreover, 24 h later, subjects in the OT and Soc groups (unlike those in the control groups) gave lower trustworthiness scores for previously negative faces than for faces previously seen as emotionally neutral or positive. In sum, these results provide the first direct evidence of the similar effects of intranasal oxytocin administration and social stimulation on the perception of negative facial emotions as well as on the delayed recall of negative emotional information.

  6. Embodied emotion impairment in Huntington's Disease.

    Science.gov (United States)

    Trinkler, Iris; Devignevielle, Sévérine; Achaibou, Amal; Ligneul, Romain V; Brugières, Pierre; Cleret de Langavant, Laurent; De Gelder, Beatrice; Scahill, Rachael; Schwartz, Sophie; Bachoud-Lévi, Anne-Catherine

    2017-07-01

    Theories of embodied cognition suggest that perceiving an emotion involves somatovisceral and motoric re-experiencing. Here we suggest taking such an embodied stance when looking at emotion processing deficits in patients with Huntington's Disease (HD), a neurodegenerative motor disorder. The literature on these patients' emotion recognition deficit has recently been enriched by some reports of impaired emotion expression. The goal of the study was to find out if expression deficits might be linked to a more motoric level of impairment. We used electromyography (EMG) to compare voluntary emotion expression from words to emotion imitation from static face images, and spontaneous emotion mimicry in 28 HD patients and 24 matched controls. For the latter two imitation conditions, an underlying emotion understanding is not imperative (even though performance might be helped by it). EMG measures were compared to emotion recognition and to the capacity to identify and describe emotions using alexithymia questionnaires. Alexithymia questionnaires tap into the more somato-visceral or interoceptive aspects of emotion perception. Furthermore, we correlated patients' expression and recognition scores to cerebral grey matter volume using voxel-based morphometry (VBM). EMG results replicated impaired voluntary emotion expression in HD. Critically, voluntary imitation and spontaneous mimicry were equally impaired and correlated with impaired recognition. By contrast, alexithymia scores were normal, suggesting that emotion representations on the level of internal experience might be spared. Recognition correlated with brain volume in the caudate as well as in areas previously associated with shared action representations, namely somatosensory, posterior parietal, posterior superior temporal sulcus (pSTS) and subcentral sulcus. Together, these findings indicate that in these patients emotion deficits might be tied to the "motoric level" of emotion expression. Such a double

  7. Emotional Lability in Patients with Attention-Deficit/Hyperactivity Disorder: Impact of Pharmacotherapy.

    Science.gov (United States)

    Childress, Ann C; Sallee, Floyd R

    2015-08-01

    Attention-deficit/hyperactivity disorder (ADHD) is a neurobehavioral disorder defined by persistent inattention and/or hyperactivity and impulsivity. These symptoms occur more frequently and are more severe in individuals with ADHD compared with those at a similar developmental level without ADHD, and can be conceptualized as deficits in executive functioning (EF). EF includes domains of metacognition and inhibition, which influence the ability to regulate responses elicited by emotional stimuli. EF deficits can lead to emotional lability (EL), which is characterized by sudden changes in emotion and behaviors of inappropriately high intensity that may include sudden bouts of anger, dysphoria, sadness, or euphoria. EL is common and estimated to occur in about 3.3-10% of the population. Recent estimates of EL prevalence in children and adolescents with ADHD range from 38 to 75%. The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition does not include EL in diagnostic criteria for ADHD, but does include ADHD-associated features of low frustration tolerance, irritability, or mood lability. The neurobiological basis of EL is not well understood, but brain imaging studies support dividing EF into "cool" cognitive networks encompassing attention and planning activities, and "hot" motivational networks involved in temporal discounting, reward processing, and reward anticipation. Dysfunction in "hot" networks is thought to be related to EL. EL symptoms are associated with more severe ADHD and co-morbidities, have significant impact on functioning, and may respond to treatment with medications frequently used to treat ADHD. Treatment outcomes and areas for future research are discussed.

  8. From specificity to sensitivity: affective states modulate visual working memory for emotional expressive faces.

    Science.gov (United States)

    Maran, Thomas; Sachse, Pierre; Furtner, Marco

    2015-01-01

    Previous findings suggest that visual working memory (VWM) preferentially remembers angry looking faces. However, the meaning of facial actions is construed in relation to context. To date, there are no studies investigating the role of perceiver-based context when processing emotional cues in VWM. To explore the influence of affective context on VWM for faces, we conducted two experiments using both a VWM task for emotionally expressive faces and a mood induction procedure. Affective context was manipulated by unpleasant (Experiment 1) and pleasant (Experiment 2) IAPS pictures in order to induce an affect high in motivational intensity (defensive or appetitive, respectively) compared to a low arousal control condition. Results indicated specifically increased sensitivity of VWM for angry looking faces in the neutral condition. Enhanced VWM for angry faces was prevented by inducing affects of high motivational intensity. In both experiments, affective states led to a switch from specific enhancement of angry expressions in VWM to an equally sensitive representation of all emotional expressions. Our findings demonstrate that emotional expressions are of different behavioral relevance for the receiver depending on the affective context, supporting a functional organization of VWM along with flexible resource allocation. In VWM, stimulus processing adjusts to situational requirements and transitions from a specifically prioritizing default mode in predictable environments to a sensitive, hypervigilant mode in exposure to emotional events.

  9. Face and emotion expression processing and the serotonin transporter polymorphism 5-HTTLPR/rs25531.

    Science.gov (United States)

    Hildebrandt, A; Kiy, A; Reuter, M; Sommer, W; Wilhelm, O

    2016-06-01

    Face cognition, including face identity and facial expression processing, is a crucial component of socio-emotional abilities, characterizing humans as highly developed social beings. However, for these trait domains, molecular genetic studies investigating gene-behavior associations based on well-founded phenotype definitions are still rare. We examined the relationship between 5-HTTLPR/rs25531 polymorphisms - related to serotonin-reuptake - and the ability to perceive and recognize faces and emotional expressions in human faces. To this end, we conducted structural equation modeling on data from 230 young adults, obtained by using a comprehensive, multivariate task battery with maximal effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphisms with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception. Carriers of two long (L) alleles outperformed carriers of one or two S alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating discriminant validity of the relationships. We discuss the implications and possible neural mechanisms underlying these novel findings. © 2016 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  10. Different underlying mechanisms for face emotion and gender processing during feature-selective attention: Evidence from event-related potential studies.

    Science.gov (United States)

    Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei

    2017-05-01

    Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed in different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically on SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Deficits in general emotion regulation skills-Evidence of a transdiagnostic factor.

    Science.gov (United States)

    Lukas, Christian Aljoscha; Ebert, David Daniel; Fuentes, Hugo Trevisi; Caspar, Franz; Berking, Matthias

    2017-12-15

    Deficits in emotion regulation (ER) skills are discussed as a transdiagnostic factor contributing to the development and maintenance of various mental disorders. However, systematic comparisons of a broad range of ER skills across diagnostic groups that are based on comparable definitions and measures of ER are still rare. Therefore, we conducted two studies assessing a broad range of ER skills with the Emotion Regulation Skills Questionnaire in individuals meeting criteria for mental disorders (N1 = 1448; N2 = 137) and in a general population sample (N = 214). Consistent across the two studies, participants in the clinical samples reported lower general and lower specific ER skills than participants in the general population sample. Also consistent across the two studies, diagnostic subgroups of the clinical samples differed significantly with regard to general and specific ER skills. The studies provide evidence that deficits in ER are associated with various forms of psychopathology. However, mental disorders seem to differ with regard to how strongly they are linked to ER skills. © 2017 Wiley Periodicals, Inc.

  12. Grounding Context in Face Processing: Color, Emotion and Gender

    Directory of Open Access Journals (Sweden)

    Sandrine eGil

    2015-03-01

    Full Text Available In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (versus green, mixed red/green, and achromatic) background - known to be valenced - on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task on facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was complemented by subjective ratings of each colored background on five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers the effect was modulated by both the nature of the ambiguous emotion and the decoder’s gender. Overall, our findings offer evidence that color and gender have a common valence-based dimension.

  13. Recognizing Emotions: Testing an Intervention for Children with Autism Spectrum Disorders

    Science.gov (United States)

    Richard, Donna Abely; More, William; Joy, Stephen P.

    2015-01-01

    A severely impaired capacity for social interaction is one of the characteristics of individuals with autism spectrum disorder (ASD). Deficits in facial emotional recognition processing may be associated with this limitation. The Build-a-Face (BAF) art therapy intervention was developed to assist with emotional recognition through the viewing and…

  14. Detection of emotional faces: salient physical features guide effective visual search.

    Science.gov (United States)

    Calvo, Manuel G; Nummenmaa, Lauri

    2008-08-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features--especially the smiling mouth--is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2008 APA, all rights reserved).
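
    The record above relies on computationally modeled visual saliency. The sketch below implements a generic bottom-up saliency map (the spectral residual method) with NumPy and SciPy; it is not the saliency model used by the authors, only an illustration of how the saliency of face regions such as the mouth could be quantified.

        # Sketch: spectral residual saliency map (Hou & Zhang, 2007 style).
        import numpy as np
        from scipy.ndimage import gaussian_filter, uniform_filter

        def spectral_residual_saliency(gray):
            """gray: 2-D float array; returns a saliency map scaled to [0, 1]."""
            spectrum = np.fft.fft2(gray)
            log_amplitude = np.log(np.abs(spectrum) + 1e-8)
            phase = np.angle(spectrum)
            # The "residual" is the log-amplitude minus its local average.
            residual = log_amplitude - uniform_filter(log_amplitude, size=3)
            saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
            saliency = gaussian_filter(saliency, sigma=3)
            return (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-8)

        # Usage idea: average the map inside mouth vs. eye regions of each face
        # image and relate those values to fixation order and detection latency.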

  15. Vicarious Social Touch Biases Gazing at Faces and Facial Emotions.

    Science.gov (United States)

    Schirmer, Annett; Ng, Tabitha; Ebstein, Richard P

    2018-02-01

    Research has suggested that interpersonal touch promotes social processing and other-concern, and that women may respond to it more sensitively than men. In this study, we asked whether this phenomenon would extend to third-party observers who experience touch vicariously. In an eye-tracking experiment, participants (N = 64, 32 men and 32 women) viewed prime and target images with the intention of remembering them. Primes comprised line drawings of dyadic interactions with and without touch. Targets comprised two faces shown side-by-side, with one being neutral and the other being happy or sad. Analysis of prime fixations revealed that faces in touch interactions attracted longer gazing than faces in no-touch interactions. In addition, touch enhanced gazing at the area of touch in women but not men. Analysis of target fixations revealed that touch priming increased looking at both faces immediately after target onset, and subsequently, at the emotional face in the pair. Sex differences in target processing were nonsignificant. Together, the present results imply that vicarious touch biases visual attention to faces and promotes emotion sensitivity. In addition, they suggest that, compared with men, women are more aware of tactile exchanges in their environment. As such, vicarious touch appears to share important qualities with actual physical touch. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  16. Altered emotional recognition and expression in patients with Parkinson’s disease

    Directory of Open Access Journals (Sweden)

    Jin Y

    2017-11-01

    Full Text Available Yazhou Jin,* Zhiqi Mao,* Zhipei Ling, Xin Xu, Zhiyuan Zhang, Xinguang Yu Department of Neurosurgery, People’s Liberation Army General Hospital, Beijing, People’s Republic of China *These authors contributed equally to this work Background: Parkinson’s disease (PD) patients exhibit deficits in emotional recognition and expression abilities, including emotional faces and voices. The aim of this study was to explore emotional processing in pre-deep brain stimulation (pre-DBS) PD patients using two sensory modalities (visual and auditory). Methods: Fifteen PD patients who needed DBS surgery and 15 healthy, age- and gender-matched controls were recruited as participants. All participants were assessed by the Karolinska Directed Emotional Faces database 50 Faces Recognition test. Vocal recognition was evaluated by the Montreal Affective Voices database 50 Voices Recognition test. For emotional facial expression, the participants were asked to imitate five basic emotions (neutral, happiness, anger, fear, and sadness). The subjects were required to express nonverbal vocalizations of the five basic emotions. Fifteen Chinese native speakers were recruited as decoders. We recorded the accuracy of the responses, reaction time, and confidence level. Results: For emotional recognition and expression, the PD group scored lower on both facial and vocal emotional processing than did the healthy control group. There were significant differences between the two groups in both reaction time and confidence level. A significant relationship was also found between emotional recognition and emotional expression when considering all participants between the two groups together. Conclusion: The PD group exhibited poorer performance on both the recognition and expression tasks. Facial emotion deficits and vocal emotion abnormalities were associated with each other. In addition, our data allow us to speculate that emotional recognition and expression may share a common

  17. The heterogeneity of attention-deficit/hyperactivity disorder symptoms and conduct problems: Cognitive inhibition, emotion regulation, emotionality, and disorganized attachment.

    Science.gov (United States)

    Forslund, Tommie; Brocki, Karin C; Bohlin, Gunilla; Granqvist, Pehr; Eninger, Lilianne

    2016-09-01

    This study examined the contributions of several important domains of functioning to attention-deficit/hyperactivity disorder (ADHD) symptoms and conduct problems. Specifically, we investigated whether cognitive inhibition, emotion regulation, emotionality, and disorganized attachment made independent and specific contributions to these externalizing behaviour problems from a multiple pathways perspective. The study included laboratory measures of cognitive inhibition and disorganized attachment in 184 typically developing children (M age = 6 years, 10 months, SD = 1.7). Parental ratings provided measures of emotion regulation, emotionality, and externalizing behaviour problems. Results revealed that cognitive inhibition, regulation of positive emotion, and positive emotionality were independently and specifically related to ADHD symptoms. Disorganized attachment and negative emotionality formed independent and specific relations to conduct problems. Our findings support the multiple pathways perspective on ADHD, with poor regulation of positive emotion and high positive emotionality making distinct contributions to ADHD symptoms. More specifically, our results support the proposal of a temperamentally based pathway to ADHD symptoms. The findings also indicate that disorganized attachment and negative emotionality constitute pathways specific to conduct problems rather than to ADHD symptoms. © 2016 The British Psychological Society.

  18. Neurofunctional Underpinnings of Audiovisual Emotion Processing in Teens with Autism Spectrum Disorders

    Science.gov (United States)

    Doyle-Thomas, Krissy A.R.; Goldberg, Jeremy; Szatmari, Peter; Hall, Geoffrey B.C.

    2013-01-01

    Despite successful performance on some audiovisual emotion tasks, hypoactivity has been observed in frontal and temporal integration cortices in individuals with autism spectrum disorders (ASD). Little is understood about the neurofunctional network underlying this ability in individuals with ASD. Research suggests that there may be processing biases in individuals with ASD, based on their ability to obtain meaningful information from the face and/or the voice. This functional magnetic resonance imaging study examined brain activity in teens with ASD (n = 18) and typically developing controls (n = 16) during audiovisual and unimodal emotion processing. Teens with ASD had a significantly lower accuracy when matching an emotional face to an emotion label. However, no differences in accuracy were observed between groups when matching an emotional voice or face-voice pair to an emotion label. In both groups brain activity during audiovisual emotion matching differed significantly from activity during unimodal emotion matching. Between-group analyses of audiovisual processing revealed significantly greater activation in teens with ASD in a parietofrontal network believed to be implicated in attention, goal-directed behaviors, and semantic processing. In contrast, controls showed greater activity in frontal and temporal association cortices during this task. These results suggest that in the absence of engaging integrative emotional networks during audiovisual emotion matching, teens with ASD may have recruited the parietofrontal network as an alternate compensatory system. PMID:23750139

  19. Emotion improves and impairs early vision.

    Science.gov (United States)

    Bocanegra, Bruno R; Zeelenberg, René

    2009-06-01

    Recent studies indicate that emotion enhances early vision, but the generality of this finding remains unknown. Do the benefits of emotion extend to all basic aspects of vision, or are they limited in scope? Our results show that the brief presentation of a fearful face, compared with a neutral face, enhances sensitivity for the orientation of subsequently presented low-spatial-frequency stimuli, but diminishes orientation sensitivity for high-spatial-frequency stimuli. This is the first demonstration that emotion not only improves but also impairs low-level vision. The selective low-spatial-frequency benefits are consistent with the idea that emotion enhances magnocellular processing. Additionally, we suggest that the high-spatial-frequency deficits are due to inhibitory interactions between magnocellular and parvocellular pathways. Our results suggest an emotion-induced trade-off in visual processing, rather than a general improvement. This trade-off may benefit perceptual dimensions that are relevant for survival at the expense of those that are less relevant.

  20. Emotion-Cognition Interactions in Schizophrenia: Implicit and Explicit Effects of Facial Expression

    Science.gov (United States)

    Linden, Stefanie C.; Jackson, Margaret C.; Subramanian, Leena; Wolf, Claudia; Green, Paul; Healy, David; Linden, David E. J.

    2010-01-01

    Working memory (WM) and emotion classification are amongst the cognitive domains where specific deficits have been reported for patients with schizophrenia. In healthy individuals, the capacity of visual working memory is enhanced when the material to be retained is emotionally salient, particularly for angry faces. We investigated whether…

  1. The effects of modafinil, caffeine, and dextroamphetamine on judgments of simple versus complex emotional expressions following sleep deprivation.

    Science.gov (United States)

    Huck, Nathan O; McBride, Sharon A; Kendall, Athena P; Grugle, Nancy L; Killgore, William D S

    2008-04-01

    Cognitive abilities such as vigilance, attention, memory, and executive functioning can be degraded significantly following extended periods of wakefulness. Although much evidence suggests that sleep-loss induced deficits in alertness and vigilance can be reversed or mitigated by stimulants such as caffeine, it is not clear how these compounds may affect other higher level cognitive processes such as emotional perception and judgment. Following 47 h of sleep deprivation, the study examined the effect of three stimulant medications (modafinil 400 mg, dextroamphetamine 20 mg, caffeine 600 mg) or placebo on the ability of 54 healthy participants to discriminate and label simple emotional expressions versus complex affect blends (created by morphing photographs of two different affective facial expressions). For simple affective faces, neither sleep loss nor stimulant medications made any difference on the accuracy of judgments. In contrast, for complex emotion blends, all three stimulant medications significantly improved the ability to discriminate subtle aspects of emotion correctly relative to placebo, but did not differ from one another. These findings suggest that all three stimulant medications are effective at restoring some aspects of subtle affective perception.
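
    As an illustration of how affect blends of the kind described above are typically constructed, the sketch below cross-fades two pre-aligned photographs of the same poser using Python (PIL). The file names are hypothetical, and the published stimuli were produced with dedicated morphing software that also warps facial geometry; a plain pixel cross-fade is only a rough stand-in.

        # Minimal sketch: build intermediate "affect blends" by cross-fading two
        # pre-aligned expression photographs of the same poser. Illustrative only;
        # real morphing software additionally warps facial feature geometry.
        from PIL import Image

        def blend_expressions(path_a, path_b, weight_b=0.5):
            """Return an image equal to (1 - weight_b) * A + weight_b * B."""
            a = Image.open(path_a).convert("RGB")
            b = Image.open(path_b).convert("RGB").resize(a.size)
            return Image.blend(a, b, weight_b)

        if __name__ == "__main__":
            # Hypothetical file names; the 50/50 blend is the most ambiguous stimulus.
            for w in (0.25, 0.5, 0.75):
                out = blend_expressions("poser01_happy.png", "poser01_fearful.png", w)
                out.save(f"poser01_blend_{int(w * 100):02d}.png")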

  2. Acute pharmacologically induced shifts in serotonin availability abolish emotion-selective responses to negative face emotions in distinct brain networks

    DEFF Research Database (Denmark)

    Grady, Cheryl Lynn; Siebner, Hartwig R; Hornboll, Bettina

    2013-01-01

    Pharmacological manipulation of serotonin availability can alter the processing of facial expressions of emotion. Using a within-subject design, we measured the effect of serotonin on the brain's response to aversive face emotions with functional MRI while 20 participants judged the gender...... of neutral, fearful and angry faces. In three separate and counterbalanced sessions, participants received citalopram (CIT) to raise serotonin levels, underwent acute tryptophan depletion (ATD) to lower serotonin, or were studied without pharmacological challenge (Control). An analysis designed to identify...

  3. From Specificity to Sensitivity: Affective states modulate visual working memory for emotional expressive faces

    Directory of Open Access Journals (Sweden)

    Thomas Maran

    2015-08-01

    Full Text Available Previous findings suggest that visual working memory preferentially remembers angry looking faces. However, the meaning of facial actions is construed in relation to context. To date, there are no studies investigating the role of perceiver-based context when processing emotional cues in visual working memory. To explore the influence of affective context on visual working memory for faces, we conducted two experiments using both a visual working memory task for emotionally expressive faces and a mood induction procedure. Affective context was manipulated by unpleasant (Experiment 1) and pleasant (Experiment 2) IAPS pictures in order to induce an affect high in motivational intensity (defensive or appetitive, respectively) compared to a low-arousal control condition. Results indicated specifically increased sensitivity of visual working memory for angry looking faces in the neutral condition. Enhanced visual working memory for angry faces was prevented by inducing affects of high motivational intensity. In both experiments, affective states led to a switch from specific enhancement of angry expressions in visual working memory to an equally sensitive representation of all emotional expressions. Our findings demonstrate that emotional expressions are of different behavioral relevance for the receiver depending on the affective context, supporting a functional organization of visual working memory along with flexible resource allocation. In visual working memory, stimulus processing adjusts to situational requirements and transitions from a specifically prioritizing default mode in predictable environments to a sensitive, hypervigilant mode in exposure to emotional events.

  4. Interdependent mechanisms for processing gender and emotion: The special status of angry male faces

    Directory of Open Access Journals (Sweden)

    Daniel A Harris

    2016-07-01

    Full Text Available While some models of how various attributes of a face are processed have posited that face features, invariant physical cues such as gender or ethnicity as well as variant social cues such as emotion, may be processed independently (e.g., Bruce & Young, 1986), other models suggest a more distributed representation and interdependent processing (e.g., Haxby, Hoffman, & Gobbini, 2000). Here we use a contingent adaptation paradigm to investigate if mechanisms for processing the gender and emotion of a face are interdependent and symmetric across the happy-angry emotional continuum and regardless of the gender of the face. We simultaneously adapted participants to angry female faces and happy male faces (Experiment 1) or to happy female faces and angry male faces (Experiment 2). In Experiment 1 we found evidence for contingent adaptation, with simultaneous aftereffects in opposite directions: male faces were biased towards angry while female faces were biased towards happy. Interestingly, in the complementary Experiment 2 we did not find evidence for contingent adaptation, with both male and female faces biased towards angry. Our results highlight that evidence for contingent adaptation and the underlying interdependent face processing mechanisms that would allow for contingent adaptation may only be evident for certain combinations of face features. Such limits may be especially important in the case of social cues given how maladaptive it may be to stop responding to threatening information, with male angry faces considered to be the most threatening. The underlying neuronal mechanisms that could account for such asymmetric effects in contingent adaptation remain to be elucidated.

  5. Emotion Recognition in Animated Compared to Human Stimuli in Adolescents with Autism Spectrum Disorder

    Science.gov (United States)

    Brosnan, Mark; Johnson, Hilary; Grawmeyer, Beate; Chapman, Emma; Benton, Laura

    2015-01-01

    There is equivocal evidence as to whether there is a deficit in recognising emotional expressions in Autism spectrum disorder (ASD). This study compared emotion recognition in ASD in three types of emotion expression media (still image, dynamic image, auditory) across human stimuli (e.g. photo of a human face) and animated stimuli (e.g. cartoon…

  6. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    Science.gov (United States)

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  7. Word wins over Face: Emotional Stroop effect activates the frontal cortical network

    Directory of Open Access Journals (Sweden)

    Shima Ovaysikia

    2011-01-01

    Full Text Available The prefrontal cortex (PFC) has been implicated in higher order cognitive control of behaviour. Sometimes such control is executed through suppression of an unwanted response in order to avoid conflict. Conflict occurs when two simultaneously competing processes lead to different behavioral outcomes, as seen in tasks such as the anti-saccade, go/no-go and the Stroop task. We set out to examine whether different types of stimuli in a modified emotional Stroop task would cause similar interference effects as the original Stroop-colour/word, and whether the required suppression mechanism(s) would recruit similar regions of the medial PFC (mPFC). By using emotional words and emotional faces in this Stroop experiment, we examined the two well-learned automatic behaviours of word reading and recognition of face expressions. In our emotional Stroop paradigm, words were processed faster than face expressions, with incongruent trials yielding longer reaction times (RT) and a larger number of errors compared to the congruent trials. This novel Stroop effect activated the anterior and inferior regions of the mPFC, namely the anterior cingulate cortex (ACC), inferior frontal gyrus (IFG), as well as the superior frontal gyrus. Our results suggest that prepotent behaviours such as reading and recognition of face expressions are stimulus-dependent and perhaps hierarchical, hence recruiting distinct regions of the mPFC. Moreover, the faster processing of word reading compared to reporting face expressions is indicative of the formation of stronger stimulus-response (SR) associations of an over-learned behaviour compared to an instinctive one, which could alternatively be explained through the distinction between awareness and selective attention.

  8. Psilocybin with psychological support improves emotional face recognition in treatment-resistant depression.

    Science.gov (United States)

    Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L

    2018-02-01

    Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study is to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions compared with controls; following psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.

  9. Pretreatment Differences in BOLD Response to Emotional Faces Correlate with Antidepressant Response to Scopolamine.

    Science.gov (United States)

    Furey, Maura L; Drevets, Wayne C; Szczepanik, Joanna; Khanna, Ashish; Nugent, Allison; Zarate, Carlos A

    2015-03-28

    Faster acting antidepressants and biomarkers that predict treatment response are needed to facilitate the development of more effective treatments for patients with major depressive disorders. Here, we evaluate implicitly and explicitly processed emotional faces using neuroimaging to identify potential biomarkers of treatment response to the antimuscarinic scopolamine. Healthy participants (n=15) and unmedicated, depressed major depressive disorder patients (n=16) participated in a double-blind, placebo-controlled crossover infusion study using scopolamine (4 μg/kg). Before and following scopolamine, blood oxygen-level dependent signal was measured using functional MRI during a selective attention task. Two stimuli comprised of superimposed pictures of faces and houses were presented. Participants attended to one stimulus component and performed a matching task. Face emotion was modulated (happy/sad) creating implicit (attend-houses) and explicit (attend-faces) emotion processing conditions. The pretreatment difference in blood oxygen-level dependent response to happy and sad faces under implicit and explicit conditions (emotion processing biases) within a priori regions of interest was correlated with subsequent treatment response in major depressive disorder. Correlations were observed exclusively during implicit emotion processing in the regions of interest, which included the subgenual anterior cingulate. Thus, processing biases toward emotional faces prior to treatment appear to reflect the potential to respond to scopolamine. These findings replicate earlier results, highlighting the potential for pretreatment neural activity in the middle occipital cortices and subgenual anterior cingulate to inform us about the potential to respond clinically to scopolamine. Published by Oxford University Press on behalf of CINP 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  10. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    Science.gov (United States)

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus because the attentional strategies associated with promotion focus enhance performance on well-learned or innate tasks - such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced and better facial emotion recognition was observed in a promotion focus compared to a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy in a promotion focus (i.e., striving to make hits). A prevention focus had an impact on neither perceptual processing nor facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  11. Consciousness and arousal effects on emotional face processing as revealed by brain oscillations. A gamma band analysis.

    Science.gov (United States)

    Balconi, Michela; Lucchiari, Claudio

    2008-01-01

    It remains an open question whether it is possible to assign a single brain operation or psychological function for facial emotion decoding to a certain type of oscillatory activity. Gamma band activity (GBA) offers an adequate tool for studying cortical activation patterns during emotional face information processing. In the present study brain oscillations were analyzed in response to facial expression of emotions. Specifically, GBA modulation was measured when twenty subjects looked at emotional (angry, fearful, happy, and sad faces) or neutral faces in two different conditions: supraliminal (150 ms) vs subliminal (10 ms) stimulation (100 target-mask pairs for each condition). The results showed that both consciousness and significance of the stimulus in terms of arousal can modulate the power synchronization (ERD decrease) during the 150-350 ms time range: an early oscillatory event showed its peak at about 200 ms post-stimulus. GBA was enhanced by supraliminal more than subliminal elaboration, as well as more by high arousal (anger and fear) than low arousal (happiness and sadness) emotions. Finally, a left-posterior dominance for conscious elaboration was found, whereas the right hemisphere was discriminant in the processing of emotional compared with neutral faces.

  12. Effect of positive emotion on consolidation of memory for faces: the modulation of facial valence and facial gender.

    Science.gov (United States)

    Wang, Bo

    2013-01-01

    Studies have shown that emotion elicited after learning enhances memory consolidation. However, no prior studies have used facial photos as stimuli. This study examined the effect of post-learning positive emotion on consolidation of memory for faces. During learning, participants viewed neutral, positive, or negative faces. They were then assigned to a condition in which they watched either a 9-minute positive video clip or a 9-minute neutral video. Thirty minutes after learning, participants took a surprise memory test, in which they made "remember", "know", and "new" judgements. The findings are: (1) positive emotion enhanced consolidation of recognition for negative male faces, but impaired consolidation of recognition for negative female faces; (2) for males, recognition for negative faces was equivalent to that for positive faces; for females, recognition for negative faces was better than that for positive faces. Our study provides important evidence that the effect of post-learning emotion on memory consolidation can extend to facial stimuli and that such an effect can be modulated by facial valence and facial gender. The findings may shed light on establishing models concerning the influence of emotion on memory consolidation.

  13. Emotion recognition and cognitive empathy deficits in adolescent offenders revealed by context-sensitive tasks

    Directory of Open Access Journals (Sweden)

    Maria Luz Gonzalez-Gadea

    2014-10-01

    Full Text Available Emotion recognition and empathy abilities require the integration of contextual information in real-life scenarios. Previous reports have explored these domains in adolescent offenders (AOs) but have not used tasks that replicate everyday situations. In this study we included ecological measures with different levels of contextual dependence to evaluate emotion recognition and empathy in AOs relative to non-offenders, controlling for the effect of demographic variables. We also explored the influence of fluid intelligence (FI) and executive functions (EFs) in the prediction of relevant deficits in these domains. Our results showed that AOs exhibit deficits in context-sensitive measures of emotion recognition and cognitive empathy. Difficulties in these tasks were neither explained by demographic variables nor predicted by FI or EFs. However, performance on measures that included simpler stimuli or could be solved by explicit knowledge was either only partially affected by demographic variables or preserved in AOs. These findings indicate that AOs show contextual social-cognition impairments which are relatively independent of basic cognitive functioning and demographic variables.

  14. A Pilot Study Examining a Computer-Based Intervention to Improve Recognition and Understanding of Emotions in Young Children with Communication and Social Deficits.

    Science.gov (United States)

    Romero, Neri L

    2017-06-01

    A common social impairment in individuals with ASD is difficulty interpreting and or predicting emotions of others. To date, several interventions targeting teaching emotion recognition and understanding have been utilized both by researchers and practitioners. The results suggest that teaching emotion recognition is possible, but that the results do not generalize to non-instructional contexts. This study sought to replicate earlier findings of a positive impact of teaching emotion recognition using a computer-based intervention and to extend it by testing for generalization on live models in the classroom setting. Two boys and one girl, four to eight years in age, educated in self-contained classrooms for students with communication and social skills deficits, participated in this study. A multiple probe across participants design was utilized. Measures of emotion recognition and understanding were assessed at baseline, intervention, and one month post-intervention to determine maintenance effects. Social validity was assessed through parent and teacher questionnaires. All participants showed improvements in measures assessing their recognition of emotions in faces, generalized knowledge to live models, and maintained gains one month post intervention. These preliminary results are encouraging and should be utilized to inform a group design, in order to test efficacy with a larger population. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Facing emotions in narcolepsy with cataplexy: haemodynamic and behavioural responses during emotional stimulation.

    Science.gov (United States)

    de Zambotti, Massimiliano; Pizza, Fabio; Covassin, Naima; Vandi, Stefano; Cellini, Nicola; Stegagno, Luciano; Plazzi, Giuseppe

    2014-08-01

    Narcolepsy with cataplexy is a complex sleep disorder that affects the modulation of emotions: cataplexy, the key symptom of narcolepsy, is indeed strongly linked with emotions that usually trigger the episodes. Our study aimed to investigate haemodynamic and behavioural responses during emotional stimulation in narco-cataplexy. Twelve adult drug-naive narcoleptic patients (five males; age: 33.3 ± 9.4 years) and 12 healthy controls (five males; age: 30.9 ± 9.5 years) were exposed to emotional stimuli (pleasant, unpleasant and neutral pictures). Heart rate, arterial blood pressure and mean cerebral blood flow velocity of the middle cerebral arteries were continuously recorded using photoplethysmography and Doppler ultrasound. Ratings of valence and arousal and coping strategies were scored by the Self-Assessment Manikin and by questionnaires, respectively. Narcoleptic patients' haemodynamic responses to pictures overlapped with the data obtained from controls: decrease of heart rate and increase of mean cerebral blood flow velocity regardless of pictures' content, increase of systolic blood pressure during the pleasant condition, and relative reduction of heart rate during pleasant and unpleasant conditions. However, compared with controls, narcoleptic patients reported lower arousal scores during the pleasant and neutral stimulation, lower valence scores during the pleasant condition, and a lower score on the 'focus on and venting of emotions' dimension of coping. Our results suggest that adult narcoleptic patients, compared with healthy controls, inhibited their emotion-expressive behaviour during emotional stimulation, which may be related to the development of adaptive cognitive strategies to face emotions while avoiding cataplexy. © 2014 European Sleep Research Society.

  16. Specific Patterns of Emotion Recognition from Faces in Children with ASD: Results of a Cross-Modal Matching Paradigm

    Science.gov (United States)

    Golan, Ofer; Gordon, Ilanit; Fichman, Keren; Keinan, Giora

    2018-01-01

    Children with ASD show emotion recognition difficulties, as part of their social communication deficits. We examined facial emotion recognition (FER) in intellectually disabled children with ASD and in younger typically developing (TD) controls, matched on mental age. Our emotion-matching paradigm employed three different modalities: facial, vocal…

  17. Emotional face processing and flat affect in schizophrenia: functional and structural neural correlates.

    Science.gov (United States)

    Lepage, M; Sergerie, K; Benoit, A; Czechowska, Y; Dickie, E; Armony, J L

    2011-09-01

    There is a general consensus in the literature that schizophrenia causes difficulties with facial emotion perception and discrimination. Functional brain imaging studies have observed reduced limbic activity during facial emotion perception but few studies have examined the relation to flat affect severity. A total of 26 people with schizophrenia and 26 healthy controls took part in this event-related functional magnetic resonance imaging study. Sad, happy and neutral faces were presented in a pseudo-random order and participants indicated the gender of the face presented. Manual segmentation of the amygdala was performed on a structural T1 image. Both the schizophrenia group and the healthy control group rated the emotional valence of facial expressions similarly. Both groups exhibited increased brain activity during the perception of emotional faces relative to neutral ones in multiple brain regions, including multiple prefrontal regions bilaterally, the right amygdala, right cingulate cortex and cuneus. Group comparisons, however, revealed increased activity in the healthy group in the anterior cingulate, right parahippocampal gyrus and multiple visual areas. In schizophrenia, the severity of flat affect correlated significantly with neural activity in several brain areas including the amygdala and parahippocampal region bilaterally. These results suggest that many of the brain regions involved in emotional face perception, including the amygdala, are equally recruited in both schizophrenia and controls, but flat affect can also moderate activity in some other brain regions, notably in the left amygdala and parahippocampal gyrus bilaterally. There were no significant group differences in the volume of the amygdala.

  18. Emotion perception, but not affect perception, is impaired with semantic memory loss.

    Science.gov (United States)

    Lindquist, Kristen A; Gendron, Maria; Barrett, Lisa Feldman; Dickerson, Bradford C

    2014-04-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others' faces is inborn, prelinguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this article, we report findings from 3 patients with semantic dementia that cannot be explained by this "basic emotion" view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear, and sadness. These findings have important consequences for understanding the processes supporting emotion perception.

  19. Infants' Temperament and Mothers' and Fathers' Depression Predict Infants' Attention to Objects Paired with Emotional Faces.

    Science.gov (United States)

    Aktar, Evin; Mandell, Dorothy J; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E J; Bögels, Susan M

    2016-07-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others' emotional reactions to those stimuli, so called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction effects on infants' attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations of unfamiliar objects alone, before-and-after being paired with emotional (happy, sad, fearful vs. neutral) faces gazing towards (vs. away) from objects. Additionally, the associations of infants' temperament, and parents' negative affect/depression/anxiety with infants' pupil responses were explored. Both mothers and fathers of participating infants completed questionnaires about their negative affect, depression and anxiety symptoms and their infants' negative temperament. Infants allocated more attention (larger pupils) to negative vs. neutral faces when the faces were presented alone, while they allocated less attention to objects paired with emotional vs. neutral faces independent of head/gaze direction. Sad (but not fearful) temperament predicted more attention to emotional faces. Infants' sad temperament moderated the associations of mothers' depression (but not anxiety) with infants' attention to objects. Maternal depression predicted more attention to objects paired with emotional expressions in infants low in sad temperament, while it predicted less attention in infants high in sad temperament. Fathers' depression (but not anxiety) predicted more attention to objects paired with emotional expressions independent of infants' temperament. We conclude that infants' own temperamental dispositions for sadness, and their exposure to mothers' and fathers' depressed moods may influence infants' attention to emotion-object associations in social learning contexts.
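
    For context, the pupil-size index of attention used in studies such as this one is typically a baseline-corrected dilation averaged over a window after stimulus onset. The sketch below shows one conventional way to compute it with Python (pandas); the column names, baseline, and analysis window are assumptions for illustration, not the pipeline reported by the authors.

        # Minimal sketch: baseline-corrected pupil dilation per trial, then averaged
        # per subject and condition. Column names and windows are assumed, not taken
        # from the cited study.
        import pandas as pd

        BASELINE = (-0.5, 0.0)  # seconds relative to stimulus onset
        WINDOW = (0.5, 2.0)     # seconds over which dilation is averaged

        def baseline_corrected_dilation(trial):
            """trial: DataFrame with columns 'time' (s, onset at 0) and 'pupil' (mm)."""
            base = trial.loc[trial["time"].between(*BASELINE), "pupil"].mean()
            resp = trial.loc[trial["time"].between(*WINDOW), "pupil"].mean()
            return resp - base

        def condition_means(samples):
            """samples: long-format DataFrame with columns subject, condition, trial, time, pupil."""
            per_trial = (samples.groupby(["subject", "condition", "trial"])
                                .apply(baseline_corrected_dilation)
                                .rename("dilation")
                                .reset_index())
            return per_trial.groupby(["subject", "condition"])["dilation"].mean()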

  20. Child's recognition of emotions in robot's face and body

    NARCIS (Netherlands)

    Cohen, I.; Looije, R.; Neerincx, M.A.

    2011-01-01

    Social robots can comfort and support children who have to cope with chronic diseases. In previous studies, a "facial robot", the iCat, proved to show well-recognized emotional expressions that are important in social interactions. The question is if a mobile robot without a face, the Nao, can

  1. Temporal lobe structures and facial emotion recognition in schizophrenia patients and nonpsychotic relatives.

    Science.gov (United States)

    Goghari, Vina M; Macdonald, Angus W; Sponheim, Scott R

    2011-11-01

    Temporal lobe abnormalities and emotion recognition deficits are prominent features of schizophrenia and appear related to the diathesis of the disorder. This study investigated whether temporal lobe structural abnormalities were associated with facial emotion recognition deficits in schizophrenia and related to genetic liability for the disorder. Twenty-seven schizophrenia patients, 23 biological family members, and 36 controls participated. Several temporal lobe regions (fusiform, superior temporal, middle temporal, amygdala, and hippocampus) previously associated with face recognition in normative samples and found to be abnormal in schizophrenia were evaluated using volumetric analyses. Participants completed a facial emotion recognition task and an age recognition control task under time-limited and self-paced conditions. Temporal lobe volumes were tested for associations with task performance. Group status explained 23% of the variance in temporal lobe volume. Left fusiform gray matter volume was decreased by 11% in patients and 7% in relatives compared with controls. Schizophrenia patients additionally exhibited smaller hippocampal and middle temporal volumes. Patients were unable to improve facial emotion recognition performance with unlimited time to make a judgment but were able to improve age recognition performance. Patients additionally showed a relationship between reduced temporal lobe gray matter and poor facial emotion recognition. For the middle temporal lobe region, the relationship between greater volume and better task performance was specific to facial emotion recognition and not age recognition. Because schizophrenia patients exhibited a specific deficit in emotion recognition not attributable to a generalized impairment in face perception, impaired emotion recognition may serve as a target for interventions.

  2. It Is Not Just in Faces! Processing of Emotion and Intention from Biological Motion in Psychiatric Disorders

    Directory of Open Access Journals (Sweden)

    Łukasz Okruszek

    2018-02-01

    Full Text Available Social neuroscience offers a wide range of techniques that may be applied to study the social cognitive deficits that may underlie reduced social functioning—a common feature across many psychiatric disorders. At the same time, a significant proportion of research in this area has been conducted using paradigms that utilize static displays of faces or eyes. The use of point-light displays (PLDs) offers a viable alternative for studying recognition of emotion or intention inference while minimizing the amount of information presented to participants. This mini-review aims to summarize studies that have used PLD to study emotion and intention processing in schizophrenia (SCZ), affective disorders, anxiety and personality disorders, eating disorders and neurodegenerative disorders. Two main conclusions can be drawn from the reviewed studies: first, the social cognitive problems found in most of the psychiatric samples using PLD were of smaller magnitude than those found in studies presenting social information using faces or voices. Second, even though the information presented in PLDs is extremely limited, presentation of these types of stimuli is sufficient to elicit the disorder-specific, social cognitive biases (e.g., mood-congruent bias in depression, increased threat perception in anxious individuals, aberrant body size perception in eating disorders) documented using other methodologies. Taken together, these findings suggest that point-light stimuli may be a useful method of studying social information processing in psychiatry. At the same time, some limitations of using this methodology are also outlined.

  3. ‘Distracters’ do not always distract: Visual working memory for angry faces is enhanced by incidental emotional words.

    Directory of Open Access Journals (Sweden)

    Margaret Cecilia Jackson

    2012-10-01

    Full Text Available We are often required to filter out distraction in order to focus on a primary task during which working memory (WM) is engaged. Previous research has shown that negative versus neutral distracters presented during a visual WM maintenance period significantly impair memory for neutral information. However, the contents of WM are often also emotional in nature. The question we address here is how incidental information might impact upon visual WM when both this and the memory items contain emotional information. We presented emotional versus neutral words during the maintenance interval of an emotional visual WM faces task. Participants encoded two angry or happy faces into WM, and several seconds into a 9-second maintenance period a negative, positive, or neutral word was flashed on the screen three times. A single neutral test face was presented for retrieval with a face identity that was either present or absent in the preceding study array. WM for angry face identities was significantly better when an emotional (negative or positive) versus neutral (or no) word was presented. In contrast, WM for happy face identities was not significantly affected by word valence. These findings suggest that the presence of emotion within an intervening stimulus boosts the emotional value of threat-related information maintained in visual WM and thus improves performance. In addition, we show that incidental events that are emotional in nature do not always distract from an ongoing WM task.

  4. An exploration of emotional protection and regulation in nurse-patient interactions: The role of the professional face and the emotional mirror.

    Science.gov (United States)

    Cecil, Penelope; Glass, Nel

    2015-01-01

    While interpersonal styles of nurse-patient communication have become more relaxed in recent years, nurses remain challenged in emotional engagement with patients and other health professionals. In order to preserve a professional distance in patient care delivery, however slight, nurses need to be able to regulate their emotions. This research aimed to investigate nurses' perceptions of emotional protection and regulation in patient care delivery. A qualitative approach was used for the study, utilising in-depth semi-structured interviews and researcher reflective journaling. Participants were drawn from rural New South Wales. Following institutional ethics approval, five nurses were interviewed and reflective journaling commenced. The interviews and the reflective journal were transcribed verbatim. The results revealed that nurses' emotional regulation, demonstrated by a 'professional face', was an important strategy to enable delivery of quality care even though it resulted in emotional containment. Such regulation was a protective mechanism employed to look after self and was critical in situations of emotional dissonance. The results also found that nurses experience emotional dissonance in situations where they have unresolved personal emotional issues, and that the latter was an individual motivator to manage emotions in the workplace. Emotions play a pivotal role within nurse-patient relationships. The professional face can be recognised as contributing to emotional health and therefore maintaining the emotional health of nurses in practice. This study foregrounds the importance of regulating emotions and nurturing nurses' emotional health in contemporary practice.

  5. Electrophysiological correlates of emotional face processing in typically developing adults and adults with high functioning Autism

    OpenAIRE

    Barrie, Jennifer Nicole

    2012-01-01

    Emotional expressions have been found to affect various event-related potentials (ERPs). Furthermore, socio-emotional functioning is altered in individuals with autism, and a growing body of neuroimaging and electrophysiological evidence substantiates underlying neural differences for face processing in this population. However, relatively few studies have examined the time-course of emotional face processing in autism. This study examined how implicit (not the intended focus of attention) ve...

  6. Interactions among the effects of head orientation, emotional expression, and physical attractiveness on face preferences.

    Science.gov (United States)

    Main, Julie C; DeBruine, Lisa M; Little, Anthony C; Jones, Benedict C

    2010-01-01

    Previous studies have shown that preferences for direct versus averted gaze are modulated by emotional expressions and physical attractiveness. For example, preferences for direct gaze are stronger when judging happy or physically attractive faces than when judging disgusted or physically unattractive faces. Here we show that preferences for front versus three-quarter views of faces, in which gaze direction was always congruent with head orientation, are also modulated by emotional expressions and physical attractiveness; participants demonstrated preferences for front views of faces over three-quarter views of faces when judging the attractiveness of happy, physically attractive individuals, but not when judging the attractiveness of relatively unattractive individuals or those with disgusted expressions. Moreover, further analyses indicated that these interactions did not simply reflect differential perceptions of the intensity of the emotional expressions shown in each condition. Collectively, these findings present novel evidence that the effect of the direction of the attention of others on attractiveness judgments is modulated by cues to the physical attractiveness and emotional state of the depicted individual, potentially reflecting psychological adaptations for efficient allocation of social effort. These data also present the first behavioural evidence that the effect of the direction of the attention of others on attractiveness judgments reflects viewer-referenced, rather than face-referenced, coding and/or processing of gaze direction.

  7. Investigating emotion recognition and empathy deficits in Conduct Disorder using behavioural and eye-tracking methods

    OpenAIRE

    Martin-Key, Nayra Anna

    2017-01-01

    The aim of this thesis was to characterise the nature of the emotion recognition and empathy deficits observed in male and female adolescents with Conduct Disorder (CD) and varying levels of callous-unemotional (CU) traits. The first two experiments employed behavioural tasks with concurrent eye-tracking methods to explore the mechanisms underlying facial and body expression recognition deficits. Having CD and being male independently predicted poorer facial expression recognition across all ...

  8. Identification of emotions in mixed disgusted-happy faces as a function of depressive symptom severity.

    Science.gov (United States)

    Sanchez, Alvaro; Romero, Nuria; Maurage, Pierre; De Raedt, Rudi

    2017-12-01

    Interpersonal difficulties are common in depression, but their underlying mechanisms are not yet fully understood. The role of depression in the identification of mixed emotional signals with a direct interpersonal value remains unclear. The present study aimed to clarify this question. A sample of 39 individuals reporting a broad range of depression levels completed an emotion identification task where they viewed faces expressing three emotional categories (100% disgusted and 100% happy faces, as well as their morphed 50% disgusted - 50% happy exemplars). Participants were asked to identify the corresponding depicted emotion as "clearly disgusted", "mixed", or "clearly happy". Higher depression levels were associated with lower identification of positive emotions in 50% disgusted - 50% happy faces. The study was conducted with an analogue sample reporting individual differences in subclinical depression levels. Further research must replicate these findings in a clinical sample and clarify whether differential emotional identification patterns emerge in depression for different mixed negative-positive emotions (sad-happy vs. disgusted-happy). Depression may account for a lower bias to perceive positive states when ambiguous states from others include subtle signals of social threat (i.e., disgust), leading to an under-perception of positive social signals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Emotional cues enhance the attentional effects on spatial and temporal resolution.

    Science.gov (United States)

    Bocanegra, Bruno R; Zeelenberg, René

    2011-12-01

    In the present study, we demonstrated that the emotional significance of a spatial cue enhances the effect of covert attention on spatial and temporal resolution (i.e., our ability to discriminate small spatial details and fast temporal flicker). Our results indicated that fearful face cues, as compared with neutral face cues, enhanced the attentional benefits in spatial resolution but also enhanced the attentional deficits in temporal resolution. Furthermore, we observed that the overall magnitudes of individuals' attentional effects correlated strongly with the magnitude of the emotion × attention interaction effect. Combined, these findings provide strong support for the idea that emotion enhances the strength of a cue's attentional response.

  10. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion

    Directory of Open Access Journals (Sweden)

    Daiming Xiu

    2015-04-01

    Full Text Available This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data was acquired during incidental learning of positive (‘happy’), neutral, and negative (‘angry’ or ‘fearful’) faces. Dynamic Causal Modeling (DCM) was applied on the fMRI data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and successful memory formation related areas (hippocampus, superior parietal lobule, amygdala, and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feed-forward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  11. Transcutaneous vagus nerve stimulation (tVNS) enhances recognition of emotions in faces but not bodies.

    Science.gov (United States)

    Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S

    2018-02-01

    The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Face scanning in autism spectrum disorder (ASD) and attention deficit/hyperactivity disorder (ADHD): human versus dog face scanning

    Directory of Open Access Journals (Sweden)

    Mauro Muszkat

    2015-10-01

    Full Text Available This study used eye-tracking to explore attention allocation to human and dog faces in children and adolescents with autism spectrum disorder (ASD), attention deficit/hyperactivity disorder (ADHD), and typical development (TD). Significant differences were found among the three groups. TD participants looked longer at the eyes than ASD and ADHD ones, irrespective of the faces presented. In spite of this difference, groups were similar in that they looked more to the eyes than to the mouth areas of interest. The ADHD group gazed longer at the mouth region than the other groups. Furthermore, groups were also similar in that they looked more to the dog than to the human faces. The eye tracking technology proved to be useful for behavioral investigation in different neurodevelopmental disorders.
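
    The eyes-versus-mouth comparison described above reduces to summing fixation durations that fall inside rectangular areas of interest (AOIs). The sketch below shows that computation in Python (pandas); the AOI coordinates and column names are assumptions for illustration, not those used by the authors.

        # Minimal sketch: total dwell time per area of interest (AOI) from a fixation list.
        # AOI rectangles and column names are hypothetical.
        import pandas as pd

        AOIS = {"eyes": (300, 200, 700, 320),   # (x_min, y_min, x_max, y_max) in pixels
                "mouth": (380, 430, 620, 520)}

        def label_aoi(x, y):
            for name, (x0, y0, x1, y1) in AOIS.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    return name
            return "other"

        def mean_dwell_times(fixations):
            """fixations: DataFrame with columns group, subject, stimulus, x, y, duration (ms)."""
            fixations = fixations.assign(
                aoi=[label_aoi(x, y) for x, y in zip(fixations["x"], fixations["y"])]
            )
            per_subject = fixations.groupby(["group", "subject", "stimulus", "aoi"])["duration"].sum()
            # Average the per-subject dwell times within each group, stimulus type, and AOI.
            return per_subject.groupby(level=["group", "stimulus", "aoi"]).mean()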

  13. Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?

    Science.gov (United States)

    Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-01-01

    The current study investigated if deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ and age matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity or by facial identity…

  14. [Impact of facial emotional recognition alterations in Dementia of the Alzheimer type].

    Science.gov (United States)

    Rubinstein, Wanda; Cossini, Florencia; Politis, Daniel

    2016-07-01

    Face recognition of basic emotions is independent of other deficits in dementia of the Alzheimer type. Among these deficits, there is disagreement about which emotions are more difficult to recognize. Our aim was to study the presence of alterations in the process of facial recognition of basic emotions, and to investigate whether there were differences in the recognition of each type of emotion in Alzheimer's disease. With three tests of recognition of basic facial emotions we evaluated 29 patients who had been diagnosed with dementia of the Alzheimer type and 18 control subjects. Significant differences between groups were obtained on the tests of basic facial emotion recognition, as well as between the individual emotions. Since the amygdala, one of the brain structures responsible for emotional reaction, is affected in the early stages of this disease, our findings become relevant to understanding how this alteration of the process of emotional recognition impacts the difficulties these patients have in both interpersonal relations and behavioral disorders.

  15. Abnormal early gamma responses to emotional faces differentiate unipolar from bipolar disorder patients.

    Science.gov (United States)

    Liu, T Y; Chen, Y S; Su, T P; Hsieh, J C; Chen, L F

    2014-01-01

    This study investigates the cortical abnormalities of early emotion perception in patients with major depressive disorder (MDD) and bipolar disorder (BD) using gamma oscillations. Twenty-three MDD patients, twenty-five BD patients, and twenty-four normal controls were enrolled and their event-related magnetoencephalographic responses were recorded during implicit emotional tasks. Our results demonstrated abnormal gamma activity within 100 ms in the emotion-related regions (amygdala, orbitofrontal (OFC) cortex, anterior insula (AI), and superior temporal pole) in the MDD patients, suggesting that these patients may have dysfunctions or negativity biases in perceptual binding of emotional features at very early stage. Decreased left superior medial frontal cortex (smFC) responses to happy faces in the MDD patients were correlated with their serious level of depression symptoms, indicating that decreased smFC activity perhaps underlies irregular positive emotion processing in depressed patients. In the BD patients, we showed abnormal activation in visual regions (inferior/middle occipital and middle temporal cortices) which responded to emotional faces within 100 ms, supporting that the BD patients may hyperactively respond to emotional features in perceptual binding. The discriminant function of gamma activation in the left smFC, right medial OFC, right AI/inferior OFC, and the right precentral cortex accurately classified 89.6% of patients as unipolar/bipolar disorders.

  16. Dissociable patterns in the control of emotional interference in adults with attention-deficit/hyperactivity disorder (ADHD) and in adults with alcohol dependence.

    Directory of Open Access Journals (Sweden)

    Ivo Marx

    Full Text Available OBJECTIVES: To effectively manage current task demands, attention must be focused on task-relevant information while task-irrelevant information is rejected. However, in everyday life, people must cope with emotions, which may interfere with actual task demands and may challenge functional attention allocation. Control of interfering emotions has been associated with the proper functioning of the dorsolateral prefrontal cortex (DLPFC). As DLPFC dysfunction is evident in subjects with ADHD and in subjects with alcohol dependence, the current study sought to examine the bottom-up effect of emotional distraction on task performance in both disorders. METHODS: Male adults with ADHD (n = 22), male adults with alcohol dependence (n = 16), and healthy controls (n = 30) performed an emotional working memory task (n-back task). In the background of the task, we presented neutral and negative stimuli that varied in emotional saliency. RESULTS: In both clinical groups, a working memory deficit was evident. Moreover, both clinical groups displayed deficient emotional interference control. The n-back performance of the controls was not affected by the emotional distractors, whereas that of subjects with ADHD deteriorated in the presence of low salient distractors, and that of alcoholics did not deteriorate until high salient distractors were presented. Subsequent to task performance, subjects with ADHD accurately recognized more distractors than did alcoholics and controls. In alcoholics, picture recognition accuracy was negatively associated with n-back performance, suggesting a functional association between the ability to suppress emotional distractors and successful task performance. In subjects with ADHD, performance accuracy was negatively associated with ADHD inattentive symptoms, suggesting that inattention contributes to the performance deficit. CONCLUSIONS: Subjects with ADHD and alcoholics both display a deficit in emotional interference control.

  17. Motor, emotional and cognitive deficits in adult BACHD mice: A model for Huntington's disease

    NARCIS (Netherlands)

    Abada, Yah-se K.; Schreiber, Rudy; Ellenbroek, Bart

    2013-01-01

    Rationale: Huntington's disease (HD) is characterized by progressive motor dysfunction, emotional disturbances and cognitive deficits. It is a genetic disease caused by an elongation of the polyglutamine repeats in the huntingtin gene. Whereas HD is a complex disorder, previous studies in mice

  18. The effect of intranasal oxytocin on perceiving and understanding emotion on the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT).

    Science.gov (United States)

    Cardoso, Christopher; Ellenbogen, Mark A; Linnen, Anne-Marie

    2014-02-01

    Evidence suggests that intranasal oxytocin enhances the perception of emotion in facial expressions during standard emotion identification tasks. However, it is not clear whether this effect is desirable in people who do not show deficits in emotion perception. That is, a heightened perception of emotion in faces could lead to "oversensitivity" to the emotions of others in nonclinical participants. The goal of this study was to assess the effects of intranasal oxytocin on emotion perception using ecologically valid social and nonsocial visual tasks. Eighty-two participants (42 women) self-administered a 24 IU dose of intranasal oxytocin or a placebo in a double-blind, randomized experiment and then completed the perceiving and understanding emotion components of the Mayer-Salovey-Caruso Emotional Intelligence Test. In this test, emotion identification accuracy is based on agreement with a normative sample. As expected, participants administered intranasal oxytocin rated emotion in facial stimuli as expressing greater emotional intensity than those given a placebo. Consequently, accurate identification of emotion in faces, based on agreement with a normative sample, was impaired in the oxytocin group relative to placebo. No such effect was observed for tests using nonsocial stimuli. The results are consistent with the hypothesis that intranasal oxytocin enhances the salience of social stimuli in the environment, but not nonsocial stimuli. The present findings support a growing literature showing that the effects of intranasal oxytocin on social cognition can be negative under certain circumstances, in this case promoting "oversensitivity" to emotion in faces in healthy people. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  19. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    Science.gov (United States)

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of transient, peak-intensity expressions of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous, and that body cues influenced facial emotion recognition. Furthermore, eye movement records showed that the participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, which indicated that the unconscious perception effect found in Experiment 2A was valid.

  20. Maternal emotion regulation mediates the association between adult attention-deficit/hyperactivity disorder symptoms and parenting.

    Science.gov (United States)

    Mazursky-Horowitz, Heather; Felton, Julia W; MacPherson, Laura; Ehrlich, Katherine B; Cassidy, Jude; Lejuez, C W; Chronis-Tuscano, Andrea

    2015-01-01

    Mothers with elevated Attention-Deficit/Hyperactivity Disorder (ADHD) symptoms demonstrate parenting deficits, as well as difficulties in emotion regulation (ER), which may further impact their ability to effectively parent. However, no empirical research has examined potential mediators that explain the relations between maternal ADHD symptoms and parenting. This prospective longitudinal study examined difficulties with ER as a mediator of the relation between adult ADHD symptoms and parenting among 234 mothers of adolescents recruited from the community when they were between the ages of nine and twelve. Maternal ratings of adult ADHD symptoms, difficulties with ER, and parenting responses to their adolescents' expressions of negative emotions were collected over the course of three years. We found that maternal ADHD symptoms were negatively associated with positive parenting responses to adolescents' negative emotions, and positively associated with harsh parenting and maternal distress reactions. Moreover, maternal ER mediated the relation between adult ADHD symptoms and harsh parenting responses, while controlling for adolescent ADHD and disruptive behavior symptoms. However, maternal ER did not mediate the relation between ADHD symptoms and positive or distressed parental responses. Thus, it appears that ER is one mechanism by which maternal ADHD symptoms are associated with harsh responses to their adolescents' expressions of negative emotion. These findings may have downstream implications for adolescent adjustment.

  1. The insular cortex: relationship to skin conductance responses to facial expression of emotion in temporal lobe epilepsy.

    Science.gov (United States)

    Banks, Sarah J; Bellerose, Jenny; Douglas, Danielle; Jones-Gotman, Marilyn

    2014-03-01

    The insula plays an important role both in emotion processing and in the generation of epileptic seizures. In the current study we examined the thickness of the insular cortices and bilateral skin conductance responses (SCR) in healthy subjects, in addition to a small number of patients with temporal lobe epilepsy. SCR measures arousal and is used to assess non-conscious responses to emotional stimuli. We used two emotion tasks, one explicitly about emotion and the other implicit. The explicit task required judgments about the emotions being expressed in photographs of faces, while the implicit one required judgments about the age of the people in the photographs. Patients and healthy subjects differed in labeling neutral faces, but not other emotions. They also differed in their SCR to emotions, though the profile depended on which hand the recordings were from. Finally, we found relationships between the thickness of the insula and SCR in each task: in the healthy group, the thickness of the left insula was related to SCR in the emotion-labeling task; in the patient group, the thickness of the right insula was related to SCR in the age-labeling task. These patterns were evident only for the right hand recordings, thus underscoring the importance of bilateral recordings.

  2. Age-Group Differences in Interference from Young and Older Emotional Faces.

    Science.gov (United States)

    Ebner, Natalie C; Johnson, Marcia K

    2010-11-01

    Human attention is selective, focusing on some aspects of events at the expense of others. In particular, angry faces engage attention. Most studies have used pictures of young faces, even when comparing young and older age groups. Two experiments asked (1) whether task-irrelevant faces of young and older individuals with happy, angry, and neutral expressions disrupt performance on a face-unrelated task, (2) whether interference varies for faces of different ages and different facial expressions, and (3) whether young and older adults differ in this regard. Participants gave speeded responses on a number task while irrelevant faces appeared in the background. Both age groups were more distracted by own than other-age faces. In addition, young participants' responses were slower for angry than happy faces, whereas older participants' responses were slower for happy than angry faces. Factors underlying age-group differences in interference from emotional faces of different ages are discussed.

  3. Sensory contribution to vocal emotion deficit in Parkinson's disease after subthalamic stimulation.

    Science.gov (United States)

    Péron, Julie; Cekic, Sezen; Haegelen, Claire; Sauleau, Paul; Patel, Sona; Drapier, Dominique; Vérin, Marc; Grandjean, Didier

    2015-02-01

    Subthalamic nucleus (STN) deep brain stimulation in Parkinson's disease induces modifications in the recognition of emotion from voices (or emotional prosody). Nevertheless, the underlying mechanisms are still only poorly understood, and the role of acoustic features in these deficits has yet to be elucidated. Our aim was to identify the influence of acoustic features on changes in emotional prosody recognition following STN stimulation in Parkinson's disease. To this end, we analysed the performances of patients on vocal emotion recognition in pre-versus post-operative groups, as well as of matched controls, entering the acoustic features of the stimuli into our statistical models. Analyses revealed that the post-operative biased ratings on the Fear scale when patients listened to happy stimuli were correlated with loudness, while the biased ratings on the Sadness scale when they listened to happiness were correlated with fundamental frequency (F0). Furthermore, disturbed ratings on the Happiness scale when the post-operative patients listened to sadness were found to be correlated with F0. These results suggest that inadequate use of acoustic features following subthalamic stimulation has a significant impact on emotional prosody recognition in patients with Parkinson's disease, affecting the extraction and integration of acoustic cues during emotion perception. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Reappraisal deficits promote craving and emotional distress among chronic pain patients at risk for prescription opioid misuse.

    Science.gov (United States)

    Garland, Eric L; Hanley, Adam W; Bedford, Carter E; Zubieta, Jon-Kar; Howard, Matthew O; Nakamura, Yoshio; Donaldson, Gary W; Froeliger, Brett

    2018-06-04

    A subset of chronic pain patients misuse prescription opioids as a means of regulating negative emotions. However, opioid misuse may result in deficits in emotion regulation strategies like reappraisal by virtue of the deleterious effects of chronic opioid exposure. The aim of this study was to characterize differences in reappraisal use among chronic pain patients at risk for opioid misuse and those who report taking opioids as prescribed. A sample of 127 pain patients receiving chronic opioid analgesic pharmacotherapy were classified as at risk for opioid misuse (n = 62) or taking opioids as prescribed (n = 65) using the Current Opioid Misuse Measure (COMM). The Emotion Regulation Questionnaire (ERQ) characterized use of emotion regulation strategies including reappraisal and expressive suppression. Participants also reported levels of opioid craving, emotional distress, and pain severity. Patients at risk for opioid misuse reported significantly less reappraisal use (M = 25.31, SD = 7.33) than those who reportedly took opioids as prescribed (M = 30.28, SD = 7.50), p < .001, but the groups did not differ in their use of suppression strategies. Reduced reappraisal use was associated with higher opioid craving and emotional distress, which mediated the association between reappraisal deficits and opioid misuse risk. Further, there was a significant indirect effect of opioid misuse on emotional distress via reappraisal use. Opioid misuse risk was associated with reduced use of reappraisal, which in turn was associated with dysregulated negative emotions and increased appetitive drive towards consuming opioids. Studying individual differences in emotion regulation may yield efficacious intervention and prevention approaches to stem the rising tide of the prescription opioid crisis.

  5. Deficits in recognition, identification, and discrimination of facial emotions in patients with bipolar disorder.

    Science.gov (United States)

    Benito, Adolfo; Lahera, Guillermo; Herrera, Sara; Muncharaz, Ramón; Benito, Guillermo; Fernández-Liria, Alberto; Montes, José Manuel

    2013-01-01

    To analyze the recognition, identification, and discrimination of facial emotions in a sample of outpatients with bipolar disorder (BD). Forty-four outpatients with diagnosis of BD and 48 matched control subjects were selected. Both groups were assessed with tests for recognition (Emotion Recognition-40 - ER40), identification (Facial Emotion Identification Test - FEIT), and discrimination (Facial Emotion Discrimination Test - FEDT) of facial emotions, as well as a theory of mind (ToM) verbal test (Hinting Task). Differences between groups were analyzed, controlling the influence of mild depressive and manic symptoms. Patients with BD scored significantly lower than controls on recognition (ER40), identification (FEIT), and discrimination (FEDT) of emotions. Regarding the verbal measure of ToM, a lower score was also observed in patients compared to controls. Patients with mild syndromal depressive symptoms obtained outcomes similar to patients in euthymia. A significant correlation between FEDT scores and global functioning (measured by the Functioning Assessment Short Test, FAST) was found. These results suggest that, even in euthymia, patients with BD experience deficits in recognition, identification, and discrimination of facial emotions, with potential functional implications.

  6. More than words (and faces): evidence for a Stroop effect of prosody in emotion word processing.

    Science.gov (United States)

    Filippi, Piera; Ocklenburg, Sebastian; Bowling, Daniel L; Heege, Larissa; Güntürkün, Onur; Newen, Albert; de Boer, Bart

    2017-08-01

    Humans typically combine linguistic and nonlinguistic information to comprehend emotions. We adopted an emotion identification Stroop task to investigate how different channels interact in emotion communication. In experiment 1, synonyms of "happy" and "sad" were spoken with happy and sad prosody. Participants had more difficulty ignoring prosody than ignoring verbal content. In experiment 2, synonyms of "happy" and "sad" were spoken with happy and sad prosody, while happy or sad faces were displayed. Accuracy was lower when two channels expressed an emotion that was incongruent with the channel participants had to focus on, compared with the cross-channel congruence condition. When participants were required to focus on verbal content, accuracy was significantly lower also when prosody was incongruent with verbal content and face. This suggests that prosody biases emotional verbal content processing, even when conflicting with verbal content and face simultaneously. Implications for multimodal communication and language evolution studies are discussed.

  7. Issues in Data Labelling

    NARCIS (Netherlands)

    Cowie, Roddy; Cox, Cate; Martin, Jean-Claude; Batliner, Anton; Heylen, Dirk K.J.; Karpouzis, Kostas; Cowie, Roddy; Pelachaud, Catherine; Petta, Paolo

    2011-01-01

    Labelling emotion databases is not a purely technical matter. It is bound up with theoretical issues. Different issues affect labelling of emotional content, labelling of the signs that convey emotion, and labelling of the relevant context. Linked to these are representational issues, involving time

  8. Reading emotions from faces in two indigenous societies.

    Science.gov (United States)

    Crivelli, Carlos; Jarillo, Sergio; Russell, James A; Fernández-Dols, José-Miguel

    2016-07-01

    That all humans recognize certain specific emotions from their facial expression-the Universality Thesis-is a pillar of research, theory, and application in the psychology of emotion. Its most rigorous test occurs in indigenous societies with limited contact with external cultural influences, but such tests are scarce. Here we report 2 such tests. Study 1 was of children and adolescents (N = 68; aged 6-16 years) of the Trobriand Islands (Papua New Guinea, South Pacific) with a Western control group from Spain (N = 113, of similar ages). Study 2 was of children and adolescents (N = 36; same age range) of Matemo Island (Mozambique, Africa). In both studies, participants were shown an array of prototypical facial expressions and asked to point to the person feeling a specific emotion: happiness, fear, anger, disgust, or sadness. The Spanish control group matched faces to emotions as predicted by the Universality Thesis: matching was seen on 83% to 100% of trials. For the indigenous societies, in both studies, the Universality Thesis was moderately supported for happiness: smiles were matched to happiness on 58% and 56% of trials, respectively. For other emotions, however, results were even more modest: 7% to 46% in the Trobriand Islands and 22% to 53% in Matemo Island. These results were robust across age, gender, static versus dynamic display of the facial expressions, and between- versus within-subjects design. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. Severity of the aggression/anxiety-depression/attention child behavior checklist profile discriminates between different levels of deficits in emotional regulation in youth with attention-deficit hyperactivity disorder.

    Science.gov (United States)

    Biederman, Joseph; Petty, Carter R; Day, Helen; Goldin, Rachel L; Spencer, Thomas; Faraone, Stephen V; Surman, Craig B H; Wozniak, Janet

    2012-04-01

    We examined whether severity scores (1 SD vs 2 SDs) of a unique profile of the Child Behavior Checklist (CBCL) consisting of the Anxiety/Depression, Aggression, and Attention (AAA) scales would help differentiate levels of deficits in children with attention-deficit hyperactivity disorder (ADHD). Subjects were 197 children with ADHD and 224 without ADHD. We defined deficient emotional self-regulation (DESR) as an aggregate cutoff score of >180 but siblings. In contrast, the CBCL-DESR was associated with higher rates of comorbid disruptive behavior, anxiety disorders, and impaired interpersonal functioning compared with other ADHD children. Severity scores of the AAA CBCL profiles can help distinguish 2 groups of emotional regulation problems in children with ADHD.

  10. ERP Correlates of Target-Distracter Differentiation in Repeated Runs of a Continuous Recognition Task with Emotional and Neutral Faces

    Science.gov (United States)

    Treese, Anne-Cecile; Johansson, Mikael; Lindgren, Magnus

    2010-01-01

    The emotional salience of faces has previously been shown to induce memory distortions in recognition memory tasks. This event-related potential (ERP) study used repeated runs of a continuous recognition task with emotional and neutral faces to investigate emotion-induced memory distortions. In the second and third runs, participants made more…

  11. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces

    Directory of Open Access Journals (Sweden)

    Arash Javanbakht

    2015-06-01

    Full Text Available Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. 52 subjects from a longitudinal prospective study, recruited as children, participated in a brain imaging study at 23-25 years of age using the Emotional Faces Assessment Task (EFAT). Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and mPFC responses to threat vs. happy faces. Also, childhood poverty was associated with decreased functional connectivity between the left amygdala and mPFC. This study is unique because it prospectively links childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues.

  12. Cultural in-group advantage: emotion recognition in African American and European American faces and voices.

    Science.gov (United States)

    Wickline, Virginia B; Bailey, Wendy; Nowicki, Stephen

    2009-03-01

    The authors explored whether there were in-group advantages in emotion recognition of faces and voices by culture or geographic region. Participants were 72 African American students (33 men, 39 women), 102 European American students (30 men, 72 women), 30 African international students (16 men, 14 women), and 30 European international students (15 men, 15 women). The participants determined emotions in African American and European American faces and voices. Results showed an in-group advantage (sometimes by culture, less often by race) in recognizing facial and vocal emotional expressions. African international students were generally less accurate at interpreting American nonverbal stimuli than were European American, African American, and European international peers. Results suggest that, although partly universal, emotional expressions have subtle differences across cultures that persons must learn.

  13. Steroids facing emotions

    NARCIS (Netherlands)

    Putman, P.L.J.

    2006-01-01

    The studies reported in this thesis have been performed to gain a better understanding about motivational mediators of selective attention and memory for emotionally relevant stimuli, and about the roles that some steroid hormones play in regulation of human motivation and emotion. The stimuli used

  14. Spatiotemporal brain dynamics of emotional face processing modulations induced by the serotonin 1A/2A receptor agonist psilocybin.

    Science.gov (United States)

    Bernasconi, Fosco; Schmidt, André; Pokorny, Thomas; Kometer, Michael; Seifritz, Erich; Vollenweider, Franz X

    2014-12-01

    Emotional face processing is critically modulated by the serotonergic system. For instance, emotional face processing is impaired by acute psilocybin administration, a serotonin (5-HT) 1A and 2A receptor agonist. However, the spatiotemporal brain mechanisms underlying these modulations are poorly understood. Here, we investigated the spatiotemporal brain dynamics underlying psilocybin-induced modulations during emotional face processing. Electrical neuroimaging analyses were applied to visual evoked potentials in response to emotional faces, following psilocybin and placebo administration. Our results indicate a first time period of strength (i.e., Global Field Power) modulation over the 168-189 ms poststimulus interval, induced by psilocybin. A second time period of strength modulation was identified over the 211-242 ms poststimulus interval. Source estimations over these 2 time periods further revealed decreased activity in response to both neutral and fearful faces within limbic areas, including amygdala and parahippocampal gyrus, and the right temporal cortex over the 168-189 ms interval, and reduced activity in response to happy faces within limbic and right temporo-occipital brain areas over the 211-242 ms interval. Our results indicate a selective and temporally dissociable effect of psilocybin on the neuronal correlates of emotional face processing, consistent with a modulation of the top-down control. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
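
    To make the "strength" measure concrete: Global Field Power (GFP) at each time point is simply the spatial standard deviation of the average-referenced signal across all sensors. The sketch below computes GFP on simulated data; the channel count, sampling assumptions, and analysis window are illustrative only and are not taken from the authors' pipeline.

```python
# Hedged sketch: Global Field Power (GFP) as the spatial standard deviation of
# the average-referenced signal across channels at each time point, computed on
# simulated evoked data. Not the authors' analysis code.
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_times = 64, 500                    # hypothetical sensor layout and samples
data = rng.normal(size=(n_channels, n_times))    # placeholder evoked responses

data_avg_ref = data - data.mean(axis=0, keepdims=True)  # re-reference to the average
gfp = data_avg_ref.std(axis=0)                           # GFP per time point

# Condition differences (e.g., psilocybin vs. placebo) are then tested on GFP
# within defined post-stimulus windows; the indices below assume, purely for
# illustration, 1 kHz sampling starting at stimulus onset.
window = slice(168, 190)                          # roughly the 168-189 ms interval
print(f"mean GFP in window: {gfp[window].mean():.3f}")
```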

  15. Cognitive emotion regulation in children: Reappraisal of emotional faces modulates neural source activity in a frontoparietal network.

    Science.gov (United States)

    Wessing, Ida; Rehbein, Maimu A; Romer, Georg; Achtergarde, Sandra; Dobel, Christian; Zwitserlood, Pienie; Fürniss, Tilman; Junghöfer, Markus

    2015-06-01

    Emotion regulation has an important role in child development and psychopathology. Reappraisal, as a cognitive regulation technique, can be used effectively by children. Moreover, an ERP component known to reflect emotional processing, the late positive potential (LPP), can be modulated by children using reappraisal, and this modulation is also related to children's emotional adjustment. The present study seeks to elucidate the neural generators of such LPP effects. To this end, children aged 8-14 years reappraised emotional faces, while neural activity in an LPP time window was estimated using magnetoencephalography-based source localization. Additionally, neural activity was correlated with two indexes of emotional adjustment and age. Reappraisal reduced activity in the left dorsolateral prefrontal cortex during down-regulation and enhanced activity in the right parietal cortex during up-regulation. Activity in the visual cortex decreased with increasing age, more adaptive emotion regulation and less anxiety. Results demonstrate that reappraisal changed activity within a frontoparietal network in children. Decreasing activity in the visual cortex with increasing age is suggested to reflect neural maturation. A similar decrease with adaptive emotion regulation and less anxiety implies that better emotional adjustment may be associated with an advance in neural maturation. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Is fear in your head? A comparison of instructed and real-life expressions of emotion in the face and body.

    Science.gov (United States)

    Abramson, Lior; Marom, Inbal; Petranker, Rotem; Aviezer, Hillel

    2017-04-01

    The majority of emotion perception studies utilize instructed and stereotypical expressions of faces or bodies. While such stimuli are highly standardized and well-recognized, their resemblance to real-life expressions of emotion remains unknown. Here we examined facial and body expressions of fear and anger during real-life situations and compared their recognition to that of instructed expressions of the same emotions. In order to examine the source of the affective signal, expressions of emotion were presented as faces alone, bodies alone, and naturally, as faces with bodies. The results demonstrated striking deviations between recognition of instructed and real-life stimuli, which differed as a function of the emotion expressed. In real-life fearful expressions of emotion, bodies were far better recognized than faces, a pattern not found with instructed expressions of emotion. Anger reactions were better recognized from the body than from the face in both real-life and instructed stimuli. However, the real-life stimuli were overall better recognized than their instructed counterparts. These results indicate that differences between instructed and real-life expressions of emotion are prevalent and raise caution against an overreliance of researchers on instructed affective stimuli. The findings also demonstrate that in real life, facial expression perception may rely heavily on information from the contextualizing body. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Facial responsiveness of psychopaths to the emotional expressions of others.

    Directory of Open Access Journals (Sweden)

    Janina Künecke

    Full Text Available Psychopathic individuals show selfish, manipulative, and antisocial behavior in addition to emotional detachment and reduced empathy. Their empathic deficits are thought to be associated with a reduced responsiveness to emotional stimuli. Immediate facial muscle responses to the emotional expressions of others reflect the expressive part of emotional responsiveness and are positively related to trait empathy. Empirical evidence for reduced facial muscle responses in adult psychopathic individuals to the emotional expressions of others is rare. In the present study, 261 male criminal offenders and non-offenders categorized dynamically presented facial emotion expressions (angry, happy, sad, and neutral) during facial electromyography recording of their corrugator muscle activity. We replicated a measurement model of facial muscle activity, which controls for general facial responsiveness to face stimuli, and modeled three correlated emotion-specific factors (i.e., anger, happiness, and sadness) representing emotion-specific activity. In a multi-group confirmatory factor analysis, we compared the means of the anger, happiness, and sadness latent factors between three groups: 1) non-offenders, 2) low, and 3) high psychopathic offenders. There were no significant mean differences between groups. Our results challenge current theories that focus on deficits in emotional responsiveness as leading to the development of psychopathy and encourage further theoretical development on deviant emotional processes in psychopathic individuals.

  18. The Perception of Time While Perceiving Dynamic Emotional Faces

    Directory of Open Access Journals (Sweden)

    Wang On Li

    2015-08-01

    Full Text Available Emotion plays an essential role in the perception of time, such that time is perceived to fly when events are enjoyable, while unenjoyable moments are perceived to drag. Previous studies have reported a time-drag effect when participants are presented with emotional facial expressions, regardless of the emotion presented. This effect can hardly be explained by induced emotion given the heterogeneous nature of emotional expressions. We conducted two experiments (n = 44 and n = 39) to examine the cognitive mechanism underlying this effect by presenting dynamic sequences of emotional expressions to participants. Each sequence started with a particular expression, then morphed to another. The presentation of dynamic facial expressions allows a comparison of the time-drag effect between homogeneous pairs of emotional expressions sharing similar valence and arousal and heterogeneous pairs. Sequences of seven durations (400 ms, 600 ms, 800 ms, 1,000 ms, 1,200 ms, 1,400 ms, 1,600 ms) were presented to participants, who were asked to judge whether the sequences were closer to 400 ms or 1,600 ms in a two-alternative forced choice task. The data were then collated according to conditions and fitted with cumulative Gaussian curves to estimate the point of subjective equivalence, indicating the perceived duration of 1,000 ms. Consistent with previous reports, a feeling of time dragging is induced regardless of the sequence presented, such that a 1,000 ms sequence is perceived as lasting longer than 1,000 ms. In addition, dynamic facial expressions exert a greater effect on perceived time drag than static expressions. The effect is most prominent when the dynamics involve an angry face or a change in valence. The significance of this sensitivity is discussed in terms of emotion perception and its evolutionary significance for our attention mechanism.
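
    As a rough illustration of the psychometric analysis described above, the sketch below fits a cumulative Gaussian to hypothetical proportions of "closer to 1,600 ms" responses and reads off the point of subjective equivalence (PSE); the response proportions and starting values are invented, and this is not the authors' analysis code.

```python
# Hedged sketch: estimating a point of subjective equivalence (PSE) from
# two-alternative forced-choice duration judgments, using hypothetical
# response proportions. Not the authors' analysis code.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])  # ms
# Hypothetical proportion of "closer to 1,600 ms" responses per duration.
p_long = np.array([0.05, 0.15, 0.35, 0.60, 0.80, 0.92, 0.97])

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian psychometric function."""
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(cum_gauss, durations, p_long, p0=[1000, 200])
# mu is the PSE: the physical duration judged "long" on 50% of trials.
# A PSE below 1,000 ms would indicate that durations feel longer than they are.
print(f"PSE = {mu:.0f} ms, slope (sigma) = {sigma:.0f} ms")
```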

  19. Emotion Regulation Difficulties in Boys with Oppositional Defiant Disorder/Conduct Disorder and the Relation with Comorbid Autism Traits and Attention Deficit Traits.

    Directory of Open Access Journals (Sweden)

    Jantiene Schoorl

    Full Text Available Previous research has pointed towards a link between emotion dysregulation and aggressive behavior in children. Emotion regulation difficulties are not specific for children with persistent aggression problems, i.e. oppositional defiant disorder or conduct disorder (ODD/CD), children with other psychiatric conditions, such as autism spectrum disorders or attention-deficit/hyperactivity disorder, have emotion regulation difficulties too. On a behavioral level some overlap exists between these disorders and comorbidity is high. The aim of this study was therefore twofold: 1) to examine emotion regulation difficulties in 65 boys with ODD/CD in comparison to a non-clinical control group (NC) of 38 boys (8-12 years) using a performance measure (Ultimatum Game), parent report and self-report, and 2) to establish to what extent emotion regulation in the ODD/CD group was correlated with severity of autism and/or attention deficit traits. Results on the Ultimatum Game showed that the ODD/CD group rejected more ambiguous offers than the NC group, which is seen as an indication of poor emotion regulation. Parents also reported that the ODD/CD group experienced more emotion regulation problems in daily life than the NC group. In contrast to these cognitive and behavioral measures, self-reports did not reveal any difference, indicating that boys with ODD/CD do not perceive themselves as having impairments in regulating their emotions. Emotional decision making within the ODD/CD group was not related to variation in autism or attention deficit traits. These results support the idea that emotion dysregulation is an important problem within ODD/CD, yet boys with ODD/CD have reduced awareness of this.

  20. Emotion Regulation Difficulties in Boys with Oppositional Defiant Disorder/Conduct Disorder and the Relation with Comorbid Autism Traits and Attention Deficit Traits.

    Science.gov (United States)

    Schoorl, Jantiene; van Rijn, Sophie; de Wied, Minet; van Goozen, Stephanie; Swaab, Hanna

    2016-01-01

    Previous research has pointed towards a link between emotion dysregulation and aggressive behavior in children. Emotion regulation difficulties are not specific for children with persistent aggression problems, i.e. oppositional defiant disorder or conduct disorder (ODD/CD), children with other psychiatric conditions, such as autism spectrum disorders or attention-deficit/hyperactivity disorder, have emotion regulation difficulties too. On a behavioral level some overlap exists between these disorders and comorbidity is high. The aim of this study was therefore twofold: 1) to examine emotion regulation difficulties in 65 boys with ODD/CD in comparison to a non-clinical control group (NC) of 38 boys (8-12 years) using a performance measure (Ultimatum Game), parent report and self-report, and 2) to establish to what extent emotion regulation in the ODD/CD group was correlated with severity of autism and/or attention deficit traits. Results on the Ultimatum Game showed that the ODD/CD group rejected more ambiguous offers than the NC group, which is seen as an indication of poor emotion regulation. Parents also reported that the ODD/CD group experienced more emotion regulation problems in daily life than the NC group. In contrast to these cognitive and behavioral measures, self-reports did not reveal any difference, indicating that boys with ODD/CD do not perceive themselves as having impairments in regulating their emotions. Emotional decision making within the ODD/CD group was not related to variation in autism or attention deficit traits. These results support the idea that emotion dysregulation is an important problem within ODD/CD, yet boys with ODD/CD have reduced awareness of this.

  1. Callousness and affective face processing in adults: Behavioral and brain-potential indicators.

    Science.gov (United States)

    Brislin, Sarah J; Yancey, James R; Perkins, Emily R; Palumbo, Isabella M; Drislane, Laura E; Salekin, Randall T; Fanti, Kostas A; Kimonis, Eva R; Frick, Paul J; Blair, R James R; Patrick, Christopher J

    2018-03-01

    The investigation of callous-unemotional (CU) traits has been central to contemporary research on child behavior problems, and served as the impetus for inclusion of a specifier for conduct disorder in the latest edition of the official psychiatric diagnostic system. Here, we report results from 2 studies that evaluated the construct validity of callousness as assessed in adults, by testing for affiliated deficits in behavioral and neural processing of fearful faces, as have been shown in youthful samples. We hypothesized that scores on an established measure of callousness would predict reduced recognition accuracy and diminished electrocortical reactivity for fearful faces in adult participants. In Study 1, 66 undergraduate participants performed an emotion recognition task in which they viewed affective faces of different types and indicated the emotion expressed by each. In Study 2, electrocortical data were collected from 254 adult twins during viewing of fearful and neutral face stimuli, and scored for event-related response components. Analyses of Study 1 data revealed that higher callousness was associated with decreased recognition accuracy for fearful faces specifically. In Study 2, callousness was associated with reduced amplitude of both N170 and P200 responses to fearful faces. Current findings demonstrate for the first time that callousness in adults is associated with both behavioral and physiological deficits in the processing of fearful faces. These findings support the validity of the CU construct with adults and highlight the possibility of a multidomain measurement framework for continued study of this important clinical construct. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. Ratings of Emotion in Laterally Presented Faces: Sex and handedness effects

    NARCIS (Netherlands)

    van Strien, J.W.; van Beek, S.

    2000-01-01

    Sixteen right-handed participants (8 male and 8 female students) and 16 left-handed participants (8 male and 8 female students) were presented with cartoon faces expressing emotions ranging from extremely positive to extremely negative. A forced-choice paradigm was used in which the participants

  3. Public Higher-Education Systems Face Painful Choices as Three Northeastern States Confront Massive Deficits.

    Science.gov (United States)

    Blumenstyk, Goldie

    1989-01-01

    Massachusetts, Connecticut, and New York face giant deficits in their state budgets. The financial impact of the 1986 federal tax reform law was underestimated by colleges and income estimates were overly optimistic for 1988 and 1989. Unpopular, new taxes are seen as the way to solve the budget crunch. (MLW)

  4. Cyber Victimization in High School: Measurement, Overlap with Face-to-Face Victimization, and Associations with Social-Emotional Outcomes

    Science.gov (United States)

    Brown, Christina Flynn; Demaray, Michelle Kilpatrick; Tennant, Jaclyn E.; Jenkins, Lyndsay N.

    2017-01-01

    Cyber victimization is a contemporary problem facing youth and adolescents (Diamanduros, Downs, & Jenkins, 2008; Kowalski & Limber, 2007). It is imperative for researchers and school personnel to understand the associations between cyber victimization and student social-emotional outcomes. This article explores (a) gender differences in…

  5. P2-27: Electrophysiological Correlates of Conscious and Unconscious Processing of Emotional Faces in Individuals with High and Low Autistic Traits

    Directory of Open Access Journals (Sweden)

    Svjetlana Vukusic

    2012-10-01

    Full Text Available LeDoux (1996, The Emotional Brain) has suggested that subconscious presentation of fearful emotional information is relayed to the amygdala along a rapid subcortical route. Rapid emotion processing is important because it alerts other parts of the brain to emotionally salient information. It also produces immediate reflexive responses to threatening stimuli, in comparison to slower conscious appraisal, which is of important adaptive survival value. Current theoretical models of autism spectrum disorders (ASD) have linked impairments in the processing of emotional information to amygdala dysfunction. It can be suggested that the impairment in face processing found in autism may be the result of impaired rapid subconscious processing of emotional information, which does not make faces socially salient. Previous studies examined subconscious processing of emotional stimuli with backward masking paradigms, using very brief presentations of emotional face stimuli followed by a mask. We used an event-related potential (ERP) study within a backward masking paradigm with subjects with low and high autistic tendencies as measured by the Autism Spectrum Quotient (AQ) questionnaire. The time course of processing of fearful and happy facial expressions and an emotionally neutral face was investigated during subliminal (16 ms) and supraliminal (166 ms) stimulus presentation. The task consisted of an explicit categorization of emotional and neutral faces. We looked at the ERP components N2, P3a, and also N170 for differences between the low and high AQ groups (cutoff of 19).

  6. Not just fear and sadness: meta-analytic evidence of pervasive emotion recognition deficits for facial and vocal expressions in psychopathy.

    Science.gov (United States)

    Dawel, Amy; O'Kearney, Richard; McKone, Elinor; Palermo, Romina

    2012-11-01

    The present meta-analysis aimed to clarify whether deficits in emotion recognition in psychopathy are restricted to certain emotions and modalities or whether they are more pervasive. We also attempted to assess the influence of other important variables: age, and the affective factor of psychopathy. A systematic search of electronic databases and a subsequent manual search identified 26 studies that included 29 experiments (N = 1376) involving six emotion categories (anger, disgust, fear, happiness, sadness, surprise) across three modalities (facial, vocal, postural). Meta-analyses found evidence of pervasive impairments across modalities (facial and vocal) with significant deficits evident for several emotions (i.e., not only fear and sadness) in both adults and children/adolescents. These results are consistent with recent theorizing that the amygdala, which is believed to be dysfunctional in psychopathy, has a broad role in emotion processing. We discuss limitations of the available data that restrict the ability of meta-analysis to consider the influence of age and separate the sub-factors of psychopathy, highlighting important directions for future research. Copyright © 2012 Elsevier Ltd. All rights reserved.
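
    For readers unfamiliar with how such pooled estimates are derived, the sketch below shows a standard DerSimonian-Laird random-effects pooling of per-study effect sizes; the effect sizes and variances are hypothetical placeholders, not values taken from this meta-analysis.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of hypothetical
# per-study effect sizes (e.g., Hedges' g for a fear-recognition deficit).
# Illustrative only; not the effect sizes reported in the meta-analysis.
import numpy as np

g = np.array([0.45, 0.30, 0.60, 0.20, 0.55])   # hypothetical study effect sizes
v = np.array([0.04, 0.06, 0.05, 0.03, 0.07])   # hypothetical sampling variances

w_fixed = 1.0 / v
g_fixed = np.sum(w_fixed * g) / np.sum(w_fixed)
q = np.sum(w_fixed * (g - g_fixed) ** 2)        # heterogeneity statistic Q
df = len(g) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)                   # between-study variance estimate

w_random = 1.0 / (v + tau2)
g_pooled = np.sum(w_random * g) / np.sum(w_random)
se_pooled = np.sqrt(1.0 / np.sum(w_random))
print(f"pooled g = {g_pooled:.2f} "
      f"(95% CI {g_pooled - 1.96 * se_pooled:.2f} to {g_pooled + 1.96 * se_pooled:.2f}), "
      f"tau^2 = {tau2:.3f}")
```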

  7. The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects

    Science.gov (United States)

    Barber, Anjuli L. A.; Randi, Dania; Müller, Corsin A.; Huber, Ludwig

    2016-01-01

    Of all non-human animals, dogs are very likely the best decoders of human behavior. In addition to a high sensitivity to human attentive status and to ostensive cues, they are able to distinguish between individual human faces and even between human facial expressions. However, so far little is known about how they process human faces and to what extent this is influenced by experience. Here we present an eye-tracking study with dogs from two different living environments and with varying experience with humans: pet and lab dogs. The dogs were shown pictures of familiar and unfamiliar human faces expressing four different emotions. The results, extracted from several different eye-tracking measurements, revealed pronounced differences in the face processing of pet and lab dogs, thus indicating an influence of the amount of exposure to humans. In addition, there was some evidence for the influence of both the familiarity and the emotional expression of the face, and strong evidence for a left gaze bias. These findings, together with recent evidence for the dog's ability to discriminate human facial expressions, indicate that dogs are sensitive to some emotions expressed in human faces. PMID:27074009

  8. Music to my ears: Age-related decline in musical and facial emotion recognition.

    Science.gov (United States)

    Sutcliffe, Ryan; Rendell, Peter G; Henry, Julie D; Bailey, Phoebe E; Ruffman, Ted

    2017-12-01

    We investigated young-old differences in emotion recognition using music and face stimuli and tested explanatory hypotheses regarding older adults' typically worse emotion recognition. In Experiment 1, young and older adults labeled emotions in an established set of faces, and in classical piano stimuli that we pilot-tested on other young and older adults. Older adults were worse at detecting anger, sadness, fear, and happiness in music. Performance on the music and face emotion tasks was not correlated for either age group. Because musical expressions of fear were not equated for age groups in the pilot study of Experiment 1, we conducted a second experiment in which we created a novel set of music stimuli that included more accessible musical styles, and which we again pilot-tested on young and older adults. In this pilot study, all musical emotions were identified similarly by young and older adults. In Experiment 2, participants also made age estimations in another set of faces to examine whether potential relations between the face and music emotion tasks would be shared with the age estimation task. Older adults did worse in each of the tasks, and had specific difficulty recognizing happy, sad, peaceful, angry, and fearful music clips. Older adults' difficulties in each of the 3 tasks-music emotion, face emotion, and face age-were not correlated with each other. General cognitive decline did not appear to explain our results as increasing age predicted emotion performance even after fluid IQ was controlled for within the older adult group. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
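
    A minimal sketch of the kind of control analysis mentioned above (age predicting emotion performance after fluid IQ is accounted for), implemented here as a partial correlation on simulated data; the variable names, sample size, and effect sizes are invented for illustration and do not come from this study.

```python
# Hedged sketch: partial correlation between age and emotion-recognition
# accuracy controlling for fluid IQ, on simulated older-adult data.
# Not the authors' dataset or analysis code.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 60
iq = rng.normal(100, 15, n)                       # simulated fluid IQ scores
age = rng.uniform(60, 85, n)                      # simulated older-adult ages
accuracy = (0.9 - 0.004 * (age - 60)              # accuracy declines with age...
            + 0.001 * (iq - 100)                  # ...and rises slightly with IQ
            + rng.normal(0, 0.03, n))             # plus noise

def residualize(y, x):
    """Residuals of y after removing a linear effect of x."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Correlating the IQ-residualized variables gives the partial correlation.
r, p = pearsonr(residualize(accuracy, iq), residualize(age, iq))
print(f"partial r(age, accuracy | IQ) = {r:.2f}, p = {p:.3f}")
```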

  9. An fMRI study of facial emotion processing in patients with schizophrenia.

    Science.gov (United States)

    Gur, Raquel E; McGrath, Claire; Chan, Robin M; Schroeder, Lee; Turner, Travis; Turetsky, Bruce I; Kohler, Christian; Alsop, David; Maldjian, Joseph; Ragland, J Daniel; Gur, Ruben C

    2002-12-01

    Emotion processing deficits are notable in schizophrenia. The authors evaluated cerebral blood flow response in schizophrenia patients during facial emotion processing to test the hypothesis of diminished limbic activation related to emotional relevance of facial stimuli. Fourteen patients with schizophrenia and 14 matched comparison subjects viewed facial displays of happiness, sadness, anger, fear, and disgust as well as neutral faces. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as the subjects alternated between tasks of discriminating emotional valence (positive versus negative) and age (over 30 versus under 30) of the faces with an interleaved crosshair reference condition. The groups did not differ in performance on either task. For both tasks, healthy participants showed activation in the fusiform gyrus, occipital lobe, and inferior frontal cortex relative to the resting baseline condition. The increase was greater in the amygdala and hippocampus during the emotional valence discrimination task than during the age discrimination task. In the patients with schizophrenia, minimal focal response was observed for all tasks relative to the resting baseline condition. Contrasting patients and comparison subjects on the emotional valence discrimination task revealed voxels in the left amygdala and bilateral hippocampus in which the comparison subjects had significantly greater activation. Failure to activate limbic regions during emotional valence discrimination may explain emotion processing deficits in patients with schizophrenia. While the lack of limbic recruitment did not significantly impair simple valence discrimination performance in this clinically stable group, it may impact performance of more demanding tasks.

  10. Associations Between Childhood Abuse, Posttraumatic Stress Disorder, and Implicit Emotion Regulation Deficits: Evidence From a Low-Income, Inner-City Population.

    Science.gov (United States)

    Powers, Abigail; Etkin, Amit; Gyurak, Anett; Bradley, Bekh; Jovanovic, Tanja

    2015-01-01

    Childhood abuse is associated with a wide range of negative outcomes, including increased risk for development of emotion dysregulation and psychopathology, such as posttraumatic stress disorder (PTSD). The goal of the present study was to examine associations between child abuse, PTSD symptoms, and performance on an emotional conflict regulation task that assesses implicit emotion regulation abilities. The sample consisted of 67 (94% African American) females recruited from a public, urban hospital. Childhood abuse was measured using the Childhood Trauma Questionnaire, and PTSD was measured using the modified PTSD Symptom Scale. Task accuracy and implicit emotion regulation were measured through an emotional conflict regulation behavioral task. A multivariate analysis of covariance showed that exposure to moderate to severe childhood abuse was significantly related to worse emotional conflict regulation scores independent of current PTSD symptoms, depressive symptoms, and adult trauma exposure, suggesting a deficit in implicit emotion regulation. We also found an interaction between PTSD symptoms and abuse exposure in predicting accuracy on the behavioral task; high levels of PTSD symptoms were associated with poorer task accuracy among individuals who reported moderate to severe exposure to childhood abuse. However, no relationship between implicit emotion regulation abilities and overall PTSD symptom severity was found. This study provides preliminary evidence of an implicit emotion regulation deficit for individuals exposed to significant childhood abuse and further supports the growing evidence that addressing various aspects of emotion dysregulation, such as awareness of emotions and strategies to manage strong emotions, in the context of treatment would be valuable.

  11. The NMDA antagonist ketamine and the 5-HT agonist psilocybin produce dissociable effects on structural encoding of emotional face expressions.

    Science.gov (United States)

    Schmidt, André; Kometer, Michael; Bachmann, Rosilla; Seifritz, Erich; Vollenweider, Franz

    2013-01-01

    Both glutamate and serotonin (5-HT) play a key role in the pathophysiology of emotional biases. Recent studies indicate that the glutamate N-methyl-D-aspartate (NMDA) receptor antagonist ketamine and the 5-HT receptor agonist psilocybin are implicated in emotion processing. However, as yet, no study has systematically compared their contribution to emotional biases. This study used event-related potentials (ERPs) and signal detection theory to compare the effects of the NMDA (via S-ketamine) and 5-HT (via psilocybin) receptor systems on non-conscious or conscious emotional face processing biases. S-ketamine or psilocybin was administered to two groups of healthy subjects in a double-blind, within-subject, placebo-controlled design. We behaviorally assessed objective thresholds for non-conscious discrimination in all drug conditions. Electrophysiological responses to fearful, happy, and neutral faces were subsequently recorded, focusing on the face-sensitive P100 and N170 ERP components. Both S-ketamine and psilocybin impaired the encoding of fearful faces, as expressed by a reduced N170 over parieto-occipital brain regions. In contrast, while S-ketamine also impaired the encoding of happy facial expressions, psilocybin had no effect on the N170 in response to happy faces. This study demonstrates that the NMDA and 5-HT receptor systems differentially contribute to the structural encoding of emotional face expressions as expressed by the N170. These findings suggest that the assessment of early visual evoked responses might allow the detection of pharmacologically induced changes in emotional processing biases and thus provides a framework for studying the pathophysiology of dysfunctional emotional biases.
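
    To make the signal detection approach mentioned above concrete, discrimination performance of this kind is commonly summarized with the sensitivity index d′, computed from hit and false-alarm rates. The sketch below is purely illustrative (it is not the authors' code, and the log-linear correction constant is an assumption):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity index d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (+0.5 per cell) keeps rates away from 0 and 1,
    where the z-transform is undefined.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Example: 38 hits, 12 misses, 9 false alarms, 41 correct rejections
print(round(d_prime(38, 12, 9, 41), 2))
```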

  12. The recognition of emotional expression in prosopagnosia: decoding whole and part faces.

    Science.gov (United States)

    Stephan, Blossom Christa Maree; Breen, Nora; Caine, Diana

    2006-11-01

    Prosopagnosia is currently viewed within the constraints of two competing theories of face recognition, one highlighting the analysis of features, the other focusing on configural processing of the whole face. This study investigated the role of feature analysis versus whole-face configural processing in the recognition of facial expression. A prosopagnosic patient, SC, made expression decisions from whole and incomplete (eyes-only and mouth-only) faces in which features had been obscured. SC was impaired at recognizing some (e.g., anger, sadness, and fear), but not all (e.g., happiness), emotional expressions from the whole face. Analyses of his performance on incomplete faces indicated that his recognition of some expressions actually improved relative to his performance in the whole-face condition. We argue that in SC, interference from damaged configural processes seems to override an intact ability to utilize part-based or local feature cues.

  13. Dissociation in Rating Negative Facial Emotions between Behavioral Variant Frontotemporal Dementia and Major Depressive Disorder.

    Science.gov (United States)

    Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    2016-11-01

    Features of behavioral variant frontotemporal dementia (bvFTD) such as executive dysfunction, apathy, and impaired empathic abilities are also observed in major depressive disorder (MDD). This may help explain why early-stage bvFTD is often misdiagnosed as MDD. New assessment tools are thus needed to improve early diagnosis of bvFTD. Although emotion processing is affected in both bvFTD and MDD, growing evidence indicates that the pattern of emotion processing deficits varies between the two disorders. As such, emotion processing paradigms have substantial potential to distinguish bvFTD from MDD. The current study compared 25 patients with bvFTD, 21 patients with MDD, 21 patients with Alzheimer disease (AD) dementia, and 31 healthy participants on a novel facial emotion intensity rating task. Stimuli comprised morphed faces from the Ekman and Friesen stimulus set containing faces of each sex with two different degrees of emotion intensity for each of the six basic emotions. Analyses of covariance uncovered a significant dissociation between bvFTD and MDD patients in rating the intensity of negative emotions overall (i.e., bvFTD patients underrated negative emotions overall, whereas MDD patients overrated negative emotions overall compared with healthy participants). In contrast, AD dementia patients rated negative emotions similarly to healthy participants, suggesting no impact of cognitive deficits on rating facial emotions. By strongly differentiating bvFTD and MDD patients through negative facial emotions, this sensitive and short rating task might help improve the early diagnosis of bvFTD. Copyright © 2016 American Association for Geriatric Psychiatry. All rights reserved.

  14. Examining the interplay among negative emotionality, cognitive functioning, and attention deficit/hyperactivity disorder symptom severity.

    Science.gov (United States)

    Healey, Dione M; Marks, David J; Halperin, Jeffrey M

    2011-05-01

    Cognition and emotion, traditionally thought of as largely distinct, have recently begun to be conceptualized as dynamically linked processes that interact to influence functioning. This study investigated the moderating effects of cognitive functioning on the relationship between negative emotionality and attention deficit/hyperactivity disorder (ADHD) symptom severity. A total of 216 (140 hyperactive/inattentive; 76 typically developing) preschoolers aged 3-4 years were administered a neuropsychological test battery (i.e., NEPSY). To avoid method bias, child negative emotionality was rated by teachers (Temperament Assessment Battery for Children-Revised), and parents rated symptom severity on the ADHD Rating Scale (ADHD-RS-IV). Hierarchical Linear Regression analyses revealed that both negative emotionality and Perceptual-Motor & Executive Functions accounted for significant unique variance in ADHD symptom severity. Significant interactions indicated that when negative emotionality is low, but not high, neuropsychological functioning accounts for significant variability in ADHD symptoms, with lower functioning predicting more symptoms. Emotional and neuropsychological functioning, both individually and in combination, play a significant role in the expression of ADHD symptom severity.

  15. Impact of emotional salience on episodic memory in attention-deficit/hyperactivity disorder: a functional magnetic resonance imaging study.

    Science.gov (United States)

    Krauel, Kerstin; Duzel, Emrah; Hinrichs, Hermann; Santel, Stephanie; Rellum, Thomas; Baving, Lioba

    2007-06-15

    Patients with attention-deficit/hyperactivity disorder (ADHD) show episodic memory deficits, especially in complex memory tasks. We investigated the neural correlates of memory formation in ADHD and their modulation by stimulus salience. We recorded event-related functional magnetic resonance imaging during an episodic memory paradigm with neutral and emotional pictures in 12 male ADHD subjects and 12 healthy adolescents. Emotional salience significantly augmented memory performance in ADHD patients. Successful encoding of neutral pictures was associated with activation of the anterior cingulate cortex (ACC) in healthy adolescents but with activation of the superior parietal lobe (SPL) and precuneus in ADHD patients. Successful encoding of emotional pictures was associated with prefrontal and inferior temporal cortex activation in both groups. Healthy adolescents, moreover, showed deactivation in the inferior parietal lobe. From a pathophysiological point of view, the most striking functional differences between healthy adolescents and ADHD patients were in the ACC and SPL. We suggest that increased SPL activation in ADHD reflected attentional compensation for low ACC activation during the encoding of neutral pictures. The higher salience of emotional stimuli, in contrast, appeared to regulate the interplay between ACC and SPL and improved memory to the level of healthy adolescents.

  16. Emotional Intelligence deficits in schizophrenia: The impact of non-social cognition.

    Science.gov (United States)

    Frajo-Apor, Beatrice; Pardeller, Silvia; Kemmler, Georg; Welte, Anna-Sophia; Hofer, Alex

    2016-04-01

    Previous studies using the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) revealed significant performance deficits across all areas of Emotional Intelligence (EI) in schizophrenia patients compared to healthy controls. However, none of these studies has investigated a potential influence of non-social cognition on these findings. 56 schizophrenia outpatients and 84 control subjects were investigated using the MSCEIT and the Brief Assessment of Cognition in Schizophrenia (BACS). Analyses of covariance were performed with adjustment for the BACS composite score and education. To investigate this issue in more detail, a mediation analysis was conducted. Patients showed significantly lower EI and non-social cognition levels compared to healthy controls. After adjustment for BACS composite score and education, only the group difference in the "managing emotions" branch and thus in the "strategic" EI part of the MSCEIT remained statistically significant, whereas for all other MSCEIT branches (perceiving, using, understanding emotions) statistical significance was lost. The mediation analysis revealed that the difference between schizophrenia patients and controls regarding the MSCEIT total score was almost fully attributable to the mediating effect of non-social cognition. Our findings suggest that in schizophrenia patients EI is largely influenced by non-social cognitive functioning. Only the "managing emotions" branch was found to be independent of non-social cognition. Consequently, non-social cognitive performance was mainly responsible for the observed differences in EI between schizophrenia patients and controls. This has to be taken into account when interpreting MSCEIT data in this population. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Dysregulation in cortical reactivity to emotional faces in PTSD patients with high dissociation symptoms

    Directory of Open Access Journals (Sweden)

    Aleksandra Klimova

    2013-09-01

    Full Text Available Background: Predominant dissociation in posttraumatic stress disorder (PTSD) is characterized by restricted affective responses to positive stimuli. To date, no studies have examined neural responses to a range of emotional expressions in PTSD with high dissociative symptoms. Objective: This study tested the hypothesis that PTSD patients with high dissociative symptoms would display increased event-related potential (ERP) amplitudes in early components (N1, P1) to threatening faces (angry, fearful), and reduced later ERP amplitudes (vertex positive potential (VPP), P3) to happy faces, compared to PTSD patients with low dissociative symptoms. Methods: Thirty-nine civilians with PTSD were classified as high dissociative (n=16) or low dissociative (n=23) according to their responses on the Clinician Administered Dissociative States Scale. ERPs were recorded whilst participants viewed emotional (happy, angry, fearful) and neutral facial expressions in a passive viewing task. Results: High dissociative PTSD patients displayed significantly increased N120 amplitude to the majority of facial expressions (neutral, happy, and angry) compared to low dissociative PTSD patients under conscious and preconscious conditions. The high dissociative PTSD group had significantly reduced VPP amplitude to happy faces in the conscious condition. Conclusion: High dissociative PTSD patients displayed increased early (preconscious) cortical responses to emotional stimuli, and specific reductions to happy facial expressions in later (conscious), face-specific components, compared to low dissociative PTSD patients. Dissociation in PTSD may act to increase initial pre-attentive processing of affective stimuli, and specifically reduce cortical reactivity to happy faces when consciously processing these stimuli.
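
    As a generic illustration of how ERP component amplitudes such as the N120 or VPP reported above are usually quantified, the sketch below averages baseline-corrected EEG within a time window and over a set of channels. It is not the study's analysis pipeline; the array layout and the example window are assumptions:

```python
import numpy as np

def mean_amplitude(epochs, times, window, channels):
    """Mean ERP amplitude within a time window, averaged over trials and channels.

    epochs   : array (n_trials, n_channels, n_samples), baseline-corrected EEG
    times    : array (n_samples,) of sample times in seconds
    window   : (start, end) in seconds, e.g. (0.10, 0.14) for an N120-like window
    channels : indices of the channels of interest
    """
    mask = (times >= window[0]) & (times <= window[1])
    return float(epochs[:, channels][:, :, mask].mean())
```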

  18. Emotion recognition in girls with conduct problems.

    Science.gov (United States)

    Schwenck, Christina; Gensthaler, Angelika; Romanos, Marcel; Freitag, Christine M; Schneider, Wolfgang; Taurines, Regina

    2014-01-01

    A deficit in emotion recognition has been suggested to underlie conduct problems. Although several studies have been conducted on this topic so far, most concentrated on male participants. The aim of the current study was to compare recognition of morphed emotional faces in girls with conduct problems (CP) with elevated or low callous-unemotional (CU+ vs. CU-) traits and a matched healthy developing control group (CG). Sixteen girls with CP-CU+, 16 girls with CP-CU- and 32 controls (mean age: 13.23 years, SD=2.33 years) were included. Video clips with morphed faces were presented in two runs to assess emotion recognition. Multivariate analysis of variance with the factors group and run was performed. Girls with CP-CU- needed more time than the CG to encode sad, fearful, and happy faces and they correctly identified sadness less often. Girls with CP-CU+ outperformed the other groups in the identification of fear. Learning effects throughout runs were the same for all groups except that girls with CP-CU- correctly identified fear less often in the second run compared to the first run. Results need to be replicated with comparable tasks, which might result in subgroup-specific therapeutic recommendations.

  19. Visual perception during mirror-gazing at one's own face in patients with depression.

    Science.gov (United States)

    Caputo, Giovanni B; Bortolomasi, Marco; Ferrucci, Roberta; Giacopuzzi, Mario; Priori, Alberto; Zago, Stefano

    2014-01-01

    In normal observers, gazing at one's own face in the mirror for a few minutes, at a low illumination level, produces the apparition of strange faces. Observers see distortions of their own faces, but they often see hallucinations like monsters, archetypical faces, faces of relatives and deceased persons, and animals. In this research, patients with depression were compared to healthy controls with respect to strange-face apparitions. The experiment was a 7-minute mirror-gazing test (MGT) under low illumination. When the MGT ended, the experimenter assessed patients and controls with a specifically designed questionnaire and interviewed them, asking them to describe the strange-face apparitions. Apparitions of strange faces in the mirror were greatly reduced in depressed patients compared to healthy controls. Relative to healthy controls, depressed patients showed shorter apparition durations, a smaller number of strange faces, lower self-evaluation ratings of apparition strength, and lower self-evaluation ratings of provoked emotion. These decreases in depression may be produced by deficits of facial expression and facial recognition of emotions, which are involved in the relationship between the patient (or the patient's ego) and his face image (or the patient's bodily self) that is reflected in the mirror.

  20. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Directory of Open Access Journals (Sweden)

    Teresa A Victor

    Full Text Available Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. Unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  1. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Science.gov (United States)

    Victor, Teresa A; Furey, Maura L; Fromm, Stephen J; Bellgowan, Patrick S F; Öhman, Arne; Drevets, Wayne C

    2012-01-01

    Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. Unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  2. Neutral face classification using personalized appearance models for fast and robust emotion detection.

    Science.gov (United States)

    Chiranjeevi, Pojala; Gopalakrishnan, Viswanath; Moogi, Pratibha

    2015-09-01

    Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, and so on, in the limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in typical applications such as video chat or photo album/web browsing. Detecting the neutral state at an early stage, and thereby bypassing those frames in emotion classification, saves computational power. In this paper, we propose a lightweight neutral-versus-emotion classification engine, which acts as a pre-processor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at key emotion (KE) points using a statistical texture model, constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on a statistical texture model. Robustness to dynamic shift of KE points is achieved by evaluating the similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
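
    The pre-processing idea described above (a per-user neutral appearance model at key emotion points, with frames that stay close to that model skipped before full emotion classification) can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation; the patch representation and the threshold are assumptions:

```python
import numpy as np

def build_neutral_model(neutral_patches):
    """Per-user neutral appearance model: mean and spread of patch intensities
    at each key emotion (KE) point, estimated from reference neutral frames.

    neutral_patches: array (n_frames, n_points, patch_h, patch_w)
    """
    flat = neutral_patches.reshape(neutral_patches.shape[0], neutral_patches.shape[1], -1)
    return flat.mean(axis=0), flat.std(axis=0) + 1e-6

def is_neutral(frame_patches, model, z_threshold=2.5):
    """Flag a frame as neutral when patches around every KE point deviate
    little from the user's neutral template; such frames can bypass the
    (more expensive) emotion classifier."""
    mean, std = model
    flat = frame_patches.reshape(frame_patches.shape[0], -1)
    deviation = np.abs(flat - mean) / std   # normalized per-pixel deviation from neutral
    return bool((deviation.mean(axis=1) < z_threshold).all())
```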

  3. Negative attention bias and processing deficits during the cognitive reappraisal of unpleasant emotions in HIV+ women.

    Science.gov (United States)

    McIntosh, Roger C; Tartar, Jaime L; Widmayer, Susan; Rosselli, Monica

    2015-01-01

    Deficits in emotional processing may be attributed to HIV disease or comorbid psychiatric disorders. Electrocortical markers of emotional attention, i.e., amplitude of the P2 and late positive potential (LPP), were compared between 26 HIV+ women and 25 healthy controls during an emotional regulation paradigm. HIV+ women showed early attention bias to negative stimuli indexed by greater P2 amplitude. In contrast, compared with the passive viewing of unpleasant images, HIV+ women demonstrated attenuation of the early and late LPP during positive reappraisal. This interaction remained significant after adjusting for individual differences in apathy, anxiety, and depression. Post hoc analyses implicated time since HIV diagnosis with LPP attenuation during positive reappraisal. Advancing HIV disease may disrupt neural generators associated with the cognitive reappraisal of emotions independent of psychiatric function.

  4. Dopamine D1 receptors are responsible for stress-induced emotional memory deficit in mice.

    Science.gov (United States)

    Wang, Yongfu; Wu, Jing; Zhu, Bi; Li, Chaocui; Cai, Jing-Xia

    2012-03-01

    It is established that stress impairs spatial learning and memory via the hypothalamus-pituitary-adrenal axis response. Dopamine D1 receptors have also been shown to be responsible for a stress-induced deficit of working memory. However, whether stress affects subsequent emotional learning and memory has not yet been elucidated. Here, we employed the well-established one-trial step-through task to study the effect of an acute psychological stress (induced by tail hanging for 5, 10, or 20 min) on emotional learning and memory, as well as the possible mechanisms. We demonstrated that tail hanging induced an obvious stress response. Either an acute tail-hanging stress or a single dose of an intraperitoneally injected dopamine D1 receptor antagonist (SCH23390) significantly decreased the step-through latency in the one-trial step-through task. However, SCH23390 prevented the acute tail-hanging stress-induced decrease in step-through latency. In addition, the effects of tail-hanging stress and/or SCH23390 on the changes in step-through latency were not mediated by non-memory factors such as nociceptive perception and motor function. Our data indicate that hyperactivation of dopamine D1 receptors mediated the stress-induced deficit of emotional learning and memory. This study may have clinical significance given that psychological stress is considered to play a role in susceptibility to some mental diseases such as depression and post-traumatic stress disorder.

  5. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits.

    Science.gov (United States)

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving body patch-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  6. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    Directory of Open Access Journals (Sweden)

    Rossana eActis-Grosso

    2015-10-01

    Full Text Available We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or conveyed by moving bodies (patch-light displays, PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, and (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  7. Semantic analysis on faces using deep neural networks

    Directory of Open Access Journals (Sweden)

    Nicolás Federico Pellejero

    2018-03-01

    Full Text Available In this paper we address the problem of automatic emotion recognition and classification from video. Current methods achieve excellent results on lab-made datasets with posed facial expressions. In contrast, there remains substantial room for improvement on `in the wild' datasets, where lighting, face angle to the camera, and similar factors vary. In these conditions, working with a small dataset can be very harmful, and sufficiently large datasets of adequately labeled faces are not currently available for the task. We use Generative Adversarial Networks to train models in a semi-supervised fashion, generating realistic face images in the process and allowing a large pool of unlabeled face images to be exploited.
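
    As background on how GANs are typically used for semi-supervised classification of this kind (a discriminator with K emotion classes plus one "fake" class, so that unlabeled and generated faces also provide a training signal, as in Salimans et al., 2016), a rough PyTorch sketch follows. It is not the paper's architecture; the layer sizes, label set, and loss arrangement are assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_EMOTIONS = 7  # assumed label set, e.g. six basic emotions plus neutral

class Discriminator(nn.Module):
    """Discriminator/classifier with NUM_EMOTIONS real classes + 1 'fake' class."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, NUM_EMOTIONS + 1)  # last logit = 'fake'

    def forward(self, x):
        return self.head(self.features(x))

def real_vs_fake_loss(logits, is_real):
    """-log P(any real class) for real images, -log P(fake) for generated ones."""
    log_p = F.log_softmax(logits, dim=1)
    log_p_fake = log_p[:, NUM_EMOTIONS]
    log_p_real = torch.logsumexp(log_p[:, :NUM_EMOTIONS], dim=1)
    return -(log_p_real.mean() if is_real else log_p_fake.mean())

def discriminator_loss(d, labeled_x, labels, unlabeled_x, fake_x):
    """Supervised loss on the small labeled set plus unsupervised real/fake
    losses on unlabeled real faces and generator-produced faces."""
    return (F.cross_entropy(d(labeled_x), labels)
            + real_vs_fake_loss(d(unlabeled_x), is_real=True)
            + real_vs_fake_loss(d(fake_x), is_real=False))
```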

  8. Facial emotion recognition deficits following moderate-severe Traumatic Brain Injury (TBI): re-examining the valence effect and the role of emotion intensity.

    Science.gov (United States)

    Rosenberg, Hannah; McDonald, Skye; Dethier, Marie; Kessels, Roy P C; Westbrook, R Frederick

    2014-11-01

    Many individuals who sustain moderate-severe traumatic brain injuries (TBI) are poor at recognizing emotional expressions, with a greater impairment in recognizing negative (e.g., fear, disgust, sadness, and anger) than positive emotions (e.g., happiness and surprise). It has been questioned whether this "valence effect" might be an artifact of the wide use of static facial emotion stimuli (usually full-blown expressions) which differ in difficulty rather than a real consequence of brain impairment. This study aimed to investigate the valence effect in TBI, while examining emotion recognition across different intensities (low, medium, and high). Twenty-seven individuals with TBI and 28 matched control participants were tested on the Emotion Recognition Task (ERT). The TBI group was more impaired in overall emotion recognition, and less accurate recognizing negative emotions. However, examining the performance across the different intensities indicated that this difference was driven by some emotions (e.g., happiness) being much easier to recognize than others (e.g., fear and surprise). Our findings indicate that individuals with TBI have an overall deficit in facial emotion recognition, and that both people with TBI and control participants found some emotions more difficult than others. These results suggest that conventional measures of facial affect recognition that do not examine variance in the difficulty of emotions may produce erroneous conclusions about differential impairment. They also cast doubt on the notion that dissociable neural pathways underlie the recognition of positive and negative emotions, which are differentially affected by TBI and potentially other neurological or psychiatric disorders.

  9. Attentional Bias towards Emotional Scenes in Boys with Attention Deficit Hyperactivity Disorder.

    Science.gov (United States)

    Pishyareh, Ebrahim; Tehrani-Doost, Mehdi; Mahmoodi-Gharaie, Javad; Khorrami, Anahita; Joudi, Mitra; Ahmadi, Mehrnoosh

    2012-01-01

    Children with attention-deficit/hyperactivity disorder (ADHD) react explosively and inappropriately to emotional stimuli. It could be hypothesized that these children have some impairment in attending to emotional cues. Based on this hypothesis, we conducted this study to evaluate the visual orientation of children with ADHD towards paired emotional scenes. Thirty boys between the ages of 6 and 11 years diagnosed with ADHD were compared with 30 age-matched normal boys. All participants were presented with paired emotional and neutral scenes in four categories: pleasant-neutral, pleasant-unpleasant, unpleasant-neutral, and neutral-neutral. Meanwhile, their visual orientations towards these pictures were evaluated using an eye-tracking system. The number and duration of first fixations and the duration of first gaze were compared between the two groups using MANOVA. The performance of each group across categories was also analyzed using the Friedman test. With regard to duration of first gaze, which is the time spent fixating on a picture before moving to another picture, children with ADHD spent less time on pleasant pictures than the control group while viewing pleasant-neutral and unpleasant-pleasant pairs. The duration of first gaze on unpleasant pictures was longer when children with ADHD viewed unpleasant-neutral pairs (P<0.01). Based on these findings, it can be concluded that children with ADHD attend to unpleasant content more than normal children, which may contribute to their emotional reactivity.
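
    For clarity about the "duration of first gaze" measure defined above (the time spent fixating a picture before the gaze moves to another picture), a small illustrative helper is sketched below; the fixation data format is an assumption:

```python
def first_gaze_duration(fixations, region):
    """Total duration of the first uninterrupted run of fixations on `region`.

    fixations: list of (region_label, duration_ms) tuples in temporal order.
    """
    total, entered = 0, False
    for label, duration in fixations:
        if label == region:
            total += duration
            entered = True
        elif entered:   # gaze has moved to another picture: first gaze is over
            break
    return total

# Example: two fixations on the pleasant picture, then a shift away -> 400 ms
print(first_gaze_duration([("pleasant", 180), ("pleasant", 220),
                           ("unpleasant", 300)], "pleasant"))
```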

  10. A Deficit in Face-Voice Integration in Developing Vervet Monkeys Exposed to Ethanol during Gestation

    DEFF Research Database (Denmark)

    Zangenehpour, Shahin; Javadi, Pasha; Ervin, Frank R

    2014-01-01

    . However, a group of normally developing monkeys exhibited a significant preference for the non-matching video. This inability to integrate and thereby discriminate audiovisual stimuli was confined to the integration of faces and voices as revealed by the monkeys' ability to match a dynamic face...... to a complex tone or a black-and-white checkerboard to a pure tone, presumably based on duration and/or onset-offset synchrony. Together, these results suggest that prenatal ethanol exposure negatively affects a specific domain of audiovisual integration. This deficit is confined to the integration...... of information that is presented by the face and the voice and does not affect more elementary aspects of sensory integration....

  11. Reading faces and Facing words

    DEFF Research Database (Denmark)

    Robotham, Julia Emma; Lindegaard, Martin Weis; Delfi, Tzvetelina Shentova

    unilateral lesions, we found no patient with a selective deficit in either reading or face processing. Rather, the patients showing a deficit in processing either words or faces were also impaired with the other category. One patient performed within the normal range on all tasks. In addition, all patients......It has long been argued that perceptual processing of faces and words is largely independent, highly specialised and strongly lateralised. Studies of patients with either pure alexia or prosopagnosia have strongly contributed to this view. The aim of our study was to investigate how visual...... perception of faces and words is affected by unilateral posterior stroke. Two patients with lesions in their dominant hemisphere and two with lesions in their non-dominant hemisphere were tested on sensitive tests of face and word perception during the stable phase of recovery. Despite all patients having...

  12. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression.

    Science.gov (United States)

    Miskowiak, K W; Glerup, L; Vestbo, C; Harmer, C J; Reinecke, A; Macoveanu, J; Siebner, H R; Kessing, L V; Vinberg, M

    2015-05-01

    Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression. Thirty healthy, never-depressed monozygotic (MZ) twins with a co-twin history of depression (high risk group: n = 13) or without co-twin history of depression (low-risk group: n = 17) were enrolled in a functional magnetic resonance imaging (fMRI) study. During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping strategies. High-risk twins showed increased neural response to happy and fearful faces in dorsal anterior cingulate cortex (ACC), dorsomedial prefrontal cortex (dmPFC), pre-supplementary motor area and occipito-parietal regions compared to low-risk twins. They also displayed stronger negative coupling between amygdala and pregenual ACC, dmPFC and temporo-parietal regions during emotional face processing. These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attention vigilance for fearful faces and were slower at recognizing facial expressions relative to low-risk controls. These effects occurred in the absence of differences between groups in mood, subjective state or coping. Different neural response and functional connectivity within fronto-limbic and occipito-parietal regions during emotional face processing and enhanced fear vigilance may be key endophenotypes for depression.

  13. Deficits in facial emotion recognition indicate behavioral changes and impaired self-awareness after moderate to severe traumatic brain injury.

    Science.gov (United States)

    Spikman, Jacoba M; Milders, Maarten V; Visser-Keizer, Annemarie C; Westerhof-Evers, Herma J; Herben-Dekker, Meike; van der Naalt, Joukje

    2013-01-01

    Traumatic brain injury (TBI) is a leading cause of disability, specifically among younger adults. Behavioral changes are common after moderate to severe TBI and have adverse consequences for social and vocational functioning. It is hypothesized that deficits in social cognition, including facial affect recognition, might underlie these behavioral changes. Measurement of behavioral deficits is complicated, because the rating scales used rely on subjective judgement, often lack specificity and many patients provide unrealistically positive reports of their functioning due to impaired self-awareness. Accordingly, it is important to find performance based tests that allow objective and early identification of these problems. In the present study 51 moderate to severe TBI patients in the sub-acute and chronic stage were assessed with a test for emotion recognition (FEEST) and a questionnaire for behavioral problems (DEX) with a self and proxy rated version. Patients performed worse on the total score and on the negative emotion subscores of the FEEST than a matched group of 31 healthy controls. Patients also exhibited significantly more behavioral problems on both the DEX self and proxy rated version, but proxy ratings revealed more severe problems. No significant correlation was found between FEEST scores and DEX self ratings. However, impaired emotion recognition in the patients, and in particular of Sadness and Anger, was significantly correlated with behavioral problems as rated by proxies and with impaired self-awareness. This is the first study to find these associations, strengthening the proposed recognition of social signals as a condition for adequate social functioning. Hence, deficits in emotion recognition can be conceived as markers for behavioral problems and lack of insight in TBI patients. This finding is also of clinical importance since, unlike behavioral problems, emotion recognition can be objectively measured early after injury, allowing for early

  14. Deficits in facial emotion recognition indicate behavioral changes and impaired self-awareness after moderate to severe traumatic brain injury.

    Directory of Open Access Journals (Sweden)

    Jacoba M Spikman

    Full Text Available Traumatic brain injury (TBI) is a leading cause of disability, specifically among younger adults. Behavioral changes are common after moderate to severe TBI and have adverse consequences for social and vocational functioning. It is hypothesized that deficits in social cognition, including facial affect recognition, might underlie these behavioral changes. Measurement of behavioral deficits is complicated, because the rating scales used rely on subjective judgement, often lack specificity and many patients provide unrealistically positive reports of their functioning due to impaired self-awareness. Accordingly, it is important to find performance based tests that allow objective and early identification of these problems. In the present study 51 moderate to severe TBI patients in the sub-acute and chronic stage were assessed with a test for emotion recognition (FEEST) and a questionnaire for behavioral problems (DEX) with a self and proxy rated version. Patients performed worse on the total score and on the negative emotion subscores of the FEEST than a matched group of 31 healthy controls. Patients also exhibited significantly more behavioral problems on both the DEX self and proxy rated version, but proxy ratings revealed more severe problems. No significant correlation was found between FEEST scores and DEX self ratings. However, impaired emotion recognition in the patients, and in particular of Sadness and Anger, was significantly correlated with behavioral problems as rated by proxies and with impaired self-awareness. This is the first study to find these associations, strengthening the proposed recognition of social signals as a condition for adequate social functioning. Hence, deficits in emotion recognition can be conceived as markers for behavioral problems and lack of insight in TBI patients. This finding is also of clinical importance since, unlike behavioral problems, emotion recognition can be objectively measured early after injury

  15. Non-verbal emotion communication training induces specific changes in brain function and structure.

    Science.gov (United States)

    Kreifelts, Benjamin; Jacob, Heike; Brück, Carolin; Erb, Michael; Ethofer, Thomas; Wildgruber, Dirk

    2013-01-01

    The perception of emotional cues from voice and face is essential for social interaction. However, this process is altered in various psychiatric conditions along with impaired social functioning. Emotion communication trainings have been demonstrated to improve social interaction in healthy individuals and to reduce emotional communication deficits in psychiatric patients. Here, we investigated the impact of a non-verbal emotion communication training (NECT) on cerebral activation and brain structure in a controlled and combined functional magnetic resonance imaging (fMRI) and voxel-based morphometry study. NECT-specific reductions in brain activity occurred in a distributed set of brain regions including face and voice processing regions as well as emotion processing- and motor-related regions presumably reflecting training-induced familiarization with the evaluation of face/voice stimuli. Training-induced changes in non-verbal emotion sensitivity at the behavioral level and the respective cerebral activation patterns were correlated in the face-selective cortical areas in the posterior superior temporal sulcus and fusiform gyrus for valence ratings and in the temporal pole, lateral prefrontal cortex and midbrain/thalamus for the response times. A NECT-induced increase in gray matter (GM) volume was observed in the fusiform face area. Thus, NECT induces both functional and structural plasticity in the face processing system as well as functional plasticity in the emotion perception and evaluation system. We propose that functional alterations are presumably related to changes in sensory tuning in the decoding of emotional expressions. Taken together, these findings highlight that the present experimental design may serve as a valuable tool to investigate the altered behavioral and neuronal processing of emotional cues in psychiatric disorders as well as the impact of therapeutic interventions on brain function and structure.

  16. Guanfacine modulates the influence of emotional cues on prefrontal cortex activation for cognitive control.

    Science.gov (United States)

    Schulz, Kurt P; Clerkin, Suzanne M; Fan, Jin; Halperin, Jeffrey M; Newcorn, Jeffrey H

    2013-03-01

    Functional interactions between limbic regions that process emotions and frontal networks that guide response functions provide a substrate for emotional cues to influence behavior. Stimulation of postsynaptic α₂ adrenoceptors enhances the function of prefrontal regions in these networks. However, the impact of this stimulation on the emotional biasing of behavior has not been established. This study tested the effect of the postsynaptic α₂ adrenoceptor agonist guanfacine on the emotional biasing of response execution and inhibition in prefrontal cortex. Fifteen healthy young adults were scanned twice with functional magnetic resonance imaging while performing a face emotion go/no-go task following counterbalanced administration of single doses of oral guanfacine (1 mg) and placebo in a double-blind, cross-over design. Lower perceptual sensitivity and less response bias for sad faces resulted in fewer correct responses compared to happy and neutral faces but had no effect on correct inhibitions. Guanfacine increased the sensitivity and bias selectively for sad faces, resulting in response accuracy comparable to happy and neutral faces, and reversed the valence-dependent variation in response-related activation in left dorsolateral prefrontal cortex (DLPFC), resulting in enhanced activation for response execution cued by sad faces relative to happy and neutral faces, in line with other frontoparietal regions. These results provide evidence that guanfacine stimulation of postsynaptic α₂ adrenoceptors moderates DLPFC activation associated with the emotional biasing of response execution processes. The findings have implications for the α₂ adrenoceptor agonist treatment of attention-deficit hyperactivity disorder.

  17. Influence of spatial frequency and emotion expression on face processing in patients with panic disorder.

    Science.gov (United States)

    Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan

    2016-06-01

    Deficits in facial emotion processing are a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad (BSF), high (HSF), and low (LSF) spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at the three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to low spatial frequency compared to high spatial frequency, whereas healthy controls demonstrated significant broad-spatial-frequency-dependent processing in P100 amplitude. VPP amplitude was significantly increased for HSF and BSF compared to LSF stimuli in panic disorder. EPN amplitude differed significantly between HSF and BSF and between LSF and BSF processing in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information. However, in later processing, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that unique spatial frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder. Copyright © 2016 Elsevier B.V. All rights reserved.
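
    As an illustration of how low- and high-spatial-frequency versions of face stimuli of the kind used above are often constructed (Gaussian low-pass filtering and a high-pass residual), a brief sketch follows; the filter widths are placeholders and not the cutoffs used in the study:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def spatial_frequency_versions(face, sigma_low=6.0, sigma_high=2.0):
    """Create low- (LSF) and high-spatial-frequency (HSF) versions of a
    grayscale face image.

    LSF: Gaussian low-pass (blur). HSF: original minus a gentle low-pass,
    i.e. the high-frequency residual. Published studies usually specify
    cutoffs in cycles per face or per image rather than in sigma units.
    """
    face = face.astype(float)
    lsf = gaussian_filter(face, sigma=sigma_low)
    hsf = face - gaussian_filter(face, sigma=sigma_high)
    return lsf, hsf
```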

  18. Methylphenidate and emotional-motivational processing in attention-deficit/hyperactivity disorder.

    Science.gov (United States)

    Conzelmann, Annette; Woidich, Eva; Mucha, Ronald F; Weyers, Peter; Müller, Mathias; Lesch, Klaus-Peter; Jacob, Christian P; Pauli, Paul

    2016-08-01

    In line with the assumption that emotional-motivational deficits are a core dysfunction in ADHD, in a previous study we observed reduced reactivity towards pleasant pictures in adult ADHD patients compared to controls, indicated by a lack of startle reflex attenuation specifically during pleasant pictures. Methylphenidate (MPH), the first-choice pharmacological agent for ADHD, has been proposed to normalize these dysfunctions. However, experimental evidence from double-blind placebo-controlled study designs is lacking. We therefore investigated 61 adult ADHD patients twice, once with placebo and once with MPH, using the same experimental design as in our previous study, and assessed emotion processing during the presentation of pleasant, neutral and unpleasant pictures. We obtained startle reflex data as well as valence and arousal ratings for the pictures. As previously shown, ADHD patients showed diminished startle attenuation during pleasant pictures, while startle potentiation during unpleasant pictures was normal. Valence and arousal ratings increased, as expected, with increasing pleasantness and arousal of the pictures, respectively. There were no significant effects of MPH. The study replicates the finding that ADHD patients show reduced reactivity towards pleasant stimuli. MPH did not normalize this dysfunction. Possibly, MPH only influences emotions during more complex behavioural tasks that involve executive functions in adults with ADHD. Our results emphasize the importance of double-blind placebo-controlled designs in psychopharmacological research.

  19. A deficit in face-voice integration in developing vervet monkeys exposed to ethanol during gestation.

    Directory of Open Access Journals (Sweden)

    Shahin Zangenehpour

    Full Text Available Children with fetal alcohol spectrum disorders display behavioural and intellectual impairments that strongly implicate dysfunction within the frontal cortex. Deficits in social behaviour and cognition are amongst the most pervasive outcomes of prenatal ethanol exposure. Our naturalistic vervet monkey model of fetal alcohol exposure (FAE) provides an unparalleled opportunity to study the neurobehavioral outcomes of prenatal ethanol exposure in a controlled experimental setting. Recent work has revealed a significant reduction of the neuronal population in the frontal lobes of these monkeys. We used an intersensory matching procedure to investigate audiovisual perception of socially relevant stimuli in young FAE vervet monkeys. Here we show a domain-specific deficit in audiovisual integration of socially relevant stimuli. When FAE monkeys were shown a pair of side-by-side videos of a monkey concurrently presenting two different calls along with a single audio track matching the content of one of the calls, they were not able to match the correct video to the single audio track. This was manifest by their average looking time being equally spent towards both the matching and non-matching videos. However, a group of normally developing monkeys exhibited a significant preference for the non-matching video. This inability to integrate and thereby discriminate audiovisual stimuli was confined to the integration of faces and voices as revealed by the monkeys' ability to match a dynamic face to a complex tone or a black-and-white checkerboard to a pure tone, presumably based on duration and/or onset-offset synchrony. Together, these results suggest that prenatal ethanol exposure negatively affects a specific domain of audiovisual integration. This deficit is confined to the integration of information that is presented by the face and the voice and does not affect more elementary aspects of sensory integration.

  20. A deficit in face-voice integration in developing vervet monkeys exposed to ethanol during gestation.

    Science.gov (United States)

    Zangenehpour, Shahin; Javadi, Pasha; Ervin, Frank R; Palmour, Roberta M; Ptito, Maurice

    2014-01-01

    Children with fetal alcohol spectrum disorders display behavioural and intellectual impairments that strongly implicate dysfunction within the frontal cortex. Deficits in social behaviour and cognition are amongst the most pervasive outcomes of prenatal ethanol exposure. Our naturalistic vervet monkey model of fetal alcohol exposure (FAE) provides an unparalleled opportunity to study the neurobehavioral outcomes of prenatal ethanol exposure in a controlled experimental setting. Recent work has revealed a significant reduction of the neuronal population in the frontal lobes of these monkeys. We used an intersensory matching procedure to investigate audiovisual perception of socially relevant stimuli in young FAE vervet monkeys. Here we show a domain-specific deficit in audiovisual integration of socially relevant stimuli. When FAE monkeys were shown a pair of side-by-side videos of a monkey concurrently presenting two different calls along with a single audio track matching the content of one of the calls, they were not able to match the correct video to the single audio track. This was manifest by their average looking time being equally spent towards both the matching and non-matching videos. However, a group of normally developing monkeys exhibited a significant preference for the non-matching video. This inability to integrate and thereby discriminate audiovisual stimuli was confined to the integration of faces and voices as revealed by the monkeys' ability to match a dynamic face to a complex tone or a black-and-white checkerboard to a pure tone, presumably based on duration and/or onset-offset synchrony. Together, these results suggest that prenatal ethanol exposure negatively affects a specific domain of audiovisual integration. This deficit is confined to the integration of information that is presented by the face and the voice and does not affect more elementary aspects of sensory integration.

  1. Cross-modal perception (face and voice) in emotions. ERPs and behavioural measures

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2007-04-01

    Full Text Available Emotion decoding constitutes a case of multimodal processing of cues from multiple channels. Previous behavioural and neuropsychological studies indicated that, when we have to decode emotions on the basis of multiple perceptual cues, cross-modal integration takes place. The present study investigates the simultaneous processing of emotional tone of voice and emotional facial expression by event-related potentials (ERPs), across a broad range of emotions (happiness, sadness, fear, anger, surprise, and disgust). Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual patterns (emotional facial expressions) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Subjects (N=30) were required to process the stimuli and to indicate their comprehension (by stimpad). ERP variations and behavioural data (response times, RTs) were submitted to repeated-measures analysis of variance (ANOVA). We considered two time intervals (150-250 and 250-350 ms post-stimulus) in order to explore the ERP variations. The ANOVA showed two different ERP effects with different cognitive functions: a negative deflection (N2), more anteriorly distributed (Fz), and a positive deflection (P2), more posteriorly distributed. The N2 may be considered a marker of emotional content (sensitive to the type of emotion), whereas the P2 may represent a cross-modal integration marker, as it varied as a function of the congruous/incongruous condition, showing a higher peak for congruous than for incongruous stimuli. Finally, an RT reduction was found for some emotion types in the congruous condition (i.e., sadness) and an inverted effect for other emotions (i.e., fear, anger, and surprise).

  2. Computerized measurement of facial expression of emotions in schizophrenia.

    Science.gov (United States)

    Alvino, Christopher; Kohler, Christian; Barrett, Frederick; Gur, Raquel E; Gur, Ruben C; Verma, Ragini

    2007-07-30

    Deficits in the ability to express emotions characterize several neuropsychiatric disorders and are a hallmark of schizophrenia, and there is a need for a method of quantifying expression, which is currently done by clinical ratings. This paper presents the development and validation of a computational framework for quantifying differences in emotional expression between patients with schizophrenia and healthy controls. Each face is modeled as a combination of elastic regions, and expression changes are modeled as a deformation between a neutral face and an expressive face. Functions of these deformations, known as regional volumetric difference (RVD) functions, form distinctive quantitative profiles of expressions. Employing pattern classification techniques, we designed expression classifiers for the four universal emotions of happiness, sadness, anger and fear by training on RVD functions of expression changes. The classifiers were cross-validated and then applied to facial expression images of patients with schizophrenia and healthy controls. The classification score for each image reflects the extent to which the expressed emotion matches the intended emotion. Group-wise statistical analysis revealed this score to be significantly different between healthy controls and patients, especially in the case of anger. This score correlated with the clinical severity of flat affect. These results encourage the use of such deformation-based expression quantification measures for research in clinical applications that require automated measurement of facial affect.
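
    The pattern-classification step described above (classifiers trained on regional deformation features, with the classification score indexing how well the expressed emotion matches the intended one) can be illustrated roughly as follows. The paper does not name a specific classifier; the linear SVM, feature dimensions, and placeholder data below are assumptions:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: one row per expressive image; columns are regional deformation features
# (e.g. per-region volumetric differences between neutral and expressive face).
# y: intended emotion labels (happiness, sadness, anger, fear).
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 20))        # placeholder feature matrix
y = rng.integers(0, 4, size=80)      # placeholder labels

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
print(cross_val_score(clf, X, y, cv=5).mean())   # cross-validated accuracy

# In application, decision scores on held-out images can serve as a graded
# measure of how strongly an expression matches the intended emotion.
```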

  3. Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study.

    Science.gov (United States)

    Xu, Qianru; Ruohonen, Elisa M; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia

    2018-01-01

    It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been studied. We used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck's Depression Inventory ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, response amplitudes being larger for sad faces than happy faces. Group differences were found for the M300, and they were indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than happy deviant faces, reflecting negative bias in deviance detection, which was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit in dysphoria.

  4. Emotional regulation and bodily sensation: interoceptive awareness is intact in borderline personality disorder.

    Science.gov (United States)

    Hart, Nova; McGowan, John; Minati, Ludovico; Critchley, Hugo D

    2013-08-01

    Emotional dysregulation is a core component of borderline personality disorder (BPD). Theoretical models suggest that deficits in labeling physiological sensations of emotion contribute to affective instability in BPD. Interoceptive awareness refers to the ability to perceive changes in internal bodily states, and is linked to the subjective experience and control of emotions. The authors tested whether differences in interoceptive awareness accounted for emotional instability in BPD. Patients diagnosed with BPD (n = 24) were compared to healthy controls (n = 30) on two established measures of interoceptive awareness, a heartbeat perception task and a heartbeat monitoring task. Contrary to their hypothesis, the authors observed no significant differences in objective measures of interoceptive awareness. Their findings provide strong evidence against the notion that difficulties in emotional regulation in BPD are connected to differences in interoceptive awareness.
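
    Heartbeat perception tasks of the kind used here are commonly scored by comparing silently counted heartbeats against the number actually recorded over several intervals. The abstract does not give the exact scoring, so the function below implements only the conventional counting-accuracy index, shown purely as an assumption.

```python
import numpy as np

def heartbeat_counting_accuracy(recorded, counted):
    """Conventional counting-accuracy index: 1 - |recorded - counted| / recorded,
    averaged over intervals; values near 1 indicate accurate interoception."""
    recorded = np.asarray(recorded, dtype=float)
    counted = np.asarray(counted, dtype=float)
    return float(np.mean(1.0 - np.abs(recorded - counted) / recorded))

# Hypothetical example: three counting intervals of different lengths
recorded_beats = [28, 39, 51]   # beats registered by ECG / pulse oximeter
counted_beats = [25, 36, 44]    # beats silently counted by the participant
print(round(heartbeat_counting_accuracy(recorded_beats, counted_beats), 3))
```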

  5. Emotion recognition and social skills in child and adolescent offspring of parents with schizophrenia.

    Science.gov (United States)

    Horton, Leslie E; Bridgwater, Miranda A; Haas, Gretchen L

    2017-05-01

    Emotion recognition, a social cognition domain, is impaired in people with schizophrenia and contributes to social dysfunction. Whether impaired emotion recognition emerges as a manifestation of illness or predates symptoms is unclear. Findings from studies of emotion recognition impairments in first-degree relatives of people with schizophrenia are mixed and, to our knowledge, no studies have investigated the link between emotion recognition and social functioning in that population. This study examined facial affect recognition and social skills in 16 offspring of parents with schizophrenia (familial high-risk/FHR) compared to 34 age- and sex-matched healthy controls (HC), ages 7-19. As hypothesised, FHR children exhibited impaired overall accuracy, accuracy in identifying fearful faces, and overall recognition speed relative to controls. Age-adjusted facial affect recognition accuracy scores predicted parent's overall rating of their child's social skills for both groups. This study supports the presence of facial affect recognition deficits in FHR children. Importantly, as the first known study to suggest the presence of these deficits in young, asymptomatic FHR children, it extends findings to a developmental stage predating symptoms. Further, findings point to a relationship between early emotion recognition and social skills. Improved characterisation of deficits in FHR children could inform early intervention.

  6. Attentional Bias towards Emotional Scenes in Boys with Attention Deficit Hyperactivity Disorder

    Directory of Open Access Journals (Sweden)

    Ebrahim Pishyareh

    2012-06-01

    Full Text Available Objective: Children with attention-deficit/hyperactivity disorder (ADHD) react explosively and inappropriately to emotional stimuli. It could be hypothesized that these children have some impairment in attending to emotional cues. Based on this hypothesis, we conducted this study to evaluate the visual orientation of children with ADHD towards paired emotional scenes. Method: Thirty boys between the ages of 6 and 11 years diagnosed with ADHD were compared with 30 age-matched normal boys. All participants were presented paired emotional and neutral scenes in the following four categories: pleasant-neutral, pleasant-unpleasant, unpleasant-neutral, and neutral-neutral. Meanwhile, their visual orientations towards these pictures were evaluated using an eye-tracking system. The number and duration of first fixations and the duration of first gaze were compared between the two groups using MANOVA. The performance of each group across the categories was also analyzed using the Friedman test. Results: With regard to duration of first gaze, which is the time taken to fixate on a picture before moving to another picture, children with ADHD spent less time on pleasant pictures than the normal group while looking at pleasant-neutral and unpleasant-pleasant pairs. The duration of first gaze on unpleasant pictures was higher when children with ADHD were looking at unpleasant-neutral pairs (P<0.01). Conclusion: Based on the findings of this study, it could be concluded that children with ADHD attend to unpleasant conditions more than normal children do, which leads to their emotional reactivity.

  7. Masculinities and Emotional Deficit: Linkages between Masculine Gender Pattern and Lack of Emotional Skills in Men who Mistreat Women in Intimacy

    OpenAIRE

    Verdú Delgado, Ana Dolores; Mañas-Viejo, Carmen

    2017-01-01

    This paper explores violence against women in the context of partner relationships, through testimonies of professionals from Social Services in five towns in the province of Alicante (Spain), and also of the psychologists who participate in the coordination and implementation of two intervention programs for inmate aggressors in Valencia and Alicante (Spain). Our analysis focuses on the linkages between gender and certain emotional deficits in men who mistreat women in intimacy. Among these ...

  8. Face Emotion Processing in Depressed Children and Adolescents with and without Comorbid Conduct Disorder

    Science.gov (United States)

    Schepman, Karen; Taylor, Eric; Collishaw, Stephan; Fombonne, Eric

    2012-01-01

    Studies of adults with depression point to characteristic neurocognitive deficits, including differences in processing facial expressions. Few studies have examined face processing in juvenile depression, or taken account of other comorbid disorders. Three groups were compared: depressed children and adolescents with conduct disorder (n = 23),…

  9. Psychopathic traits are associated with reduced attention to the eyes of emotional faces among adult male non-offenders

    Directory of Open Access Journals (Sweden)

    Steven Mark Gillespie

    2015-10-01

    Full Text Available Psychopathic traits are linked with impairments in emotional facial expression recognition. These impairments may, in part, reflect reduced attention to the eyes of emotional faces. Although reduced attention to the eyes has been noted among children with conduct problems and callous-unemotional traits, similar findings are yet to be found in relation to psychopathic traits among adult male participants. Here we investigated the relationship of primary (selfish, uncaring) and secondary (impulsive, antisocial) psychopathic traits with attention to the eyes among adult male non-offenders during an emotion recognition task. We measured the number of fixations, and overall dwell time, on the eyes and the mouth of male and female faces showing the six basic emotions at varying levels of intensity. We found no relationship of primary or secondary psychopathic traits with recognition accuracy. However, primary psychopathic traits were associated with a reduced number of fixations, and lower overall dwell time, on the eyes relative to the mouth across expressions, intensity, and sex. Furthermore, the relationship of primary psychopathic traits with attention to the eyes of angry and fearful faces was influenced by the sex and intensity of the expression. We also showed that a greater number of fixations on the eyes, relative to the mouth, was associated with increased accuracy for angry and fearful expression recognition. These results are the first to show effects of psychopathic traits on attention to the eyes of emotional faces in an adult male sample, and may support amygdala-based accounts of psychopathy. These findings may also have methodological implications for clinical studies of emotion recognition.
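
    The gaze measures analysed above (fixation counts and dwell time on the eye versus mouth regions, related to trait scores) reduce to a simple aggregation over area-of-interest-coded fixations. The sketch below assumes a hypothetical fixation table and a Pearson correlation purely for illustration; the AOI definitions and statistical model are not taken from the study.

```python
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical fixation table: one row per fixation, with participant id, the
# area of interest it landed in ('eyes', 'mouth', or 'other'), and its duration.
rng = np.random.default_rng(2)
fix = pd.DataFrame({
    "participant": rng.integers(0, 40, size=2000),
    "aoi": rng.choice(["eyes", "mouth", "other"], size=2000),
    "duration_ms": rng.gamma(shape=2.0, scale=120.0, size=2000),
})

# Per-participant dwell time on the eyes relative to the mouth
dwell = (fix.groupby(["participant", "aoi"])["duration_ms"].sum()
            .unstack(fill_value=0.0))
eyes_minus_mouth = dwell["eyes"] - dwell["mouth"]

# Hypothetical primary psychopathy scores, one per participant
primary_traits = pd.Series(rng.normal(size=len(dwell)), index=dwell.index)

r, p = pearsonr(primary_traits, eyes_minus_mouth)
print(f"trait-gaze correlation: r = {r:.2f}, p = {p:.3f}")
```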

  10. Do proposed facial expressions of contempt, shame, embarrassment, and compassion communicate the predicted emotion?

    Science.gov (United States)

    Widen, Sherri C; Christy, Anita M; Hewett, Kristen; Russell, James A

    2011-08-01

    Shame, embarrassment, compassion, and contempt have been considered candidates for the status of basic emotions on the grounds that each has a recognisable facial expression. In two studies (N=88, N=60) on recognition of these four facial expressions, observers showed moderate agreement on the predicted emotion when assessed with forced choice (58%; 42%), but low agreement when assessed with free labelling (18%; 16%). Thus, even though some observers endorsed the predicted emotion when it was presented in a list, over 80% spontaneously interpreted these faces in a way other than the predicted emotion.

  11. Social and Emotional Loneliness Among Divorced and Married Men and Women : Comparing the Deficit and Cognitive Perspectives

    NARCIS (Netherlands)

    Dykstra, Pearl A.; Fokkema, Tineke

    2007-01-01

    Data from the 1998 survey “Divorce in the Netherlands” (N = 2,223) are used to analyze differences in loneliness among divorced and married men and women. The results indicate that it makes sense to distinguish social from emotional loneliness. This is consistent with the deficit perspective, which

  12. Social and emotional loneliness among divorced and married men and women: comparing the deficit and cognitive perspectives

    NARCIS (Netherlands)

    Dykstra, P.A.; Fokkema, C.M.

    2007-01-01

    Data from the 1998 survey “Divorce in the Netherlands” (N = 2,223) are used to analyze differences in loneliness among divorced and married men and women. The results indicate that it makes sense to distinguish social from emotional loneliness. This is consistent with the deficit perspective, which

  13. Memory for faces with emotional expressions in Alzheimer's disease and healthy older participants: positivity effect is not only due to familiarity.

    Science.gov (United States)

    Sava, Alina-Alexandra; Krolak-Salmon, Pierre; Delphin-Combe, Floriane; Cloarec, Morgane; Chainay, Hanna

    2017-01-01

    Young individuals better memorize initially seen faces with emotional rather than neutral expressions. Healthy older participants and Alzheimer's disease (AD) patients show better memory for faces with positive expressions. The socioemotional selectivity theory postulates that this positivity effect in memory reflects a general age-related preference for positive stimuli, subserving emotion regulation. Another explanation might be that older participants use compensatory strategies, often considering happy faces as previously seen. The question about the existence of this effect in tasks not permitting such compensatory strategies is still open. Thus, we compared the performance of healthy participants and AD patients for positive, neutral, and negative faces in such tasks. Healthy older participants and AD patients showed a positivity effect in memory, but there was no difference between emotional and neutral faces in young participants. Our results suggest that the positivity effect in memory is not entirely due to the sense of familiarity for smiling faces.

  14. Impact of race and diagnostic label on older adults' emotions, illness beliefs, and willingness to help a family member with osteoarthritis.

    Science.gov (United States)

    Mingo, Chivon A; McIlvane, Jessica M; Haley, William E; Luong, My-Linh N

    2015-04-01

    To examine how race and the diagnostic label of Osteoarthritis (OA) affect older adults' emotions, illness beliefs, and willingness to help a family member. African American and White older adults were randomly assigned to read vignettes describing a sister suffering from chronic pain and disability, either with or without the OA label. Race × diagnostic label ANOVAs were conducted. Compared to Whites, African Americans were more optimistic that OA could improve with health care, and showed greater willingness to help their sister. The OA label had little impact on emotions, beliefs, or willingness to help. Without the OA label, African Americans rated the sister as having more control over her problem than Whites did, but providing the diagnosis eliminated this difference. The diagnostic label of OA had little effect on these older adults, but racial differences indicate that cultural values regarding family caregiving are important in arthritis care. © The Author(s) 2013.

  15. Analysis of Misunderstanding Caused by Different Interpretations of Speech Act Labels in Tintin and Asterix Comic Series

    Directory of Open Access Journals (Sweden)

    Farah Attamimi

    2011-01-01

    Full Text Available This paper presents an analysis of misunderstandings that occur in conversation when the speaker and the hearer interpret speech act labels differently. Misunderstandings in these comic series cause various emotional effects in the hearer involved in the conversation. The hearer might feel happy, impressed, embarrassed, or even proud of what the speaker conveys through his/her utterance, depending on the face wants used and intended between the participants in the conversation. According to Goffman in Brown and Levinson (1987), "face is something that is emotionally invested, and that can be lost, maintained, or enhanced, and must be constantly attended to in interaction" (p. 60). There are two kinds of face wants: the positive kind is called a face-saving act, while the negative one is called a face-threatening act. The data in this paper are taken from the Tintin and Asterix comic series. The theories used cover the area of pragmatics, especially the taxonomy of speech act theory (Yule, 1996; Mey, 2001; Leech, 1991) and the theory of the notion of face by Erving Goffman (as cited in Yule, 1996; Thomas, 1995). This paper therefore tries to show how the misinterpretation of speech act labels affects the participants in the conversation.

  16. Gender differences in human single neuron responses to male emotional faces.

    Science.gov (United States)

    Newhoff, Morgan; Treiman, David M; Smith, Kris A; Steinmetz, Peter N

    2015-01-01

    Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 (6 male) epileptic patients in four brain areas: amygdala (236 neurons), hippocampus (n = 270), anterior cingulate cortex (n = 256), and ventromedial prefrontal cortex (n = 174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n = 15/66) of neurons in men were significantly affected by facial emotion, vs. 8% (n = 6/76) of neurons in women. A Fisher's exact test comparing the two ratios found a highly significant difference between the two (p < 0.01). These results show specific differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.
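
    The group comparison reported above (15 of 66 emotion-responsive left-amygdala neurons in men versus 6 of 76 in women, compared with Fisher's exact test) can be reproduced in outline as follows; whether the original test was one- or two-sided is not stated in the abstract, so the default two-sided test is shown.

```python
from scipy.stats import fisher_exact

# 2x2 table: rows = gender, columns = (emotion-responsive, non-responsive) neurons
table = [[15, 66 - 15],   # men:   15 of 66 left-amygdala neurons responsive
         [6, 76 - 6]]     # women:  6 of 76 left-amygdala neurons responsive
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```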

  17. Improved emotional conflict control triggered by the processing priority of negative emotion.

    Science.gov (United States)

    Yang, Qian; Wang, Xiangpeng; Yin, Shouhang; Zhao, Xiaoyue; Tan, Jinfeng; Chen, Antao

    2016-04-18

    The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated the effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas related to emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas coupled negatively with the dorsolateral prefrontal cortex (DLPFC). In contrast, a significant emotional conflict was observed in the positive face context; this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN), and here the DLPFC coupled negatively mainly with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operates differently for negative and positive faces: it is implemented more efficiently in the negative face condition, whereas it is more devoted to inhibiting internal interference in the positive face condition. This study thus suggests a plausible mechanism of emotional conflict resolution in which the rapid pathway for negative emotion processing efficiently triggers control mechanisms to preventively resolve emotional conflict.

  18. Visual Perception during Mirror-Gazing at One’s Own Face in Patients with Depression

    Directory of Open Access Journals (Sweden)

    Giovanni B. Caputo

    2014-01-01

    Full Text Available In normal observers, gazing at one's own face in the mirror for a few minutes, at a low illumination level, produces the apparition of strange faces. Observers see distortions of their own faces, but they often see hallucinations such as monsters, archetypal faces, faces of relatives and of the deceased, and animals. In this research, patients with depression were compared to healthy controls with respect to strange-face apparitions. The experiment was a 7-minute mirror-gazing test (MGT) under low illumination. When the MGT ended, the experimenter assessed patients and controls with a specifically designed questionnaire and interviewed them, asking them to describe the strange-face apparitions. Apparitions of strange faces in the mirror were greatly reduced in depression patients compared to healthy controls. Compared to healthy controls, depression patients showed a shorter duration of apparitions, a smaller number of strange faces, lower self-evaluation ratings of apparition strength, and lower self-evaluation ratings of provoked emotion. These decreases in depression may be produced by deficits in facial expression and in facial recognition of emotions, which are involved in the relationship between the patient (or the patient's ego) and his face image (or the patient's bodily self) that is reflected in the mirror.

  19. Emotion recognition abilities across stimulus modalities in schizophrenia and the role of visual attention.

    Science.gov (United States)

    Simpson, Claire; Pinkham, Amy E; Kelsven, Skylar; Sasson, Noah J

    2013-12-01

    Emotion can be expressed by both the voice and face, and previous work suggests that presentation modality may impact emotion recognition performance in individuals with schizophrenia. We investigated the effect of stimulus modality on emotion recognition accuracy and the potential role of visual attention to faces in emotion recognition abilities. Thirty-one patients who met DSM-IV criteria for schizophrenia (n=8) or schizoaffective disorder (n=23) and 30 non-clinical control individuals participated. Both groups identified emotional expressions in three different conditions: audio only, visual only, and combined audiovisual. In the visual only and combined conditions, time spent visually fixating salient features of the face was recorded. Patients were significantly less accurate than controls in emotion recognition during both the audio and visual only conditions but did not differ from controls on the combined condition. Analysis of visual scanning behaviors demonstrated that patients attended to the mouth less than healthy individuals did in the visual condition but did not differ in visual attention to salient facial features in the combined condition, which may in part explain the absence of a deficit for patients in this condition. Collectively, these findings demonstrate that patients benefit from multimodal stimulus presentations of emotion and support hypotheses that visual attention to salient facial features may serve as a mechanism for accurate emotion identification. © 2013.

  20. Do complaints of everyday cognitive failures in high schizotypy relate to emotional working memory deficits in the lab?

    Science.gov (United States)

    Carrigan, Nicole; Barkus, Emma; Ong, Adriel; Wei, Maryann

    2017-10-01

    Individuals high on schizotypy complain of increased cognitive failures in everyday life. However, the neuropsychological performance of this group does not consistently indicate underlying ability deficits. It is possible that current neuropsychological tests lack ecological validity. Given the increased affective reactivity of high schizotypes, they may be more sensitive to emotional content interfering with cognitive ability. This study sought to explore whether an affective n-back working memory task would elicit impaired performance in schizotypy, echoing complaints concerning real world cognition. 127 healthy participants completed self-report measures of schizotypy and cognitive failures and an affective n-back working memory task. This task was varied across three levels of load (1- to 3-back) and four types of stimulus emotion (neutral, fearful, happy, sad). Differences between high (n=39) and low (n=48) schizotypy groups on performance outcomes of hits and false alarms were examined, with emotion and load as within-groups variables. As expected, high schizotypes reported heightened vulnerability to cognitive failures. They also demonstrated a relative working memory impairment for emotional versus neutral stimuli, whereas low schizotypes did not. High schizotypes performed most poorly in response to fearful stimuli. For false alarms, there was an interaction between schizotypy, load, and emotion, such that high schizotypy was associated with deficits in response to fearful stimuli only at higher levels of task difficulty. Inclusion of self-reported cognitive failures did not account for this. These findings suggest that the "gap" between subjective and objective cognition in schizotypy may reflect the heightened emotional demands associated with cognitive functioning in the real world, although other factors also seem to play a role. There is a need to improve the ecological validity of objective assessments, whilst also recognizing that self
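
    The performance outcomes analysed above (hits and false alarms broken down by n-back load and stimulus emotion) amount to a per-condition aggregation over trials. The sketch below uses a small, entirely hypothetical trial log just to show the shape of that computation; it is not the authors' scoring code.

```python
import pandas as pd

# Hypothetical trial log from the affective n-back: one row per trial, with the
# load level (1- to 3-back), stimulus emotion, whether the trial was a target,
# and whether the participant responded "match".
trials = pd.DataFrame({
    "load":      [1, 1, 2, 2, 3, 3, 3, 3],
    "emotion":   ["neutral", "fearful", "happy", "fearful",
                  "sad", "fearful", "neutral", "fearful"],
    "is_target": [True, True, False, True, False, True, False, False],
    "responded": [True, False, False, True, True, True, False, True],
})

hits = (trials[trials["is_target"]]
        .groupby(["load", "emotion"])["responded"].mean().rename("hit_rate"))
false_alarms = (trials[~trials["is_target"]]
                .groupby(["load", "emotion"])["responded"].mean()
                .rename("false_alarm_rate"))
print(pd.concat([hits, false_alarms], axis=1))
```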

  1. Mesial temporal lobe epilepsy diminishes functional connectivity during emotion perception.

    Science.gov (United States)

    Steiger, Bettina K; Muller, Angela M; Spirig, Esther; Toller, Gianina; Jokeit, Hennric

    2017-08-01

    Unilateral mesial temporal lobe epilepsy (MTLE) has been associated with impaired recognition of emotional facial expressions. Correspondingly, imaging studies showed decreased activity of the amygdala and cortical face processing regions in response to emotional faces. However, functional connectivity among regions involved in emotion perception has not been studied so far. To address this, we examined intrinsic functional connectivity (FC) modulated by the perception of dynamic fearful faces among the amygdala and limbic, frontal, temporal and brainstem regions. Regions of interest were identified in an activation analysis by presenting a block-design with dynamic fearful faces and dynamic landscapes to 15 healthy individuals. This led to 10 predominately right-hemispheric regions. Functional connectivity between these regions during the perception of fearful faces was examined in drug-refractory patients with left- (n=16) or right-sided (n=17) MTLE, epilepsy patients with extratemporal seizure onset (n=15) and a second group of 15 healthy controls. Healthy controls showed a widespread functional network modulated by the perception of fearful faces that encompassed bilateral amygdalae, limbic, cortical, subcortical and brainstem regions. In patients with left MTLE, a downsized network of frontal and temporal regions centered on the right amygdala was present. Patients with right MTLE showed almost no significant functional connectivity. A maintained network in the epilepsy control group indicates that findings in mesial temporal lobe epilepsy could not be explained by clinical factors such as seizures and antiepileptic medication. Functional networks underlying facial emotion perception are considerably changed in left and right MTLE. Alterations are present for both hemispheres in either MTLE group, but are more pronounced in right MTLE. Disruption of the functional network architecture possibly contributes to deficits in facial emotion recognition frequently
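
    At its simplest, the ROI-based functional connectivity described above comes down to correlating region-of-interest time series within the condition of interest and contrasting conditions. The sketch below assumes ROI time series have already been extracted and ignores the block design, nuisance regression, and group statistics of the actual analysis; all names and array sizes are placeholders.

```python
import numpy as np

# Hypothetical ROI time series: time points x regions, one matrix per condition
# (volumes acquired during fearful-face blocks vs. during landscape blocks).
rng = np.random.default_rng(3)
roi_names = ["R amygdala", "L amygdala", "R fusiform", "R IFG", "R STS"]
faces_ts = rng.normal(size=(120, len(roi_names)))
landscapes_ts = rng.normal(size=(120, len(roi_names)))

def fc_matrix(ts):
    """Pairwise Pearson correlations between ROI time series, Fisher z-transformed."""
    r = np.corrcoef(ts, rowvar=False)
    np.fill_diagonal(r, 0.0)          # drop self-correlations before the z-transform
    return np.arctanh(r)

# Connectivity modulated by fearful-face perception: faces minus landscapes
modulation = fc_matrix(faces_ts) - fc_matrix(landscapes_ts)
i, j = roi_names.index("R amygdala"), roi_names.index("R fusiform")
print(f"fear-modulated coupling {roi_names[i]} - {roi_names[j]}: {modulation[i, j]:.3f}")
```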

  2. BESST (Bochum Emotional Stimulus Set)--a pilot validation study of a stimulus set containing emotional bodies and faces from frontal and averted views.

    Science.gov (United States)

    Thoma, Patrizia; Soria Bauser, Denise; Suchan, Boris

    2013-08-30

    This article introduces the freely available Bochum Emotional Stimulus Set (BESST), which contains pictures of bodies and faces depicting either a neutral expression or one of the six basic emotions (happiness, sadness, fear, anger, disgust, and surprise), presented from two different perspectives (0° frontal view vs. camera averted by 45° to the left). The set comprises 565 frontal view and 564 averted view pictures of real-life bodies with masked facial expressions and 560 frontal and 560 averted view faces which were synthetically created using the FaceGen 3.5 Modeller. All stimuli were validated in terms of categorization accuracy and the perceived naturalness of the expression. Additionally, each facial stimulus was morphed into three age versions (20/40/60 years). The results show high recognition of the intended facial expressions, even under speeded forced-choice conditions, as corresponds to common experimental settings. The average naturalness ratings for the stimuli range between medium and high. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  3. Mapping structural covariance networks of facial emotion recognition in early psychosis: A pilot study.

    Science.gov (United States)

    Buchy, Lisa; Barbato, Mariapaola; Makowski, Carolina; Bray, Signe; MacMaster, Frank P; Deighton, Stephanie; Addington, Jean

    2017-11-01

    People with psychosis show deficits in recognizing facial emotions and disrupted activation in the underlying neural circuitry. We evaluated associations between facial emotion recognition and cortical thickness using a correlation-based approach to map structural covariance networks across the brain. Fifteen people with an early psychosis provided magnetic resonance scans and completed the Penn Emotion Recognition and Differentiation tasks. Fifteen historical controls provided magnetic resonance scans. Cortical thickness was computed using CIVET and analyzed with linear models. Seed-based structural covariance analysis was done using the 'mapping anatomical correlations across the cerebral cortex' methodology. To map structural covariance networks involved in facial emotion recognition, the right somatosensory cortex and bilateral fusiform face areas were selected as seeds. Statistics were run in SurfStat. Findings showed increased cortical covariance between the right fusiform face region seed and right orbitofrontal cortex in controls compared with early psychosis subjects. Facial emotion recognition scores were not significantly associated with thickness in any region. A negative effect of Penn Differentiation scores on cortical covariance was seen between the left fusiform face area seed and right superior parietal lobule in early psychosis subjects. Results suggest that facial emotion recognition ability is related to covariance in a temporal-parietal network in early psychosis. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Happy faces are preferred regardless of familiarity--sad faces are preferred only when familiar.

    Science.gov (United States)

    Liao, Hsin-I; Shimojo, Shinsuke; Yeh, Su-Ling

    2013-06-01

    Familiarity leads to preference (e.g., the mere exposure effect), yet it remains unknown whether it is objective familiarity, that is, repetitive exposure, or subjective familiarity that contributes to preference. In addition, it is unexplored whether and how different emotions influence familiarity-related preference. The authors investigated whether happy or sad faces are preferred or perceived as more familiar and whether this subjective familiarity judgment correlates with preference for different emotional faces. An emotional face--happy or sad--was paired with a neutral face, and participants rated the relative preference and familiarity of each of the paired faces. For preference judgment, happy faces were preferred and sad faces were less preferred, compared with neutral faces. For familiarity judgment, happy faces did not show any bias, but sad faces were perceived as less familiar than neutral faces. Item-by-item correlational analyses show preference for sad faces--but not happy faces--positively correlate with familiarity. These results suggest a direct link between positive emotion and preference, and argue at least partly against a common cause for familiarity and preference. Instead, facial expression of different emotional valence modulates the link between familiarity and preference.

  5. Neural Correlates of Task-Irrelevant First and Second Language Emotion Words — Evidence from the Face-Word Stroop Task

    Directory of Open Access Journals (Sweden)

    Lin Fan

    2016-11-01

    Full Text Available Emotionally valenced words have thus far not been empirically examined in a bilingual population with the emotional face-word Stroop paradigm. Chinese-English bilinguals were asked to identify the facial expressions of emotion with their first (L1) or second (L2) language task-irrelevant emotion words superimposed on the face pictures. We attempted to examine how the emotional content of words modulates behavioral performance and cerebral functioning in the bilinguals' two languages. The results indicated that there were significant congruency effects for both L1 and L2 emotion words, and that identifiable differences in the magnitude of Stroop effect between the two languages were also observed, suggesting L1 is more capable of activating the emotional response to word stimuli. For event-related potentials (ERPs) data, an N350-550 effect was observed only in the L1 task with greater negativity for incongruent than congruent trials. The size of the N350-550 effect differed across languages, whereas no identifiable language distinction was observed in the effect of the conflict slow potential (conflict SP). Finally, more pronounced negative amplitude at 230-330 ms was observed in L1 than in L2, but only for incongruent trials. This negativity, likened to an orthographic decoding N250, may reflect the extent of attention to emotion word processing at the word-form level, while the N350-550 reflects a complicated set of processes in the conflict processing. Overall, the face-word congruency effect has reflected identifiable language distinction at 230-330 and 350-550 ms, which provides supporting evidence for the theoretical proposals assuming attenuated emotionality of L2 processing.

  6. Emotional Lability in Children and Adolescents with Attention Deficit/Hyperactivity Disorder (ADHD): Clinical Correlates and Familial Prevalence

    Science.gov (United States)

    Sobanski, Esther; Banaschewski, Tobias; Asherson, Philip; Buitelaar, Jan; Chen, Wai; Franke, Barbara; Holtmann, Martin; Krumm, Bertram; Sergeant, Joseph; Sonuga-Barke, Edmund; Stringaris, Argyris; Taylor, Eric; Anney, Richard; Ebstein, Richard P.; Gill, Michael; Miranda, Ana; Mulas, Fernando; Oades, Robert D.; Roeyers, Herbert; Rothenberger, Aribert; Steinhausen, Hans-Christoph; Faraone, Stephen V.

    2010-01-01

    Background: The goal of this study was to investigate the occurrence, severity and clinical correlates of emotional lability (EL) in children with attention deficit/hyperactivity disorder (ADHD), and to examine factors contributing to EL and familiality of EL in youth with ADHD. Methods: One thousand, one hundred and eighty-six children with ADHD…

  7. Theory of mind and its relationship with executive functions and emotion recognition in borderline personality disorder.

    Science.gov (United States)

    Baez, Sandra; Marengo, Juan; Perez, Ana; Huepe, David; Font, Fernanda Giralt; Rial, Veronica; Gonzalez-Gadea, María Luz; Manes, Facundo; Ibanez, Agustin

    2015-09-01

    Impaired social cognition has been claimed to be a mechanism underlying the development and maintenance of borderline personality disorder (BPD). One important aspect of social cognition is the theory of mind (ToM), a complex skill that seems to be influenced by more basic processes, such as executive functions (EF) and emotion recognition. Previous ToM studies in BPD have yielded inconsistent results. This study assessed the performance of BPD adults on ToM, emotion recognition, and EF tasks. We also examined whether EF and emotion recognition could predict the performance on ToM tasks. We evaluated 15 adults with BPD and 15 matched healthy controls using different tasks of EF, emotion recognition, and ToM. The results showed that BPD adults exhibited deficits in the three domains, which seem to be task-dependent. Furthermore, we found that EF and emotion recognition predicted the performance on ToM. Our results suggest that tasks that involve real-life social scenarios and contextual cues are more sensitive to detect ToM and emotion recognition deficits in BPD individuals. Our findings also indicate that (a) ToM variability in BPD is partially explained by individual differences on EF and emotion recognition; and (b) ToM deficits of BPD patients are partially explained by the capacity to integrate cues from face, prosody, gesture, and social context to identify the emotions and others' beliefs. © 2014 The British Psychological Society.

  8. Gender Differences in Human Single Neuron Responses to Male Emotional Faces

    Directory of Open Access Journals (Sweden)

    Morgan eNewhoff

    2015-09-01

    Full Text Available Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 (6 male) epileptic patients in four brain areas: amygdala (236 neurons), hippocampus (n=270), anterior cingulate cortex (n=256), and ventromedial prefrontal cortex (n=174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n=15/66) of neurons in men were significantly affected by facial emotion, versus 8% (n=6/76) of neurons in women. A Fisher's exact test comparing the two ratios found a highly significant difference between the two (p<0.01). These results show specific differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.

  9. When dynamic, the head and face alone can express pride.

    Science.gov (United States)

    Nelson, Nicole L; Russell, James A

    2011-08-01

    Prior research suggested that pride is recognized only when a head and facial expression (e.g., tilted head with a slight smile) is combined with a postural expression (e.g., expanded body and arm gestures). However, these studies used static photographs. In the present research, participants labeled the emotion conveyed by four dynamic cues to pride, presented as video clips: head and face alone, body posture alone, voice alone, and an expression in which head and face, body posture, and voice were presented simultaneously. Participants attributed pride to the head and face alone, even when postural or vocal information was absent. Pride can be conveyed without body posture or voice. 2011 APA, all rights reserved

  10. Elevated responses to constant facial emotions in different faces in the human amygdala: an fMRI study of facial identity and expression

    Directory of Open Access Journals (Sweden)

    Weiller Cornelius

    2004-11-01

    Full Text Available Abstract Background Human faces provide important signals in social interactions by conveying two main types of information: individual identity and emotional expression. The ability to readily assess both the variability and the consistency of emotional expressions in different individuals is central to one's own interpretation of the imminent environment. A factorial design was used to systematically test the interaction of either constant or variable emotional expressions with constant or variable facial identities in areas involved in face processing using functional magnetic resonance imaging. Results Previous studies suggest a predominant role of the amygdala in the assessment of emotional variability. Here we extend this view by showing that this structure activated to faces with changing identities that display constant emotional expressions. Within this condition, amygdala activation was dependent on the type and intensity of displayed emotion, with significant responses to fearful expressions and, to a lesser extent, to neutral and happy expressions. In contrast, the lateral fusiform gyrus showed a binary pattern of increased activation to changing stimulus features while it was also differentially responsive to the intensity of displayed emotion when processing different facial identities. Conclusions These results suggest that the amygdala might serve to detect constant facial emotions in different individuals, complementing its established role for detecting emotional variability.

  11. An Event-Related Potential Study on the Effects of Cannabis on Emotion Processing

    Science.gov (United States)

    Troup, Lucy J.; Bastidas, Stephanie; Nguyen, Maia T.; Andrzejewski, Jeremy A.; Bowers, Matthew; Nomi, Jason S.

    2016-01-01

    The effect of cannabis on emotional processing was investigated using event-related potential paradigms (ERPs). ERPs associated with emotional processing of cannabis users, and non-using controls, were recorded and compared during an implicit and explicit emotional expression recognition and empathy task. Comparisons in P3 component mean amplitudes were made between cannabis users and controls. Results showed a significant decrease in the P3 amplitude in cannabis users compared to controls. Specifically, cannabis users showed reduced P3 amplitudes for implicit compared to explicit processing over centro-parietal sites which reversed, and was enhanced, at fronto-central sites. Cannabis users also showed a decreased P3 to happy faces, with an increase to angry faces, compared to controls. These effects appear to increase with those participants that self-reported the highest levels of cannabis consumption. Those cannabis users with the greatest consumption rates showed the largest P3 deficits for explicit processing and negative emotions. These data suggest that there is a complex relationship between cannabis consumption and emotion processing that appears to be modulated by attention. PMID:26926868

  12. The effect of comorbid depression on facial and prosody emotion recognition in first-episode schizophrenia spectrum.

    Science.gov (United States)

    Herniman, Sarah E; Allott, Kelly A; Killackey, Eóin; Hester, Robert; Cotton, Sue M

    2017-01-15

    Comorbid depression is common in first-episode schizophrenia spectrum (FES) disorders. Both depression and FES are associated with significant deficits in facial and prosody emotion recognition performance. However, it remains unclear whether people with FES and comorbid depression, compared to those without comorbid depression, have overall poorer emotion recognition, or instead, a different pattern of emotion recognition deficits. The aim of this study was to compare facial and prosody emotion recognition performance between those with and without comorbid depression in FES. This study involved secondary analysis of baseline data from a randomized controlled trial of vocational intervention for young people with first-episode psychosis (N=82; age range: 15-25 years). Those with comorbid depression (n=24) had more accurate recognition of sadness in faces compared to those without comorbid depression. Severity of depressive symptoms was also associated with more accurate recognition of sadness in faces. Such results did not recur for prosody emotion recognition. In addition to the cross-sectional design, limitations of this study include the absence of facial and prosodic recognition of neutral emotions. Findings indicate a mood congruent negative bias in facial emotion recognition in those with comorbid depression and FES, and provide support for cognitive theories of depression that emphasise the role of such biases in the development and maintenance of depression. Longitudinal research is needed to determine whether mood-congruent negative biases are implicated in the development and maintenance of depression in FES, or whether such biases are simply markers of depressed state. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents

    Directory of Open Access Journals (Sweden)

    Bianca G. van den Bulk

    2016-10-01

    Full Text Available Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N = 25), adolescents with CSA-related PTSD (N = 19) and healthy controls (N = 26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD group reported more anxiety to fearful and neutral faces than adolescents from the control group and adolescents from the CSA-related PTSD group reacted slower compared to the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala.

  14. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents.

    Science.gov (United States)

    van den Bulk, Bianca G; Somerville, Leah H; van Hoof, Marie-José; van Lang, Natasja D J; van der Wee, Nic J A; Crone, Eveline A; Vermeiren, Robert R J M

    2016-10-01

    Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N=25), adolescents with CSA-related PTSD (N=19) and healthy controls (N=26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD group reported more anxiety to fearful and neutral faces than adolescents from the control group and adolescents from the CSA-related PTSD group reacted slower compared to the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  15. Do bodily expressions compete with facial expressions? Time course of integration of emotional signals from the face and the body.

    Science.gov (United States)

    Gu, Yuanyuan; Mai, Xiaoqin; Luo, Yue-jia

    2013-01-01

    The decoding of social signals from nonverbal cues plays a vital role in the social interactions of socially gregarious animals such as humans. Because nonverbal emotional signals from the face and body are normally seen together, it is important to investigate the mechanism underlying the integration of emotional signals from these two sources. We conducted a study in which the time course of the integration of facial and bodily expressions was examined via analysis of event-related potentials (ERPs) while the focus of attention was manipulated. Distinctive integrating features were found during multiple stages of processing. In the first stage, threatening information from the body was extracted automatically and rapidly, as evidenced by enhanced P1 amplitudes when the subjects viewed compound face-body images with fearful bodies compared with happy bodies. In the second stage, incongruency between emotional information from the face and the body was detected and captured by N2. Incongruent compound images elicited larger N2s than did congruent compound images. The focus of attention modulated the third stage of integration. When the subjects' attention was focused on the face, images with congruent emotional signals elicited larger P3s than did images with incongruent signals, suggesting more sustained attention and elaboration of congruent emotional information extracted from the face and body. On the other hand, when the subjects' attention was focused on the body, images with fearful bodies elicited larger P3s than did images with happy bodies, indicating more sustained attention and elaboration of threatening information from the body during evaluative processes.

  16. Electrophysiological Evidence for Adult Age-Related Sparing and Decrements in Emotion Perception and Attention

    Directory of Open Access Journals (Sweden)

    Joshua W. Pollock

    2012-08-01

    Full Text Available The present study examined adult age differences in processing emotional faces using a psychological refractory period (PRP) paradigm. We used both behavioral and event-related potential (P1) component measures. Task 1 was tone discrimination (fuzzy vs. pure tones) and Task 2 was emotional facial discrimination (happy vs. angry faces). The stimulus onset asynchrony (SOA) between the two tasks was 100 ms, 300 ms, and 900 ms. Earlier research observed age deficits in emotional facial discrimination for negative (angry) than for positive (happy) faces (Baena et al., 2010). Thus, we predicted that older adults would show decreased attentional efficiency in carrying out dual-task processing on the P1 (a component linked to amygdalar modulation of visual perception; Rotshtein et al., 2010). Both younger and older groups showed significantly higher P1 amplitudes at 100- and 300-ms SOAs than at the 900-ms SOA, and this suggests that both age groups could process Task 2 emotions without central attention. Also, younger adults showed significantly higher P1 activations for angry than for happy faces, but older adults showed no difference. These results are consistent with the idea that younger adults exhibited amygdalar modulation of visual perception, but that older adults did not.

  17. Dissociable neural effects of stimulus valence and preceding context during the inhibition of responses to emotional faces.

    Science.gov (United States)

    Schulz, Kurt P; Clerkin, Suzanne M; Halperin, Jeffrey M; Newcorn, Jeffrey H; Tang, Cheuk Y; Fan, Jin

    2009-09-01

    Socially appropriate behavior requires the concurrent inhibition of actions that are inappropriate in the context. This self-regulatory function requires an interaction of inhibitory and emotional processes that recruits brain regions beyond those engaged by either processes alone. In this study, we isolated brain activity associated with response inhibition and emotional processing in 24 healthy adults using event-related functional magnetic resonance imaging (fMRI) and a go/no-go task that independently manipulated the context preceding no-go trials (ie, number of go trials) and the valence (ie, happy, sad, and neutral) of the face stimuli used as trial cues. Parallel quadratic trends were seen in correct inhibitions on no-go trials preceded by increasing numbers of go trials and associated activation for correct no-go trials in inferior frontal gyrus pars opercularis, pars triangularis, and pars orbitalis, temporoparietal junction, superior parietal lobule, and temporal sensory association cortices. Conversely, the comparison of happy versus neutral faces and sad versus neutral faces revealed valence-dependent activation in the amygdala, anterior insula cortex, and posterior midcingulate cortex. Further, an interaction between inhibition and emotion was seen in valence-dependent variations in the quadratic trend in no-go activation in the right inferior frontal gyrus and left posterior insula cortex. These results suggest that the inhibition of response to emotional cues involves the interaction of partly dissociable limbic and frontoparietal networks that encode emotional cues and use these cues to exert inhibitory control over the motor, attention, and sensory functions needed to perform the task, respectively. 2008 Wiley-Liss, Inc.

  18. Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?

    Science.gov (United States)

    Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K

    2017-12-01

    Disgust is a core emotion evolved to detect and avoid the ingestion of poisonous food as well as contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or if individual differences in trait body odor disgust may influence the processing of disgust-related information. In this preregistered study, we tested whether a classification of dynamic facial expressions as happy or disgusted, and an emotional evaluation of these facial expressions, would be affected by individual differences in body odor disgust sensitivity, and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect recognition of emotion in dynamic faces even when body odor disgust sensitivity was used as a moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs for faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect and suggest that further studies should focus on reproducibility, specifying experimental circumstances where odor effects on facial expressions may be present versus absent.

  19. Facial Expression Aftereffect Revealed by Adaption to Emotion-Invisible Dynamic Bubbled Faces

    Science.gov (United States)

    Luo, Chengwen; Wang, Qingyun; Schyns, Philippe G.; Kingdom, Frederick A. A.; Xu, Hong

    2015-01-01

    Visual adaptation is a powerful tool to probe the short-term plasticity of the visual system. Adapting to local features such as oriented lines can distort our judgment of subsequently presented lines (the tilt aftereffect). The tilt aftereffect is believed to be processed at low levels of the visual cortex, such as V1. Adaptation to faces, on the other hand, can produce significant aftereffects in high-level traits such as identity, expression, and ethnicity. However, whether face adaptation necessitates awareness of face features is debatable. In the current study, we investigated whether facial expression aftereffects (FEAE) can be generated by partially visible faces. We first generated partially visible faces using the bubbles technique, in which the face was seen through randomly positioned circular apertures, and selected the bubbled faces for which the subjects were unable to identify happy or sad expressions. When the subjects adapted to static displays of these partial faces, no significant FEAE was found. However, when the subjects adapted to a dynamic video display of a series of different partial faces, a significant FEAE was observed. In both conditions, subjects could not identify facial expression in the individual adapting faces. These results suggest that our visual system is able to integrate unrecognizable partial faces over a short period of time and that the integrated percept affects our judgment on subsequently presented faces. We conclude that FEAE can be generated by partial faces with few facial expression cues, implying that our cognitive system fills in the missing parts during adaptation, or that subcortical structures are activated by the bubbled faces without conscious recognition of emotion during adaptation. PMID:26717572
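
    The 'bubbles' masking referred to above (a face visible only through randomly positioned circular apertures) is straightforward to simulate; the image size, number of apertures, and radius below are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def bubble_mask(height, width, n_bubbles, radius, rng):
    """Binary mask revealing an image only inside randomly placed circular apertures."""
    yy, xx = np.mgrid[0:height, 0:width]
    mask = np.zeros((height, width), dtype=bool)
    for _ in range(n_bubbles):
        cy, cx = rng.integers(0, height), rng.integers(0, width)
        mask |= (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
    return mask

rng = np.random.default_rng(4)
face = rng.random((256, 256))                 # placeholder for a grey-scale face image
mask = bubble_mask(256, 256, n_bubbles=12, radius=20, rng=rng)
bubbled_face = np.where(mask, face, 0.5)      # grey background outside the apertures
print(f"proportion of the face left visible: {mask.mean():.2%}")
```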

  20. [Emotional intelligence and oscillatory responses on the emotional facial expressions].

    Science.gov (United States)

    Kniazev, G G; Mitrofanova, L G; Bocharov, A V

    2013-01-01

    Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18-30 years. Participants were instructed to evaluate the emotional expression (angry, happy, or neutral) of each presented face on an analog scale ranging from -100 (very hostile) to +100 (very friendly). Participants high in emotional intelligence (EI) were found to be more sensitive to the emotional content of the stimuli. This was evident both in their subjective evaluation of the stimuli and in a stronger EEG theta synchronization at an earlier processing stage (between 100 and 500 ms after face presentation). Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500-870 ms), event-related theta synchronization in high-EI subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon the presentation of angry faces. This suggests the existence of a mechanism that can selectively enhance positive emotions and reduce negative emotions.

  1. The Relative Power of an Emotion's Facial Expression, Label, and Behavioral Consequence to Evoke Preschoolers' Knowledge of Its Cause

    Science.gov (United States)

    Widen, Sherri C.; Russell, James A.

    2004-01-01

    Lay people and scientists alike assume that, especially for young children, facial expressions are a strong cue to another's emotion. We report a study in which children (N=120; 3-4 years) described events that would cause basic emotions (surprise, fear, anger, disgust, sadness), each presented as its facial expression, as its label, or as its…

  2. Face puzzle—two new video-based tasks for measuring explicit and implicit aspects of facial emotion recognition

    Science.gov (United States)

    Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel

    2013-01-01

    Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent accounts in social cognition suggest a dual-process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology, the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in the respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures further supported the tasks' external validity. Between-group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition in healthy compared with ASD participants. In addition, an increased magnitude of between-group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive

  3. Effects of Acute Alcohol Consumption on the Processing of Emotion in Faces: Implications for Understanding Alcohol-Related Aggression

    Science.gov (United States)

    Attwood, Angela S.; Munafò, Marcus R.

    2016-01-01

    The negative consequences of chronic alcohol abuse are well known, but heavy episodic consumption ("binge drinking") is also associated with significant personal and societal harms. Aggressive tendencies are increased after alcohol consumption, but the mechanisms underlying these changes are not fully understood. While effects on behavioural control are likely to be important, other effects may be involved given the widespread action of alcohol. Altered processing of social signals is associated with changes in social behaviours, including aggression, but until recently there has been little research investigating the effects of acute alcohol consumption on these outcomes. Recent work investigating the effects of acute alcohol on emotional face processing has suggested reduced sensitivity to submissive signals (sad faces) and an increased perceptual bias towards provocative signals (angry faces) after alcohol consumption, which may play a role in alcohol-related aggression. Here we discuss a putative mechanism that may explain how alcohol consumption influences emotional processing and subsequent aggressive responding, via disruption of orbitofrontal cortex (OFC)-amygdala connectivity. While the importance of emotional processing for social behaviours is well established, research into acute alcohol consumption and emotional processing is still in its infancy. Further research is needed, and we outline a research agenda to address gaps in the literature. PMID:24920135

  4. Does aging impair first impression accuracy? Differentiating emotion recognition from complex social inferences.

    Science.gov (United States)

    Krendl, Anne C; Rule, Nicholas O; Ambady, Nalini

    2014-09-01

    Young adults can be surprisingly accurate at making inferences about people from their faces. Although these first impressions have important consequences for both the perceiver and the target, it remains an open question whether first impression accuracy is preserved with age. Specifically, could age differences in impressions toward others stem from age-related deficits in accurately detecting complex social cues? Research on aging and impression formation suggests that young and older adults show relative consensus in their first impressions, but it is unknown whether they differ in accuracy. It has been widely shown that aging disrupts emotion recognition accuracy, and that these impairments may predict deficits in other social judgments, such as detecting deceit. However, it is unclear whether general impression formation accuracy (e.g., emotion recognition accuracy, detecting complex social cues) relies on similar or distinct mechanisms. It is important to examine this question to evaluate how, if at all, aging might affect overall accuracy. Here, we examined whether aging impaired first impression accuracy in predicting real-world outcomes and categorizing social group membership. Specifically, we studied whether emotion recognition accuracy and age-related cognitive decline (which has been implicated in exacerbating deficits in emotion recognition) predict first impression accuracy. Our results revealed that emotion recognition accuracy did not predict first impression accuracy, nor did age-related cognitive decline impair it. These findings suggest that domains of social perception outside of emotion recognition may rely on mechanisms that are relatively unimpaired by aging. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  5. Dispositional fear, negative affectivity, and neuroimaging response to visually suppressed emotional faces.

    Science.gov (United States)

    Vizueta, Nathalie; Patrick, Christopher J; Jiang, Yi; Thomas, Kathleen M; He, Sheng

    2012-01-02

    "Invisible" stimulus paradigms provide a method for investigating basic affective processing in clinical and non-clinical populations. Neuroimaging studies utilizing continuous flash suppression (CFS) have shown increased amygdala response to invisible fearful versus neutral faces. The current study used CFS in conjunction with functional MRI to test for differences in brain reactivity to visible and invisible emotional faces in relation to two distinct trait dimensions relevant to psychopathology: negative affectivity (NA) and fearfulness. Subjects consisted of college students (N=31) assessed for fear/fearlessness along with dispositional NA. The main brain regions of interest included the fusiform face area (FFA), superior temporal sulcus (STS), and amygdala. Higher NA, but not trait fear, was associated with enhanced response to fearful versus neutral faces in STS and right amygdala (but not FFA), within the invisible condition specifically. The finding that NA rather than fearfulness predicted degree of amygdala reactivity to suppressed faces implicates the input subdivision of the amygdala in the observed effects. Given the central role of NA in anxiety and mood disorders, the current data also support use of the CFS methodology for investigating the neurobiology of these disorders. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Neural markers of emotional face perception across psychotic disorders and general population.

    Science.gov (United States)

    Sabharwal, Amri; Kotov, Roman; Szekely, Akos; Leung, Hoi-Chung; Barch, Deanna M; Mohanty, Aprajita

    2017-07-01

    There is considerable variation in negative and positive symptoms of psychosis, global functioning, and emotional face perception (EFP), not only in schizophrenia but also in other psychotic disorders and healthy individuals. However, EFP impairment and its association with worse symptoms and global functioning have been examined largely in the domain of schizophrenia. The present study adopted a dimensional approach to examine the association of behavioral and neural measures of EFP with symptoms of psychosis and global functioning across individuals with schizophrenia spectrum (SZ; N = 28) and other psychotic (OP; N = 29) disorders, and never-psychotic participants (NP; N = 21). Behavioral and functional MRI data were recorded as participants matched emotional expressions of faces and geometrical shapes. Lower accuracy and increased activity in early visual regions, hippocampus, and amygdala during emotion versus shape matching were associated with higher negative, but not positive, symptoms and lower global functioning, across all participants. This association remained even after controlling for group-related (SZ, OP, and NP) variance, dysphoria, and antipsychotic medication status, except in amygdala. Furthermore, negative symptoms mediated the relationship between behavioral and brain EFP measures and global functioning. This study provides some of the first evidence supporting the specific relationship of EFP measures with negative symptoms and global functioning across psychotic and never-psychotic samples, and transdiagnostically across different psychotic disorders. Present findings help bridge the gap between basic EFP-related neuroscience research and clinical research in psychosis, and highlight EFP as a potential symptom-specific marker that tracks global functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
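
    The mediation result reported above (negative symptoms mediating the link between EFP measures and global functioning) is typically estimated with a product-of-coefficients approach. The following is a minimal sketch of that approach, not the study's code; the file and variable names are hypothetical placeholders.

    # Simple mediation sketch: a-path (EFP -> negative symptoms), b-path (negative
    # symptoms -> global functioning, controlling for EFP), indirect effect = a * b,
    # with a percentile bootstrap for its confidence interval.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("efp_psychosis.csv")  # hypothetical: efp_accuracy, neg_symptoms, global_functioning

    a = smf.ols("neg_symptoms ~ efp_accuracy", data=df).fit().params["efp_accuracy"]
    b = smf.ols("global_functioning ~ neg_symptoms + efp_accuracy", data=df).fit().params["neg_symptoms"]
    indirect = a * b

    boot = []
    for _ in range(2000):
        sample = df.sample(frac=1.0, replace=True)
        a_i = smf.ols("neg_symptoms ~ efp_accuracy", data=sample).fit().params["efp_accuracy"]
        b_i = smf.ols("global_functioning ~ neg_symptoms + efp_accuracy", data=sample).fit().params["neg_symptoms"]
        boot.append(a_i * b_i)
    print("indirect effect:", indirect, "95% CI:", np.percentile(boot, [2.5, 97.5]))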

  7. Compelled commercial speech: the Food and Drug Administration's effort to smoke out the tobacco industry through graphic warning labels.

    Science.gov (United States)

    Haynes, Bryan M; Andrews, Anne Hampton; Jacob, C Reade

    2013-01-01

    FDA's proposed graphic warning labels for cigarette packages have been scrutinized for potentially violating the First Amendment's free speech clause. This article addresses the distinction between the commercial speech and compelled speech doctrines and their applicability in analyzing the constitutionality of the labels. The government's position is that the labels evoke an emotional response and educate consumers, while tobacco companies argue that the labels forcibly promote the government's message. Two federal appellate courts, applying different legal standards, have arrived at different conclusions. This article advocates that the Supreme Court, if faced with review of the labels, should apply strict scrutiny and declare the labels unconstitutional.

  8. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    Science.gov (United States)

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus process this facial dimension independently from features (which are impaired in CP) and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed spared configural processing of non-emotional facial expressions (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  9. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    OpenAIRE

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybr...

  10. Clinical features and subjective/physiological responses to emotional stimuli in the presence of emotion dysregulation in attention-deficit hyperactivity disorder.

    Science.gov (United States)

    Taskiran, Candan; Karaismailoglu, Serkan; Cak Esen, Halime Tuna; Tuzun, Zeynep; Erdem, Aysen; Balkanci, Zeynep Dicle; Dolgun, Anil Barak; Cengel Kultur, Sadriye Ebru

    2018-05-01

    Emotion dysregulation (ED) has long been recognized in clinical descriptions of attention-deficit hyperactivity disorder (ADHD), but a renewed interest in ED has advanced research on the overlap between the two entities. Autonomic reactivity (AR) is a neurobiological correlate of emotion regulation; however, the association between ADHD and AR remains unclear. Our aim was to explore the clinical differences, AR, and subjective emotional responses to visual emotional stimuli in ADHD children with and without ED. School-aged ADHD children with (n = 28) and without (n = 20) ED, defined according to the deficiency in emotional self-regulation (DESR) criteria, and healthy controls (n = 22) were interviewed using the Schedule for Affective Disorders and Schizophrenia for School Aged Children-Present and Lifetime version (K-SADS-PL) to screen for psychopathologies frequent at these ages. All subjects were evaluated with the Child Behavior Checklist 6-18 (CBCL), the Strengths and Difficulties Questionnaire (SDQ), the McMaster Family Assessment Device (FAD), the School-Age Temperament Inventory (SATI), and Conners' Parent Rating Scale (CPRS-48), which were completed by parents. To evaluate emotional responses, selected pictures from the International Affective Picture System (IAPS) were presented, and subjective ratings and physiological responses (electrodermal activity and heart rate reactivity) were examined. Regarding clinically distinctive features, the ADHD+ED group differed from the ADHD-ED and control groups in terms of higher temperamental negative reactivity, more oppositional/conduct problems, and lower prosocial behaviors. In the AR measures, children in the ADHD+ED group rated unpleasant stimuli as more negative, but they still had lower heart rate reactivity (HRR) than the ADHD-ED and control groups; moreover, unlike the two other groups, the ADHD+ED group showed no differences in HRR between different emotional stimuli. The presented findings are unique in terms of their

  11. Recognition of emotional facial expressions in adolescents with anorexia nervosa and adolescents with major depression.

    Science.gov (United States)

    Sfärlea, Anca; Greimel, Ellen; Platt, Belinda; Dieler, Alica C; Schulte-Körne, Gerd

    2018-04-01

    Anorexia nervosa (AN) has been suggested to be associated with abnormalities in facial emotion recognition. Most prior studies on facial emotion recognition in AN have investigated adult samples, despite the onset of AN being particularly often during adolescence. In addition, few studies have examined whether impairments in facial emotion recognition are specific to AN or might be explained by frequent comorbid conditions that are also associated with deficits in emotion recognition, such as depression. The present study addressed these gaps by investigating recognition of emotional facial expressions in adolescent girls with AN (n = 26) compared to girls with major depression (MD; n = 26) and healthy girls (HC; n = 37). Participants completed one task requiring identification of emotions (happy, sad, afraid, angry, neutral) in faces and two control tasks. Neither of the clinical groups showed impairments. The AN group was more accurate than the HC group in recognising afraid facial expressions and more accurate than the MD group in recognising happy, sad, and afraid expressions. Misclassification analyses identified subtle group differences in the types of errors made. The results suggest that the deficits in facial emotion recognition found in adult AN samples are not present in adolescent patients. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Differences in neural and cognitive response to emotional faces in middle-aged dizygotic twins at familial risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Svendsen, A M B; Harmer, C J

    2017-01-01

    BACKGROUND: Negative bias and aberrant neural processing of emotional faces are trait-marks of depression but findings in healthy high-risk groups are conflicting. METHODS: Healthy middle-aged dizygotic twins (N = 42) underwent functional magnetic resonance imaging (fMRI): 22 twins had a co-twin history of depression (high-risk) and 20 were without co-twin history of depression (low-risk). During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task... ...the amygdala and ventral prefrontal cortex and pregenual anterior cingulate. This was accompanied by greater fear-specific fronto-temporal response and reduced fronto-occipital response to all emotional faces relative to baseline. The risk groups showed no differences in mood, subjective state or coping...

  13. Children's Recognition of Emotional Facial Expressions Through Photographs and Drawings.

    Science.gov (United States)

    Brechet, Claire

    2017-01-01

    The author's purpose was to examine children's recognition of emotional facial expressions by comparing two types of stimuli: photographs and drawings. The author aimed to investigate whether drawings could be considered a more evocative material than photographs, as a function of age and emotion. Five- and 7-year-old children were presented with photographs and drawings displaying facial expressions of 4 basic emotions (i.e., happiness, sadness, anger, and fear) and were asked to perform a matching task by pointing to the face corresponding to the target emotion labeled by the experimenter. The photographs we used were selected from the Radboud Faces Database, and the drawings were designed on the basis of both the facial components involved in the expression of these emotions and the graphic cues children tend to use when asked to depict these emotions in their own drawings. Our results show that drawings are better recognized than photographs for sadness, anger, and fear (with no difference for happiness, due to a ceiling effect), and that the difference between the 2 types of stimuli tends to be larger for 5-year-olds than for 7-year-olds. These results are discussed in view of their implications, both for future research and for practical application.

  14. Matching faces with emotional expressions

    Directory of Open Access Journals (Sweden)

    Wenfeng eChen

    2011-08-01

    Full Text Available There is some evidence that faces with a happy expression are recognized better than faces with other expressions. However, little is known about whether this happy face advantage also applies to perceptual face matching, and whether similar differences exist among other expressions. Using a sequential matching paradigm, we systematically compared the effects of seven basic facial expressions on identity recognition. Identity matching was quickest when a pair of faces had an identical happy/sad/neutral expression, poorer when they had a fearful/surprise/angry expression, and poorest when they had a disgust expression. Faces with a happy/sad/fear/surprise expression were matched faster than those with an anger/disgust expression when the second face in a pair had a neutral expression. These results demonstrate that effects of facial expression on identity recognition are not limited to happy faces when a learned face is immediately tested. The results suggest different influences of expression in perceptual matching and long-term recognition memory.

  15. Emotion knowledge in young neglected children.

    Science.gov (United States)

    Sullivan, Margaret W; Bennett, David S; Carpenter, Kim; Lewis, Michael

    2008-08-01

    Young neglected children may be at risk for emotion knowledge deficits. Children with histories of neglect or with no maltreatment were initially seen at age 4 and again 1 year later to assess their emotion knowledge. Higher IQ was associated with better emotion knowledge, but neglected children had consistently poorer emotion knowledge over time compared to non-neglected children after controlling for IQ. Because both neglected status and IQ may contribute to deficits in emotional knowledge, both should be assessed when evaluating these children to appropriately design and pace emotion knowledge interventions.

  16. The effect of oxytocin on attention to angry and happy faces in chronic depression.

    Science.gov (United States)

    Domes, Gregor; Normann, Claus; Heinrichs, Markus

    2016-04-06

    Chronic depression is characterized by a high degree of early life trauma, psychosocial impairment, and deficits in social cognition. Undisturbed recognition and processing of facial emotions are basic prerequisites for smooth social interactions. Intranasal application of the neuropeptide oxytocin has been reported to enhance emotion recognition in neuropsychiatric disorders and healthy individuals. We therefore investigated whether oxytocin modulates attention to emotional faces in patients with chronic depression. In this double-blind, randomized, controlled study, 43 patients received a single dose of oxytocin or placebo nasal spray and were tested while performing a facial dot-probe task. We assessed reaction times to neutral probes presented at the location of one of two faces depicting happy, angry, or neutral expressions as a prime. When comparing reaction times to the congruent (prime and probe at the same location) with the incongruent presentation of facial emotions, neither the placebo nor the oxytocin group showed an attentional preference for emotional facial expressions in terms of a threat bias. However, oxytocin treatment did reveal two specific effects: it generally reduced the allocation of attention towards angry facial expressions, and it increased sustained attention towards happy faces, specifically under conditions of heightened awareness, i.e. trials with longer primes. We investigated a heterogeneous group of medicated male and female patients. We conclude that oxytocin does modulate basic factors of facial emotion processing in chronic depression. Our findings encourage further investigations assessing the therapeutic potential of oxytocin in chronic depression. EUDRA-CT 2010-020956-69. Date registered: 23 February 2011.

  17. Training approach-avoidance of smiling faces affects emotional vulnerability in socially anxious individuals

    Science.gov (United States)

    Rinck, Mike; Telli, Sibel; Kampmann, Isabel L.; Woud, Marcella L.; Kerstholt, Merel; te Velthuis, Sarai; Wittkowski, Matthias; Becker, Eni S.

    2013-01-01

    Previous research revealed an automatic behavioral bias in high socially anxious individuals (HSAs): although their explicit evaluations of smiling faces are positive, they show automatic avoidance of these faces. This is reflected by faster pushing than pulling of smiling faces in an Approach-Avoidance Task (AAT; Heuer et al., 2007). The current study addressed the causal role of this avoidance bias for social anxiety. To this end, we used the AAT to train HSAs, either to approach smiling faces or to avoid them. We examined whether such an AAT training could change HSAs' automatic avoidance tendencies, and if yes, whether AAT effects would generalize to a new approach task with new facial stimuli, and to mood and anxiety in a social threat situation (a video-recorded self-presentation). We found that HSAs trained to approach smiling faces did indeed approach female faces faster after the training than HSAs trained to avoid smiling faces. Moreover, approach-faces training reduced emotional vulnerability: it led to more positive mood and lower anxiety after the self-presentation than avoid-faces training. These results suggest that automatic approach-avoidance tendencies have a causal role in social anxiety, and that they can be modified by a simple computerized training. This may open new avenues in the therapy of social phobia. PMID:23970862

  18. Training Approach-Avoidance of Smiling Faces Affects Emotional Vulnerability in Socially Anxious Individuals

    Directory of Open Access Journals (Sweden)

    Mike eRinck

    2013-08-01

    Full Text Available Previous research revealed an automatic behavioral bias in high socially anxious individuals (HSAs): Although their explicit evaluations of smiling faces are positive, they show automatic avoidance of these faces. This is reflected by faster pushing than pulling of smiling faces in an Approach-Avoidance Task (AAT; Heuer, Rinck, & Becker, 2007). The current study addressed the causal role of this avoidance bias for social anxiety. To this end, we used the AAT to train HSAs, either to approach smiling faces or to avoid them. We examined whether such an AAT training could change HSAs' automatic avoidance tendencies, and if yes, whether AAT effects would generalize to a new approach task with new facial stimuli, and to mood and anxiety in a social threat situation (a video-recorded self-presentation). We found that HSAs trained to approach smiling faces did indeed approach female faces faster after the training than HSAs trained to avoid smiling faces. Moreover, approach-faces training reduced emotional vulnerability: It led to more positive mood and lower anxiety after the self-presentation than avoid-faces training. These results suggest that automatic approach-avoidance tendencies have a causal role in social anxiety, and that they can be modified by a simple computerized training. This may open new avenues in the therapy of social phobia.
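
    In Approach-Avoidance Task studies like the two records above, the avoidance bias is commonly summarized per participant as the difference between median push and pull reaction times. The sketch below illustrates that scoring convention only; it is not the authors' code, and the file and column names are hypothetical placeholders.

    # Approach bias score = median push RT - median pull RT, per participant and
    # stimulus category; negative values indicate faster pushing, i.e. avoidance.
    import pandas as pd

    trials = pd.read_csv("aat_trials.csv")  # hypothetical: participant, stimulus, movement ("push"/"pull"), rt_ms, correct

    clean = trials[trials["correct"] == 1]  # score correct trials only
    medians = (clean.groupby(["participant", "stimulus", "movement"])["rt_ms"]
                    .median()
                    .unstack("movement"))
    medians["approach_bias"] = medians["push"] - medians["pull"]
    print(medians["approach_bias"].groupby(level="stimulus").describe())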

  19. Unconsciously Triggered Emotional Conflict by Emotional Facial Expressions

    Science.gov (United States)

    Chen, Antao; Cui, Qian; Zhang, Qinglin

    2013-01-01

    The present study investigated whether emotional conflict and emotional conflict adaptation could be triggered by unconscious emotional information, as assessed in a backward-masked affective priming task. Participants were instructed to identify the valence of a face (e.g., happy or sad) preceded by a masked happy or sad face. The results of two experiments revealed an emotional conflict effect but no emotional conflict adaptation effect. This demonstrates that emotional conflict can be triggered by unconsciously presented emotional information, but participants may not adjust their subsequent performance trial-by-trial to reduce this conflict. PMID:23409084

  20. Facial emotion perception in Chinese patients with schizophrenia and non-psychotic first-degree relatives.

    Science.gov (United States)

    Li, Huijie; Chan, Raymond C K; Zhao, Qing; Hong, Xiaohong; Gong, Qi-Yong

    2010-03-17

    Although there is a consensus that patients with schizophrenia have certain deficits in perceiving and expressing facial emotions, previous studies of facial emotion perception in schizophrenia do not present consistent results. The objective of this study was to explore facial emotion perception deficits in Chinese patients with schizophrenia and their non-psychotic first-degree relatives. Sixty-nine patients with schizophrenia, 56 of their first-degree relatives (33 parents and 23 siblings), and 92 healthy controls (67 younger healthy controls matched to the patients and siblings, and 25 older healthy controls matched to the parents) completed a set of facial emotion perception tasks, including facial emotion discrimination, identification, intensity, valence, and corresponding face identification tasks. The results demonstrated that patients with schizophrenia performed significantly worse than their siblings and younger healthy controls in accuracy in a variety of facial emotion perception tasks, whereas the siblings of the patients performed as well as the corresponding younger healthy controls in all of the facial emotion perception tasks. Patients with schizophrenia also showed significantly reduced speed compared with younger healthy controls, while the siblings of patients did not differ significantly in speed from either the patients or the younger healthy controls. Meanwhile, we also found that parents of the schizophrenia patients performed significantly worse than the corresponding older healthy controls in accuracy in terms of facial emotion identification, valence, and the composite index of the facial discrimination, identification, intensity and valence tasks. Moreover, no significant differences were found between the parents of patients and older healthy controls in speed after controlling for years of education and IQ. Taken together, the results suggest that facial emotion perception deficits may serve as potential endophenotypes for schizophrenia

  1. Spanish parents' emotion talk and their children's understanding of emotion.

    Science.gov (United States)

    Aznar, Ana; Tenenbaum, Harriet R

    2013-01-01

    Relations between parent-child emotion talk and children's emotion understanding were examined in 63 Spanish mothers and fathers and their 4- (M = 53.35 months, SD = 3.86) and 6-year-old (M = 76.62 months, SD = 3.91) children. Parent-child emotion talk was analyzed during two storytelling tasks: a play-related storytelling task and a reminiscence task (conversation about past experiences). Children's emotion understanding was assessed twice through a standardized test of emotion comprehension (TEC; Pons et al., 2004), once before one of the two parent-child storytelling sessions and again 6 months later. Mothers' use of emotion labels during the play-related storytelling task predicted children's emotion understanding after controlling for children's previous emotion understanding. Whereas fathers' use of emotion labels during the play-related storytelling task was correlated with children's emotion understanding, it did not predict children's emotion understanding after controlling for previous emotion understanding. Implications of these findings for future research on children's socioemotional development are discussed.

  2. Love withdrawal predicts electrocortical responses to emotional faces with performance feedback: a follow-up and extension.

    Science.gov (United States)

    Huffmeijer, Renske; Bakermans-Kranenburg, Marian J; Alink, Lenneke R A; van IJzendoorn, Marinus H

    2014-06-02

    Parental use of love withdrawal is thought to affect children's later psychological functioning because it creates a link between children's performance and relational consequences. In addition, recent studies have begun to show that experiences of love withdrawal also relate to the neural processing of socio-emotional information relevant to a performance-relational consequence link, and can moderate effects of oxytocin on social information processing and behavior. The current study follows up on our previous results by attempting to confirm and extend previous findings indicating that experiences of maternal love withdrawal are related to electrocortical responses to emotional faces presented with performance feedback. More maternal love withdrawal was related to enhanced early processing of facial feedback stimuli (reflected in more positive VPP amplitudes, confirming previous findings). However, attentional engagement with and processing of the stimuli at a later stage were diminished in those reporting higher maternal love withdrawal (reflected in less positive LPP amplitudes, diverging from previous findings). Maternal love withdrawal thus affects the processing of emotional faces presented with performance feedback differently at different stages of neural processing.

  3. Association of Maternal Interaction with Emotional Regulation in 4 and 9 Month Infants During the Still Face Paradigm

    Science.gov (United States)

    Lowe, Jean R.; MacLean, Peggy C.; Duncan, Andrea F.; Aragón, Crystal; Schrader, Ronald M.; Caprihan, Arvind; Phillips, John P.

    2013-01-01

    This study used the Still Face Paradigm to investigate the relationship of maternal interaction on infants’ emotion regulation responses. Seventy infant-mother dyads were seen at 4 months and 25 of these same dyads were re-evaluated at 9 months. Maternal interactions were coded for attention seeking and contingent responding. Emotional regulation was described by infant stress reaction and overall positive affect. Results indicated that at both 4 and 9 months mothers who used more contingent responding interactions had infants who showed more positive affect. In contrast, mothers who used more attention seeking play had infants who showed less positive affect after the Still Face Paradigm. Patterns of stress reaction were reversed, as mothers who used more attention seeking play had infants with less negative affect. Implications for intervention and emotional regulation patterns over time are discussed. PMID:22217393

  4. Early life stress and trauma and enhanced limbic activation to emotionally valenced faces in depressed and healthy children.

    Science.gov (United States)

    Suzuki, Hideo; Luby, Joan L; Botteron, Kelly N; Dietrich, Rachel; McAvoy, Mark P; Barch, Deanna M

    2014-07-01

    Previous studies have examined the relationships between structural brain characteristics and early life stress in adults. However, there is limited evidence for functional brain variation associated with early life stress in children. We hypothesized that early life stress and trauma would be associated with increased functional brain activation response to negative emotional faces in children with and without a history of depression. Psychiatric diagnosis and life events in children (starting at age 3-5 years) were assessed in a longitudinal study. A follow-up magnetic resonance imaging (MRI) study acquired data (N = 115 at ages 7-12, 51% girls) on functional brain response to fearful, sad, and happy faces relative to neutral faces. We used a region-of-interest mask within cortico-limbic areas and conducted regression analyses and repeated-measures analysis of covariance. Greater activation responses to fearful, sad, and happy faces in the amygdala and its neighboring regions were found in children with greater life stress. Moreover, an association between life stress and left hippocampal and globus pallidus activity depended on children's diagnostic status. Finally, all children with greater life trauma showed greater bilateral amygdala and cingulate activity specific to sad faces but not the other emotional faces, although right amygdala activity was moderated by psychiatric status. These findings suggest that limbic hyperactivity may be a biomarker of early life stress and trauma in children and may have implications in the risk trajectory for depression and other stress-related disorders. However, this pattern varied based on emotion type and history of psychopathology. Copyright © 2014 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  5. Intonation processing deficits of emotional words among Mandarin Chinese speakers with congenital amusia: an ERP study

    OpenAIRE

    Lu, Xuejing; Ho, Hao Tam; Liu, Fang; Wu, Daxing; Thompson, William F.

    2015-01-01

    Background: Congenital amusia is a disorder that is known to affect the processing of musical pitch. Although individuals with amusia rarely show language deficits in daily life, a number of findings point to possible impairments in speech prosody that amusic individuals may compensate for by drawing on linguistic information. Using EEG, we investigated (1) whether the processing of speech prosody is impaired in amusia and (2) whether emotional linguistic information can compensate for this i...

  6. Borderline personality disorder and emotional intelligence.

    Science.gov (United States)

    Peter, Mathell; Schuurmans, Hanneke; Vingerhoets, Ad J J M; Smeets, Guus; Verkoeijen, Peter; Arntz, Arnoud

    2013-02-01

    The present study investigated emotional intelligence (EI) in borderline personality disorder (BPD). It was hypothesized that patients with BPD (n = 61), compared with patients with other personality disorders (PDs; n = 69) and nonpatients (n = 248), would show higher scores on the ability to perceive emotions and impairments in the ability to regulate emotions. EI was assessed with the Mayer-Salovey-Caruso Emotional Intelligence Test (Mayer, Salovey, and Caruso [New York: MHS, 2002]). As compared with the PD group and the nonpatient group, the patients with BPD displayed deficits in their ability to understand emotions, whereas no differences emerged with respect to their ability to perceive, use, and regulate emotions. In addition, a negative relationship was found between the severity of BPD and total EI score. However, this relationship disappeared when intelligence quotient was partialled out. These results suggest that BPD is associated with emotion understanding deficits, whereas temporary severity of BPD is associated with emotion regulation deficits.

  7. Perfusion deficits detected by arterial spin-labeling in patients with TIA with negative diffusion and vascular imaging.

    Science.gov (United States)

    Qiao, X J; Salamon, N; Wang, D J J; He, R; Linetsky, M; Ellingson, B M; Pope, W B

    2013-01-01

    A substantial portion of clinically diagnosed TIA cases is imaging-negative. The purpose of the current study is to determine if arterial spin-labeling is helpful in detecting perfusion abnormalities in patients presenting clinically with TIA. Pseudocontinuous arterial spin-labeling with 3D background-suppressed gradient and spin-echo was acquired on 49 patients suspected of TIA within 24 hours of symptom onset. All patients were free of stroke history and had no lesion-specific findings on general MR, DWI, and MRA sequences. The calculated arterial spin-labeling CBF maps were scored from 1-3 on the basis of presence and severity of perfusion disturbance by 3 independent observers blinded to patient history. An age-matched cohort of 36 patients diagnosed with no cerebrovascular events was evaluated as a control. Interobserver agreement was assessed by use of the Kendall concordance test. Scoring of perfusion abnormalities on arterial spin-labeling scans of the TIA cohort was highly concordant among the 3 observers (W = 0.812). The sensitivity and specificity of arterial spin-labeling in the diagnosis of perfusion abnormalities in TIA was 55.8% and 90.7%, respectively. In 93.3% (70/75) of the arterial spin-labeling CBF map readings with positive scores (≥2), the brain regions where perfusion abnormalities were identified by 3 observers matched with the neurologic deficits at TIA onset. In this preliminary study, arterial spin-labeling showed promise in the detection of perfusion abnormalities that correlated with clinically diagnosed TIA in patients with otherwise normal neuroimaging results.
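
    Two of the summary statistics in the record above are easy to reproduce in a few lines: sensitivity/specificity of the dichotomized arterial spin-labeling score against the clinical diagnosis, and Kendall's coefficient of concordance (W) for interobserver agreement. The sketch below is not the study's code; all counts and scores in it are hypothetical placeholders, and the W computation omits the tie correction that heavily tied 1-3 scores would normally call for.

    import numpy as np
    from scipy.stats import rankdata

    # Sensitivity / specificity from a hypothetical 2x2 table
    tp, fn = 28, 22   # suspected-TIA patients with / without a positive ASL reading (score >= 2)
    tn, fp = 33, 3    # controls with negative / positive ASL readings
    print(f"sensitivity={tp / (tp + fn):.3f}, specificity={tn / (tn + fp):.3f}")

    # Kendall's W for m observers scoring n scans from 1-3 (no tie correction)
    scores = np.array([[1, 2, 3, 2, 1],    # observer 1 (hypothetical scores)
                       [1, 3, 3, 2, 1],    # observer 2
                       [2, 2, 3, 1, 1]])   # observer 3
    m, n = scores.shape
    ranks = np.apply_along_axis(rankdata, 1, scores)   # rank scans within each observer
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    w = 12 * s / (m ** 2 * (n ** 3 - n))
    print(f"Kendall's W = {w:.3f}")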

  8. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Glerup, L; Vestbo, C

    2015-01-01

    BACKGROUND: Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression. METHOD: Thirty... ...while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping strategies. RESULTS: High-risk twins showed increased neural response to happy and fearful faces... ...processing. These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attention vigilance for fearful faces and were slower at recognizing facial expressions relative to low...

  9. Facing Complaining Customer and Suppressed Emotion at Worksite Related to Sleep Disturbance in Korea.

    Science.gov (United States)

    Lim, Sung Shil; Lee, Wanhyung; Hong, Kwanyoung; Jeung, Dayee; Chang, Sei Jin; Yoon, Jin Ha

    2016-11-01

    This study aimed to investigate the effect of facing complaining customers and suppressing emotion at the worksite on sleep disturbance in the working population. We enrolled 13,066 paid workers (male = 6,839, female = 6,227) from the Working Condition Survey (2011). Odds ratios (OR) and 95% confidence intervals (CI) for the occurrence of sleep disturbance were calculated using multiple logistic regression models. Workers who always dealt with complaining customers had a significantly higher risk of sleep disturbance than those who rarely did (OR [95% CI]: 5.46 [3.43-8.68] in male and 5.59 [3.30-9.46] in female workers). The OR (95% CI) for sleep disturbance was 1.78 (1.16-2.73) and 1.63 (1.02-2.63) for the male and female groups always suppressing their emotions at the workplace, compared with those who rarely did so. Compared with those who both rarely engaged complaining customers and rarely suppressed their emotions at work, the OR (CI) for sleep disturbance was 9.66 (4.34-20.80) and 10.17 (4.46-22.07) for men and women always exposed to both factors. Sleep disturbance was affected by the interaction of both emotional demands (engaging complaining customers and suppressing emotions at the workplace). The level of emotional demand, including engaging complaining customers and suppressing emotions at the workplace, is significantly associated with sleep disturbance in the Korean working population.
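
    The odds ratios and 95% confidence intervals above come from multiple logistic regression models; exponentiating the fitted coefficients (and their confidence bounds) yields the ORs. A minimal sketch of that step follows; it is not the study's code, and the file, column names and covariates are hypothetical placeholders.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("working_conditions.csv")  # hypothetical: sleep_disturbance (0/1), complaining_customers, age, sex

    model = smf.logit(
        "sleep_disturbance ~ C(complaining_customers, Treatment('rarely')) + age + C(sex)",
        data=df,
    ).fit()

    odds_ratios = np.exp(model.params)   # exponentiated coefficients = ORs
    ci = np.exp(model.conf_int())        # 95% CIs on the OR scale
    print(pd.concat([odds_ratios.rename("OR"),
                     ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))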

  10. Influence of emotional expression on memory recognition bias in schizophrenia as revealed by fMRI.

    Science.gov (United States)

    Sergerie, Karine; Armony, Jorge L; Menear, Matthew; Sutton, Hazel; Lepage, Martin

    2010-07-01

    We recently showed that, in healthy individuals, emotional expression influences memory for faces both in terms of accuracy and, critically, in memory response bias (the tendency to classify stimuli as previously seen or not, regardless of whether this was the case). Although schizophrenia has been shown to be associated with deficits in episodic memory and emotional processing, the relation between these processes in this population remains unclear. Here, we used our previously validated paradigm to directly investigate the modulation of emotion on memory recognition. Twenty patients with schizophrenia and matched healthy controls completed a functional magnetic resonance imaging (fMRI) study of recognition memory for happy, sad, and neutral faces. Brain activity associated with the response bias was obtained by correlating this measure with the contrast subjective old (i.e., hits and false alarms) minus subjective new (misses and correct rejections) for sad and happy expressions. Although patients exhibited an overall lower memory performance than controls, they showed the same effects of emotion on memory, both in terms of accuracy and bias. For sad faces, the similar behavioral pattern between groups was mirrored by a largely overlapping neural network, mostly involved in familiarity-based judgments (e.g., parahippocampal gyrus). In contrast, controls activated a much larger set of regions for happy faces, including areas thought to underlie recollection-based memory retrieval (e.g., superior frontal gyrus and hippocampus) and novelty detection (e.g., amygdala). This study demonstrates that, despite an overall lower memory accuracy, emotional memory is intact in schizophrenia, although emotion-specific differences in brain activation exist, possibly reflecting different strategies.

  11. Dealing with feelings: characterization of trait alexithymia on emotion regulation strategies and cognitive-emotional processing.

    Directory of Open Access Journals (Sweden)

    Marte Swart

    Full Text Available BACKGROUND: Alexithymia, or "no words for feelings", is a personality trait which is associated with difficulties in emotion recognition and regulation. It is unknown whether this deficit is due primarily to regulation, perception, or mentalizing of emotions. In order to shed light on the core deficit, we tested our subjects on a wide range of emotional tasks. We expected the high alexithymics to underperform on all tasks. METHOD: Two groups of healthy individuals, scoring high and low on the cognitive component of the Bermond-Vorst Alexithymia Questionnaire, completed questionnaires on emotion regulation and performed several emotion processing tasks, including a micro-expression recognition task, recognition of emotional prosody and semantics in spoken sentences, an emotional and identity learning task, and a conflicting beliefs and emotions task (emotional mentalizing). RESULTS: The two groups differed on the Emotion Regulation Questionnaire, Berkeley Expressivity Questionnaire and Empathy Quotient. Specifically, the Emotion Regulation Questionnaire showed that alexithymic individuals used more suppressive and fewer reappraisal strategies. On the behavioral tasks, as expected, alexithymics performed worse on recognition of micro-expressions and emotional mentalizing. Surprisingly, the groups did not differ on tasks of emotional semantics and prosody or associative emotional learning. CONCLUSION: Individuals scoring high on the cognitive component of alexithymia are more prone to use suppressive emotion regulation strategies rather than reappraisal strategies. Regarding emotional information processing, alexithymia is associated with reduced performance on measures of early processing as well as higher-order mentalizing. However, difficulties in the processing of emotional language were not a core deficit in our alexithymic group.

  12. Spanish Parents' Emotion Talk and their Children's Understanding of Emotion

    Directory of Open Access Journals (Sweden)

    Ana eAznar

    2013-09-01

    Full Text Available Relations between parent-child emotion talk and children’s emotion understanding were examined in 63 Spanish mothers and fathers and their 4- (M = 53.35 months, SD = 3.86) and 6-year-old (M = 76.62 months, SD = 3.91) children. Parent-child emotion talk was analyzed during two storytelling tasks: a play-related storytelling task and a reminiscence task (conversation about past experiences). Children’s emotion understanding was assessed twice through a standardized test of emotion comprehension (TEC; Pons, Harris, & de Rosnay, 2004), once before one of the two parent-child storytelling sessions and again six months later. Mothers’ use of emotion labels during the play-related storytelling task predicted children’s emotion understanding after controlling for children’s previous emotion understanding. Whereas fathers’ use of emotion labels during the play-related storytelling task was correlated with children’s emotion understanding, it did not predict children’s emotion understanding after controlling for previous emotion understanding. Implications of these findings for future research on children’s socioemotional development are discussed.

  13. Emotion regulation in spider phobia: role of the medial prefrontal cortex

    Science.gov (United States)

    Schäfer, Axel; Walter, Bertram; Stark, Rudolf; Vaitl, Dieter; Schienle, Anne

    2009-01-01

    Phobic responses are strong emotional reactions towards phobic objects, which can be described as a deficit in the automatic regulation of emotions. Difficulties in the voluntary cognitive control of these emotions suggest a further phobia-specific deficit in effortful emotion regulation mechanisms. The present study is based on this emotion regulation conceptualization of specific phobias. The aim is to investigate the neural correlates of these two emotion regulation deficits in spider phobics. Sixteen spider-phobic females participated in a functional magnetic resonance imaging (fMRI) study in which they were asked to voluntarily up- and down-regulate their emotions elicited by spider and generally aversive pictures with a reappraisal strategy. In line with the hypothesis concerning an automatic emotion regulation deficit, increased activity in the insula and reduced activity in the ventromedial prefrontal cortex were observed. Furthermore, phobia-specific effortful regulation within phobics was associated with altered activity in medial prefrontal cortex areas. Altogether, these results suggest that spider phobic subjects are indeed characterized by a deficit in the automatic as well as the effortful regulation of emotions elicited by phobic compared with aversive stimuli. These two forms of phobic emotion regulation deficits are associated with altered activity in different medial prefrontal cortex subregions. PMID:19398537

  14. Abnormal patterns of cerebral lateralisation as revealed by the Universal Chimeric Faces Task in individuals with autistic disorder.

    Science.gov (United States)

    Taylor, Sandie; Workman, Lance; Yeomans, Heather

    2012-01-01

    A previous study by Workman, Chilvers, Yeomans, and Taylor (2006), using the "Universal" Chimeric Faces Task (UCFT) for six emotional expressions, demonstrated that an overall left hemispatial/right hemisphere (RH) advantage has begun to develop by the age of 7-8. Moreover, the development of this left hemispatial advantage was observed to correlate positively with the ability to read emotions in the faces of others. Adopting the UCFT, the current study compared autistic children (aged 11-15) with unimpaired children of two age groups (5-6 and 7-8) from this previous study. The autistic children showed a left hemispatial/RH advantage only for the two emotional expressions of "happiness" and "anger". Results for the autistic children revealed a similar overall pattern of lateralisation to the 5-6-year-olds and one that is less lateralised than the pattern for the 7-8-year-olds. Autistic children thus appear to show a developmental deficit in the left hemispatial/RH advantage for emotional expressions, with the exception of "happiness" and "anger". The findings are discussed in terms of the role of hemisphericity and an approach-avoidance model.

  15. Repetition Blindness for Faces: A Comparison of Face Identity, Expression, and Gender Judgments

    OpenAIRE

    Murphy, Karen; Ward, Zoe

    2017-01-01

    Repetition blindness (RB) refers to the impairment in reporting two identical targets within a rapid serial visual presentation stream. While numerous studies have demonstrated RB for words and pictures of objects, very few studies have examined RB for faces. This study extended this research by examining RB when the two faces were complete repeats (same emotion and identity), identity repeats (same individual, different emotion), and emotion repeats (different individual, same emotion) for id...

  16. Implicit conditioning of faces via the social regulation of emotion: ERP evidence of early attentional biases for security conditioned faces.

    Science.gov (United States)

    Beckes, Lane; Coan, James A; Morris, James P

    2013-08-01

    Not much is known about the neural and psychological processes that promote the initial conditions necessary for positive social bonding. This study explores one method of conditioned bonding utilizing dynamics related to the social regulation of emotion and attachment theory. This form of conditioning involves repeated presentations of negative stimuli followed by images of warm, smiling faces. L. Beckes, J. Simpson, and A. Erickson (2010) found that this conditioning procedure results in positive associations with the faces measured via a lexical decision task, suggesting they are perceived as comforting. This study found that the P1 ERP was similarly modified by this conditioning procedure and the P1 amplitude predicted lexical decision times to insecure words primed by the faces. The findings have implications for understanding how the brain detects supportive people, the flexibility and modifiability of early ERP components, and social bonding more broadly. Copyright © 2013 Society for Psychophysiological Research.

  17. Effects of cue modality and emotional category on recognition of nonverbal emotional signals in schizophrenia.

    Science.gov (United States)

    Vogel, Bastian D; Brück, Carolin; Jacob, Heike; Eberle, Mark; Wildgruber, Dirk

    2016-07-07

    Impaired interpretation of nonverbal emotional cues in patients with schizophrenia has been reported in several studies, and the clinical relevance of these deficits for social functioning has been assumed. However, it is unclear to what extent the impairments depend on specific emotions or specific channels of nonverbal communication. Here, the effect of cue modality and emotional category on the accuracy of emotion recognition was evaluated in 21 patients with schizophrenia and compared to a healthy control group (n = 21). To this end, dynamic stimuli comprising speakers of both genders in three different sensory modalities (auditory, visual and audiovisual) and five emotional categories (happy, alluring, neutral, angry and disgusted) were used. Patients with schizophrenia were found to be impaired in emotion recognition in comparison to the control group across all stimuli. Considering specific emotions, more severe deficits were revealed in the recognition of alluring stimuli and less severe deficits in the recognition of disgusted stimuli, as compared to all other emotions. Regarding cue modality, the extent of the impairment in emotion recognition did not differ significantly between auditory and visual cues across all emotional categories. However, patients with schizophrenia showed significantly more severe disturbances for vocal as compared to facial cues when sexual interest is expressed (alluring stimuli), whereas more severe disturbances for facial as compared to vocal cues were observed when happiness or anger is expressed. Our results confirmed that perceptual impairments can be observed for vocal as well as facial cues conveying various social and emotional connotations. The observed differences in severity of impairments, with most severe deficits for alluring expressions, might be related to specific difficulties in recognizing the complex social emotional information of interpersonal intentions as compared to "basic" emotional states. Therefore

  18. A comparative study of deficit pattern in theory of mind and emotion regulation methods in evaluating patients with bipolar disorder and normal individuals

    OpenAIRE

    Ali Fakhari; Khalegh Minashiri; Abolfazl Fallahi; Mohammad Taher Panah

    2013-01-01

    BACKGROUND: This study compared patterns of deficit in "theory of mind" and "emotion regulation" in patients with bipolar disorder and normal individuals. METHODS: In this causal-comparative study, subjects were 20 patients with bipolar disorder and 20 normal individuals. Patients were selected via convenience sampling method among hospitalized patients at Razi hospital of Tabriz, Iran. The data was collected through two scales: Reading the Mind in the Eyes Test and Emotion Regulation Questionnai...

  19. Textual emotion recognition for enhancing enterprise computing

    Science.gov (United States)

    Quan, Changqin; Ren, Fuji

    2016-05-01

    The growing interest in affective computing (AC) brings a lot of valuable research topics that can meet different application demands in enterprise systems. The present study explores a sub-area of AC techniques - textual emotion recognition for enhancing enterprise computing. Multi-label emotion recognition in text is able to provide a more comprehensive understanding of emotions than single-label emotion recognition. A representation of 'emotion state in text' is proposed to encompass the multidimensional emotions in text. It ensures a formal description of the configurations of basic emotions as well as of the relations between them. Our method allows recognition of emotions for words bearing indirect emotions, emotion ambiguity and multiple emotions. We further investigate the effect of word order on emotional expression by comparing the performance of a bag-of-words model and a sequence model for multi-label sentence emotion recognition. The experiments show that classification results under the sequence model are better than under the bag-of-words model, and a homogeneous Markov model showed promising results for multi-label sentence emotion recognition. This emotion recognition system provides a convenient way to acquire valuable emotion information and to improve enterprise competitive ability in many aspects.
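    As a rough illustration of the multi-label setting this abstract describes, the sketch below tags a sentence with zero or more emotion labels using a bag-of-words representation and one binary classifier per label. The toy sentences, the label inventory, and the scikit-learn classifier are assumptions for illustration only; the study's own sequence (homogeneous Markov) model is not reproduced here.

```python
# Hedged sketch: multi-label sentence emotion tagging with a bag-of-words
# baseline. Toy data and classifier choice are illustrative assumptions;
# the paper's own models are not reproduced.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

sentences = [
    "I can't believe they cancelled the launch again",   # anger + surprise
    "the demo went perfectly and everyone cheered",      # joy
    "we might lose the contract after all this work",    # fear + sadness
]
labels = [{"anger", "surprise"}, {"joy"}, {"fear", "sadness"}]

binarizer = MultiLabelBinarizer()
y = binarizer.fit_transform(labels)               # one binary column per emotion

vectorizer = CountVectorizer()                    # order-insensitive (bag-of-words) features
X = vectorizer.fit_transform(sentences)

# One independent binary classifier per emotion label.
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, y)

probe = vectorizer.transform(["they cancelled the contract"])
predicted = binarizer.inverse_transform(clf.predict(probe))
print(predicted)                                  # e.g. [('anger',)] on this toy data
```

    A sequence model, by contrast, would condition each word's contribution on its neighbours rather than treating the sentence as an unordered bag, which is the comparison the abstract reports.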

  20. Intranasal Oxytocin Administration Dampens Amygdala Reactivity towards Emotional Faces in Male and Female PTSD Patients.

    Science.gov (United States)

    Koch, Saskia Bj; van Zuiden, Mirjam; Nawijn, Laura; Frijling, Jessie L; Veltman, Dick J; Olff, Miranda

    2016-05-01

    Post-traumatic stress disorder (PTSD) is a disabling psychiatric disorder. As a substantial part of PTSD patients responds poorly to currently available psychotherapies, pharmacological interventions boosting treatment response are needed. Because of its anxiolytic and pro-social properties, the neuropeptide oxytocin (OT) has been proposed as promising strategy for treatment augmentation in PTSD. As a first step to investigate the therapeutic potential of OT in PTSD, we conducted a double-blind, placebo-controlled, cross-over functional MRI study examining OT administration effects (40 IU) on amygdala reactivity toward emotional faces in unmedicated male and female police officers with (n=37, 21 males) and without (n=40, 20 males) PTSD. Trauma-exposed controls were matched to PTSD patients based on age, sex, years of service and educational level. Under placebo, the expected valence-dependent amygdala reactivity (ie, greater activity toward fearful-angry faces compared with happy-neutral faces) was absent in PTSD patients. OT administration dampened amygdala reactivity toward all emotional faces in male and female PTSD patients, but enhanced amygdala reactivity in healthy male and female trauma-exposed controls, independent of sex and stimulus valence. In PTSD patients, greater anxiety prior to scanning and amygdala reactivity during the placebo session were associated with greater reduction of amygdala reactivity after OT administration. Taken together, our results indicate presumably beneficial neurobiological effects of OT administration in male and female PTSD patients. Future studies should investigate OT administration in clinical settings to fully appreciate its therapeutic potential.

  1. Real-time speech-driven animation of expressive talking faces

    Science.gov (United States)

    Liu, Jia; You, Mingyu; Chen, Chun; Song, Mingli

    2011-05-01

    In this paper, we present a real-time facial animation system in which speech drives mouth movements and facial expressions synchronously. Considering five basic emotions, a hierarchical structure with an upper layer of emotion classification is established. Based on the recognized emotion label, the lower-layer classification at the sub-phonemic level is modelled on the relationship between acoustic features of frames and audio labels in phonemes. Using certain constraints, the predicted emotion labels of speech are adjusted to obtain facial expression labels, which are combined with the sub-phonemic labels. The combinations are mapped into facial action units (FAUs), and audio-visual synchronized animation with mouth movements and facial expressions is generated by morphing between FAUs. The experimental results demonstrate that the two-layer structure succeeds in both emotion and sub-phonemic classification, and the synthesized facial sequences reach a comparatively convincing quality.
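    The skeleton below sketches, under stated assumptions, how such a two-layer structure can be wired together: an upper layer assigns an emotion label per utterance, a lower layer assigns a coarse audio label per frame, and the pair indexes a facial-action-unit key frame. The classifier rules, feature format, and FAU table are placeholders, not the paper's implementation.

```python
# Hedged sketch of the two-layer pipeline described above. Every rule and
# table entry here is a placeholder assumption.
from typing import List

def classify_emotion(acoustic_frames: List[List[float]]) -> str:
    """Upper layer: one emotion label per utterance (placeholder rule)."""
    energy = sum(sum(abs(v) for v in frame) for frame in acoustic_frames)
    return "happy" if energy > 10.0 else "neutral"

def classify_subphoneme(frame: List[float]) -> str:
    """Lower layer: a coarse audio label per frame (placeholder rule)."""
    return "open_vowel" if frame and frame[0] > 0 else "closed_consonant"

# Placeholder lookup from (emotion, sub-phonemic label) to an FAU key frame.
FAU_TABLE = {
    ("happy", "open_vowel"): {"AU12_lip_corner_puller": 0.8, "AU26_jaw_drop": 0.6},
    ("happy", "closed_consonant"): {"AU12_lip_corner_puller": 0.8, "AU24_lip_presser": 0.4},
    ("neutral", "open_vowel"): {"AU26_jaw_drop": 0.5},
    ("neutral", "closed_consonant"): {"AU24_lip_presser": 0.3},
}

def animate(acoustic_frames: List[List[float]]) -> List[dict]:
    """Combine both layers and emit one FAU key frame per audio frame."""
    emotion = classify_emotion(acoustic_frames)
    return [FAU_TABLE[(emotion, classify_subphoneme(f))] for f in acoustic_frames]

if __name__ == "__main__":
    frames = [[0.9, 0.2], [-0.3, 0.1], [0.7, 0.4]]
    for key_frame in animate(frames):
        print(key_frame)   # morphing between successive key frames yields the animation
```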

  2. Facial emotion identification in early-onset psychosis.

    Science.gov (United States)

    Barkl, Sophie J; Lah, Suncica; Starling, Jean; Hainsworth, Cassandra; Harris, Anthony W F; Williams, Leanne M

    2014-12-01

    Facial emotion identification (FEI) deficits are common in patients with chronic schizophrenia and are strongly related to impaired functioning. The objectives of this study were to determine whether FEI deficits are present and emotion specific in people experiencing early-onset psychosis (EOP), and related to current clinical symptoms and functioning. Patients with EOP (n=34, mean age=14.11, 53% female) and healthy controls (HC, n=42, mean age 13.80, 51% female) completed a task of FEI that measured accuracy, error pattern and response time. Relative to HC, patients with EOP (i) had lower accuracy for identifying facial expressions of emotions, especially fear, anger and disgust, (ii) were more likely to misattribute other emotional expressions as fear or disgust, and (iii) were slower at accurately identifying all facial expressions. FEI accuracy was not related to clinical symptoms or current functioning. Deficits in FEI (especially for fear, anger and disgust) are evident in EOP. Our findings suggest that while emotion identification deficits may reflect a trait susceptibility marker, functional deficits may represent a sequelae of illness. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Mapping the impairment in decoding static facial expressions of emotion in prosopagnosia.

    Science.gov (United States)

    Fiset, Daniel; Blais, Caroline; Royer, Jessica; Richoz, Anne-Raphaëlle; Dugas, Gabrielle; Caldara, Roberto

    2017-08-01

    Acquired prosopagnosia is characterized by a deficit in face recognition due to diverse brain lesions, but interestingly most prosopagnosic patients suffering from posterior lesions use the mouth instead of the eyes for face identification. Whether this bias is present for the recognition of facial expressions of emotion has not yet been addressed. We tested PS, a pure case of acquired prosopagnosia with bilateral occipitotemporal lesions anatomically sparing the regions dedicated for facial expression recognition. PS used mostly the mouth to recognize facial expressions even when the eye area was the most diagnostic. Moreover, PS directed most of her fixations towards the mouth. Her impairment was still largely present when she was instructed to look at the eyes, or when she was forced to look at them. Control participants showed a performance comparable to PS when only the lower part of the face was available. These observations suggest that the deficits observed in PS with static images are not solely attentional, but are rooted at the level of facial information use. This study corroborates neuroimaging findings suggesting that the Occipital Face Area might play a critical role in extracting facial features that are integrated for both face identification and facial expression recognition in static images. © The Author (2017). Published by Oxford University Press.

  4. Impaired Perception of Emotional Expression in Amyotrophic Lateral Sclerosis.

    Science.gov (United States)

    Oh, Seong Il; Oh, Ki Wook; Kim, Hee Jin; Park, Jin Seok; Kim, Seung Hyun

    2016-07-01

    The increasing recognition that deficits in social emotions occur in amyotrophic lateral sclerosis (ALS) is helping to explain the spectrum of neuropsychological dysfunctions, thus supporting the view of ALS as a multisystem disorder involving neuropsychological deficits as well as motor deficits. The aim of this study was to characterize the emotion perception abilities of Korean patients with ALS based on the recognition of facial expressions. Twenty-four patients with ALS and 24 age- and sex-matched healthy controls completed neuropsychological tests and facial emotion recognition tasks [ChaeLee Korean Facial Expressions of Emotions (ChaeLee-E)]. The ChaeLee-E test includes facial expressions for seven emotions: happiness, sadness, anger, disgust, fear, surprise, and neutral. The ability to perceive facial emotions was significantly worse among ALS patients than among healthy controls [65.2±18.0% vs. 77.1±6.6% (mean±SD), p=0.009]. Eight of the 24 patients (33%) scored below the 5th percentile of controls for recognizing facial emotions. Emotion perception deficits occur in Korean ALS patients, particularly regarding facial expressions of emotion. These findings expand the spectrum of cognitive and behavioral dysfunction associated with ALS into emotion processing dysfunction.

  5. Facial emotion recognition in male antisocial personality disorders with or without adult attention deficit hyperactivity disorder.

    Science.gov (United States)

    Bagcioglu, Erman; Isikli, Hasmet; Demirel, Husrev; Sahin, Esat; Kandemir, Eyup; Dursun, Pinar; Yuksek, Erhan; Emul, Murat

    2014-07-01

    We aimed to investigate facial emotion recognition abilities in violent individuals with antisocial personality disorder (ASPD) with or without comorbid attention deficit hyperactivity disorder (ADHD). Photos of happy, surprised, fearful, sad, angry, disgusted, and neutral facial expressions and the Wender Utah Rating Scale were administered in all groups. The mean ages were as follows: antisocial personality disorder with ADHD 22.0 ± 1.59, pure antisocial individuals 21.90 ± 1.80 and controls 22.97 ± 2.85 (p>0.05). The mean score on the Wender Utah Rating Scale differed significantly between groups. Recognition accuracy did not differ significantly between groups except for disgust faces, which were significantly impaired in the ASPD+ADHD and pure ASPD groups. Antisocial individuals with attention deficit and hyperactivity spent significantly more time on each facial emotion than healthy controls, and pure antisocial individuals needed more time to recognize disgust and neutral faces than healthy controls; no comparable difference emerged between pure antisocial individuals and antisocial individuals with ADHD. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. P2-23: Deficits on Preference but Not Attention in Patients with Depression: Evidence from Gaze Cue

    Directory of Open Access Journals (Sweden)

    Jingling Li

    2012-10-01

    Full Text Available Gaze is an important social cue and can easily capture attention. Our preference judgments are biased by others' gaze; that is, we prefer objects gazed at by happy or neutral faces and dislike objects gazed at by disgusted faces. Since patients with depression have a negative bias in emotional perception, we hypothesized that they may make different preference judgments about gazed-at objects than healthy controls. Twenty-one patients with major depressive disorder and 21 healthy age-matched controls completed an object categorization task and then rated their preference for those objects. In the categorization task, a schematic face either gazed toward or away from the to-be-categorized object. The results showed that both groups categorized gazed-at objects faster than non-gazed objects, suggesting that patients did not have deficits in their attention to gaze cues. Nevertheless, healthy controls preferred gazed-at objects more than non-gazed objects, while patients showed no significant preference. Our results indicate that patients with depression have deficits in their social cognition rather than in basic attentional mechanisms.

  7. Positivity bias in judging ingroup members' emotional expressions.

    Science.gov (United States)

    Lazerus, Talya; Ingbretsen, Zachary A; Stolier, Ryan M; Freeman, Jonathan B; Cikara, Mina

    2016-12-01

    We investigated how group membership impacts valence judgments of ingroup and outgroup members' emotional expressions. In Experiment 1, participants, randomized into 2 novel, competitive groups, rated the valence of in- and outgroup members' facial expressions (e.g., fearful, happy, neutral) using a circumplex affect grid. Across all emotions, participants judged ingroup members' expressions as more positive than outgroup members' expressions. In Experiment 2, participants categorized fearful and happy expressions as being either positive or negative using a mouse-tracking paradigm. Participants exhibited the most direct trajectories toward the "positive" label for ingroup happy expressions and an initial attraction toward positive for ingroup expressions of fear, with outgroup emotion trajectories falling in between. Experiment 3 replicated Experiment 2 and demonstrated that the effect could not be accounted for by targets' gaze direction. Overall, people judged ingroup faces as more positive, regardless of emotion, both in deliberate and implicit judgments. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  8. How stable is activation in the amygdala and prefrontal cortex in adolescence? A study of emotional face processing across three measurements

    NARCIS (Netherlands)

    van den Bulk, B.G.; Koolschijn, P.C.M.P.; Meens, P.H.F.; van Lang, N.D.J.; van der Wee, N.J.A.; Rombouts, S.A.R.B.; Vermeiren, R.R.J.M.; Crone, E.A.

    2013-01-01

    Prior developmental functional magnetic resonance imaging (fMRI) studies have demonstrated elevated activation patterns in the amygdala and prefrontal cortex (PFC) in response to viewing emotional faces. As adolescence is a time of substantial variability in mood and emotional responsiveness, the

  9. Empathetic perspective-taking is impaired in schizophrenia: evidence from a study of emotion attribution and theory of mind.

    Science.gov (United States)

    Langdon, Robyn; Coltheart, Max; Ward, Philip B

    2006-03-01

    Schizophrenia and autism are clinically distinct yet both disorders are characterised by theory of mind (ToM) deficits. Autistic individuals fail to appreciate false beliefs, yet understand the causal connections between behavioural events and simple emotions. Findings of this type have promoted the view that ToM deficits in autism reflect a domain-specific difficulty with appreciating the representational nature of epistemic mental states (i.e., beliefs and intentions and not emotions). This study examines whether the same holds true for schizophrenia. A picture-sequencing task assessed capacity to infer false beliefs in patients with schizophrenia and healthy controls. To assess emotion attribution, participants were shown cartoon strips of events likely to elicit strong emotional reactions in story characters. Characters' faces were blanked out. Participants were instructed to think about how the characters would be feeling in order to match up the cards depicting facial affect appropriately. Participants later named emotions depicted in facial affect cards. Patients were as capable as controls of identifying cartoon facial expressions, yet had greater difficulties with: (a) attributing emotions based on circumstances; and (b) inferring false beliefs. Schizophrenia patients, unlike autistic individuals, suffer a domain-general difficulty with empathetic perspective-taking that affects equally their appreciation of other people's beliefs, percepts, and emotions.

  10. Mapping emotions through time: how affective trajectories inform the language of emotion.

    Science.gov (United States)

    Kirkland, Tabitha; Cunningham, William A

    2012-04-01

    The words used to describe emotions can provide insight into the basic processes that contribute to emotional experience. We propose that emotions arise partly from interacting evaluations of one's current affective state, previous affective state, predictions for how these may change in the future, and the experienced outcomes following these predictions. These states can be represented and inferred from neural systems that encode shifts in outcomes and make predictions. In two studies, we demonstrate that emotion labels are reliably differentiated from one another using only simple cues about these affective trajectories through time. For example, when a worse-than-expected outcome follows the prediction that something good will happen, that situation is labeled as causing anger, whereas when a worse-than-expected outcome follows the prediction that something bad will happen, that situation is labeled as causing sadness. Emotion categories are more differentiated when participants are required to think categorically than when participants have the option to consider multiple emotions and degrees of emotions. This work indicates that information about affective movement through time and changes in affective trajectory may be a fundamental aspect of emotion categories. Future studies of emotion must account for the dynamic way that we absorb and process information. (PsycINFO Database Record (c) 2012 APA, all rights reserved).
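    The abstract's central idea can be made concrete as a lookup from a simple affective trajectory (a predicted valence followed by an outcome relative to that prediction) to an emotion label. Only the two mappings stated above (anger, sadness) are taken from the source; every other trajectory is deliberately left unlabeled rather than guessed.

```python
# Hedged sketch: labeling an emotion from an affective trajectory through time.
# Only the anger and sadness cases come from the abstract; other cells are
# intentionally unspecified.
from typing import Optional

def label_emotion(predicted_valence: str, outcome_vs_prediction: str) -> Optional[str]:
    """predicted_valence: 'good' or 'bad'; outcome_vs_prediction: 'worse' or 'better'."""
    mapping = {
        ("good", "worse"): "anger",    # good expected, worse-than-expected outcome
        ("bad", "worse"): "sadness",   # bad expected, worse-than-expected outcome
    }
    return mapping.get((predicted_valence, outcome_vs_prediction))  # None if unlabeled

print(label_emotion("good", "worse"))   # anger
print(label_emotion("bad", "worse"))    # sadness
print(label_emotion("good", "better"))  # None (not specified in the abstract)
```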

  11. The moderating role of cognitive control deficits in the link from emotional dissonance to burnout symptoms and absenteeism.

    Science.gov (United States)

    Diestel, Stefan; Schmidt, Klaus-Helmut

    2011-07-01

    The present study examines whether cognitive control deficits (CCDs) as a personal vulnerability factor amplify the relationship between emotional dissonance (ED; perceived discrepancy between felt and expressed emotions) and burnout symptoms (emotional exhaustion and depersonalization) as well as absenteeism. CCDs refer to daily failures and impairments of attention regulation, impulse control, and memory. The prediction of the moderator effect of CCDs draws on the argument that portraying emotions which are not genuinely felt is a form of self-regulation taxing and depleting a limited resource capacity. Interindividual differences in the resource capacity are reflected by the measure of CCDs. Drawing on two German samples (one cross-sectional and one longitudinal sample; NTOTAL = 645) of service employees, the present study analyzed interactive effects of ED and CCDs on exhaustion, depersonalization, and two indicators of absenteeism. As was hypothesized, latent moderated structural equation modeling revealed that the adverse impacts of ED on both burnout symptoms and absence behavior were amplified as a function of CCDs. Theoretical and practical implications of the present results will be discussed.

  12. Memory consolidation of socially relevant stimuli during sleep in healthy children and children with attention-deficit/hyperactivity disorder and oppositional defiant disorder: What you can see in their eyes.

    Science.gov (United States)

    Prehn-Kristensen, Alexander; Molzow, Ina; Förster, Alexandra; Siebenhühner, Nadine; Gesch, Maxime; Wiesner, Christian D; Baving, Lioba

    2017-02-01

    Children with attention-deficit/hyperactivity disorder (ADHD) display deficits in sleep-dependent memory consolidation, and comorbid oppositional defiant disorder (ODD) results in deficits in face processing. The aim of the present study was to investigate the role of sleep in recognizing faces in children with ADHD+ODD. Sixteen healthy children and 16 children diagnosed with ADHD+ODD participated in a sleep and a wake condition. During encoding (sleep condition at 8 p.m.; wake condition at 8 a.m.) pictures of faces were rated according to their emotional content; the retrieval session (12 h after the encoding session) contained a recognition task including pupillometry. Pupillometry and behavioral data revealed that healthy children benefited from sleep compared to wake with respect to face picture recognition; in contrast, recognition performance in patients with ADHD+ODD was not improved after sleep compared to wake. It is discussed whether in patients with ADHD+ODD social stimuli are preferentially consolidated during daytime. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Atypical Processing of Gaze Cues and Faces Explains Comorbidity between Autism Spectrum Disorder (ASD) and Attention Deficit/Hyperactivity Disorder (ADHD)

    Science.gov (United States)

    Groom, Madeleine J.; Kochhar, Puja; Hamilton, Antonia; Liddle, Elizabeth B.; Simeou, Marina; Hollis, Chris

    2017-01-01

    This study investigated the neurobiological basis of comorbidity between autism spectrum disorder (ASD) and attention deficit/hyperactivity disorder (ADHD). We compared children with ASD, ADHD or ADHD+ASD and typically developing controls (CTRL) on behavioural and electrophysiological correlates of gaze cue and face processing. We measured effects…

  14. Processing Distracting Non-face Emotional Images: No Evidence of an Age-Related Positivity Effect.

    Science.gov (United States)

    Madill, Mark; Murray, Janice E

    2017-01-01

    Cognitive aging may be accompanied by increased prioritization of social and emotional goals that enhance positive experiences and emotional states. The socioemotional selectivity theory suggests this may be achieved by giving preference to positive information and avoiding or suppressing negative information. Although there is some evidence of a positivity bias in controlled attention tasks, it remains unclear whether a positivity bias extends to the processing of affective stimuli presented outside focused attention. In two experiments, we investigated age-related differences in the effects of to-be-ignored non-face affective images on target processing. In Experiment 1, 27 older (64-90 years) and 25 young adults (19-29 years) made speeded valence judgments about centrally presented positive or negative target images taken from the International Affective Picture System. To-be-ignored distractor images were presented above and below the target image and were either positive, negative, or neutral in valence. The distractors were considered task relevant because they shared emotional characteristics with the target stimuli. Both older and young adults responded slower to targets when distractor valence was incongruent with target valence relative to when distractors were neutral. Older adults responded faster to positive than to negative targets but did not show increased interference effects from positive distractors. In Experiment 2, affective distractors were task irrelevant as the target was a three-digit array and did not share emotional characteristics with the distractors. Twenty-six older (63-84 years) and 30 young adults (18-30 years) gave speeded responses on a digit disparity task while ignoring the affective distractors positioned in the periphery. Task performance in either age group was not influenced by the task-irrelevant affective images. In keeping with the socioemotional selectivity theory, these findings suggest that older adults preferentially

  15. Six months methylphenidate treatment improves emotion dysregulation in adolescents with attention deficit/hyperactivity disorder: a prospective study

    Directory of Open Access Journals (Sweden)

    Suzer Gamli I

    2018-05-01

    Full Text Available Ipek Suzer Gamli,1 Aysegul Yolga Tahiroglu2 1Sanliurfa Education and Research Hospital, Eyyubiye, Sanliurfa, Turkey; 2Child and Adolescent Psychiatry Department, Cukurova University School of Medicine, Saricam, Adana, Turkey Purpose: Individuals with attention deficit/hyperactivity disorder (ADHD) may suffer from emotional dysregulation (ED), although this symptom is not listed among the diagnostic criteria. Methylphenidate (MPH) is useful in reducing emotional symptoms in ADHD. The aim of the present study was to determine both psychosocial risk factors and the presence of ED in adolescents with ADHD before and after MPH treatment. Participants and methods: Eighty-two patients aged 12–18 years with ADHD were included as participants. The Kiddie Schedule for Affective Disorders and Schizophrenia for School-Age Children – Present and Lifetime, the Difficulties in Emotion Regulation Scale (DERS), a sociodemographic form, and the Inventory of Statements About Self-Injury were administered. Results were compared before and after 6 months of MPH treatment. Results: A significant improvement was detected on the DERS for impulsivity (15.9±6.8 initial vs 14.2±6.5 final test, p<0.01) and total score (88.4±23.3 initial vs 82.4±2.7 final test, p<0.05) across all patients taking MPH regardless of subtype and sex. Despite treatment, a significant difference remained for impulsivity, strategies, and total score in patients with comorbid oppositional defiant disorder (ODD) compared with those without ODD, but no difference was detected for conduct disorder comorbidity. In patients who self-harm, scores for goals, impulsivity, strategies, clarity, and total score were higher before treatment; furthermore, impulsivity and total score remained high after treatment. In maltreated patients, goals, impulsivity, strategies, and total scores were significantly higher before treatment; however, their symptoms were ameliorated after treatment with MPH. Conclusion: Individuals with

  16. An Investigation of Emotion Recognition and Theory of Mind in People with Chronic Heart Failure

    Science.gov (United States)

    Habota, Tina; McLennan, Skye N.; Cameron, Jan; Ski, Chantal F.; Thompson, David R.; Rendell, Peter G.

    2015-01-01

    Objectives Cognitive deficits are common in patients with chronic heart failure (CHF), but no study has investigated whether these deficits extend to social cognition. The present study provided the first empirical assessment of emotion recognition and theory of mind (ToM) in patients with CHF. In addition, it assessed whether each of these social cognitive constructs was associated with more general cognitive impairment. Methods A group comparison design was used, with 31 CHF patients compared to 38 demographically matched controls. The Ekman Faces test was used to assess emotion recognition, and the Mind in the Eyes test to measure ToM. Measures assessing global cognition, executive functions, and verbal memory were also administered. Results There were no differences between groups on emotion recognition or ToM. The CHF group’s performance was poorer on some executive measures, but memory was relatively preserved. In the CHF group, both emotion recognition performance and ToM ability correlated moderately with global cognition (r = .38, p = .034; r = .49, p = .005, respectively), but not with executive function or verbal memory. Conclusion CHF patients with lower cognitive ability were more likely to have difficulty recognizing emotions and inferring the mental states of others. Clinical implications of these findings are discussed. PMID:26529409

  17. Threatening faces fail to guide attention for adults with autistic-like traits.

    Science.gov (United States)

    English, Michael C W; Maybery, Murray T; Visser, Troy A W

    2017-02-01

    Individuals diagnosed with autistic spectrum conditions often show deficits in processing emotional faces relative to neurotypical peers. However, little is known about whether similar deficits exist in neurotypical individuals who show high-levels of autistic-like traits. To address this question, we compared performance on an attentional blink task in a large sample of adults who showed low- or high-levels of autistic-like traits on the Autism Spectrum Quotient. We found that threatening faces inserted as the second target in a rapid serial visual presentation were identified more accurately among individuals with low- compared to high-levels of autistic-like traits. This is the first study to show that attentional blink abnormalities seen in autism extend to the neurotypical population with autistic-like traits, adding to the growing body of research suggesting that autistic-related patterns of behaviors extend into a subset of the neurotypical population. Autism Res 2017, 10: 311-320. © 2016 International Society for Autism Research, Wiley Periodicals, Inc. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  18. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    OpenAIRE

    Rossana eActis-Grosso; Rossana eActis-Grosso; Francesco eBossi; Paola eRicciardelli; Paola eRicciardelli

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or c...

  19. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits

    OpenAIRE

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or ...

  20. How social cognition deficits affect psychopathology: A neuroscientific approach

    Directory of Open Access Journals (Sweden)

    Andrić Sanja

    2015-01-01

    Full Text Available Humans are substantially a social species. Effective mental treatment cannot be obtained without addressing social behavior. Social cognition refers to the mental processes underlying social interactions, which allow individuals to make sense of other people's behavior, to decipher emotions on their faces, and to draw conclusions about their intentions. The core domains of this multifaceted concept are theory of mind, social cue perception, attributional style and emotion perception/processing. The amygdala, orbital frontal cortex and temporal cortex areas are typically activated during the processing of information within a social-emotional context. These brain areas are recognized as the major components of the so-called 'social brain', specialized for social interactions in humans. Adequate perceiving and processing of social information is essential for effective social functioning, which becomes obvious when it goes awry. Various psychiatric disorders are characterized by social cognitive deficits, among which schizophrenia, depression-anxiety and autism spectrum disorders have been most broadly studied to date. Growing evidence suggests that these deficits underlie poor functional outcomes in patients with mental health impairments and have an important role in the initiation and maintenance of the disorders' symptoms. One of the most important goals of social neuroscience research is to provide treatment interventions that improve patients' social cognitive skills and functional outcome. Altogether, the present review aims to provide a contemporary overview of the concept of social cognition, to outline its relation to psychopathology, and to discuss the implications for clinical practice and treatment.

  1. A Multidimensional Approach to the Study of Emotion Recognition in Autism Spectrum Disorders

    Directory of Open Access Journals (Sweden)

    Jean eXavier

    2015-12-01

    Full Text Available Although deficits in emotion recognition have been widely reported in Autism Spectrum Disorder (ASD), experiments have been restricted to either facial or vocal expressions. Here, we explored multimodal emotion processing in children with ASD (N=19) and with typical development (TD, N=19), considering unimodal (faces and voices) and multimodal (faces and voices simultaneously) stimuli and developmental comorbidities (neuro-visual, language and motor impairments). Compared to TD controls, children with ASD had rather high and heterogeneous emotion recognition scores but also showed several significant differences: lower emotion recognition scores for visual stimuli and for the neutral emotion, and a greater number of saccades during the visual task. Multivariate analyses showed that: (1) the difficulties they experienced with visual stimuli were partially alleviated with multimodal stimuli; (2) developmental age was significantly associated with emotion recognition in TD children, whereas this was the case only for the multimodal task in children with ASD; (3) language impairments tended to be associated with the emotion recognition scores of children with ASD in the auditory modality. Conversely, in the visual or bimodal (visuo-auditory) tasks, no impact of developmental coordination disorder or neuro-visual impairments was found. We conclude that impaired emotion processing constitutes a dimension to explore in the field of ASD, as research has the potential to define more homogeneous subgroups and tailored interventions. However, it is clear that developmental age, the nature of the stimuli, and other developmental comorbidities must also be taken into account when studying this dimension.

  2. Personality, Attentional Biases towards Emotional Faces and Symptoms of Mental Disorders in an Adolescent Sample.

    Science.gov (United States)

    O'Leary-Barrett, Maeve; Pihl, Robert O; Artiges, Eric; Banaschewski, Tobias; Bokde, Arun L W; Büchel, Christian; Flor, Herta; Frouin, Vincent; Garavan, Hugh; Heinz, Andreas; Ittermann, Bernd; Mann, Karl; Paillère-Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Poustka, Luise; Rietschel, Marcella; Robbins, Trevor W; Smolka, Michael N; Ströhle, Andreas; Schumann, Gunter; Conrod, Patricia J

    2015-01-01

    To investigate the role of personality factors and attentional biases towards emotional faces in establishing concurrent and prospective risk for mental disorder diagnosis in adolescence. Data were obtained as part of the IMAGEN study, conducted across 8 European sites with a community sample of 2257 adolescents. At 14 years, participants completed an emotional variant of the dot-probe task, as well as two personality measures, namely the Substance Use Risk Profile Scale and the revised NEO Personality Inventory. At 14 and 16 years, participants and their parents were interviewed to determine symptoms of mental disorders. Personality traits were general and specific risk indicators for mental disorders at 14 years. Increased specificity was obtained when investigating the likelihood of mental disorders over a 2-year period, with the Substance Use Risk Profile Scale showing incremental validity over the NEO Personality Inventory. Attentional biases to emotional faces did not characterise or predict the mental disorders examined in the current sample. Personality traits can indicate concurrent and prospective risk for mental disorders in a community youth sample, and identify at-risk youth beyond the impact of baseline symptoms. This study does not support the hypothesis that attentional biases mediate the relationship between personality and psychopathology in a community sample. Task and sample characteristics that contribute to differing results among studies are discussed.

  3. Double attention bias for positive and negative emotional faces in clinical depression: evidence from an eye-tracking study.

    Science.gov (United States)

    Duque, Almudena; Vázquez, Carmelo

    2015-03-01

    According to cognitive models, attentional biases in depression play key roles in the onset and subsequent maintenance of the disorder. The present study examines the processing of emotional facial expressions (happy, angry, and sad) in depressed and non-depressed adults. Sixteen unmedicated patients with Major Depressive Disorder (MDD) and 34 never-depressed controls (ND) completed an eye-tracking task to assess different components of visual attention (orienting attention and maintenance of attention) in the processing of emotional faces. Compared to ND, participants with MDD showed a negative attentional bias in attentional maintenance indices (i.e. first fixation duration and total fixation time) for sad faces. This attentional bias was positively associated with the severity of depressive symptoms. Furthermore, the MDD group spent a marginally less amount of time viewing happy faces compared with the ND group. No differences were found between the groups with respect to angry faces and orienting attention indices. The current study is limited by its cross-sectional design. These results support the notion that attentional biases in depression are specific to depression-related information and that they operate in later stages in the deployment of attention. Copyright © 2014 Elsevier Ltd. All rights reserved.
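    The two attentional-maintenance indices named above (first fixation duration and total fixation time) can be computed directly from trial-level fixation records. The sketch below shows one way to do so; the record format and the toy numbers are assumptions, not the study's actual data pipeline.

```python
# Hedged sketch: per-emotion first fixation duration and total fixation time
# from a list of fixations. Toy data only.
from collections import defaultdict

# Each fixation: (trial, emotion_of_fixated_face, duration_ms), in temporal order.
fixations = [
    (1, "sad", 310), (1, "happy", 180), (1, "sad", 240),
    (2, "happy", 150), (2, "sad", 420),
]

first_fix = defaultdict(list)   # duration of the first fixation on each emotion per trial
total_fix = defaultdict(float)  # summed fixation time per emotion
seen = set()

for trial, emotion, duration in fixations:
    if (trial, emotion) not in seen:
        first_fix[emotion].append(duration)
        seen.add((trial, emotion))
    total_fix[emotion] += duration

for emotion in ("sad", "happy"):
    mean_first = sum(first_fix[emotion]) / len(first_fix[emotion])
    print(emotion, "mean first fixation:", mean_first, "ms; total:", total_fix[emotion], "ms")
```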

  4. Variations in the serotonin-transporter gene are associated with attention bias patterns to positive and negative emotion faces.

    Science.gov (United States)

    Pérez-Edgar, Koraly; Bar-Haim, Yair; McDermott, Jennifer Martin; Gorodetsky, Elena; Hodgkinson, Colin A; Goldman, David; Ernst, Monique; Pine, Daniel S; Fox, Nathan A

    2010-03-01

    Both attention biases to threat and a serotonin-transporter gene polymorphism (5-HTTLPR) have been linked to heightened neural activation to threat and the emergence of anxiety. The short allele of 5-HTTLPR may act via its effect on neurotransmitter availability, while attention biases shape broad patterns of cognitive processing. We examined individual differences in attention bias to emotion faces as a function of 5-HTTLPR genotype. Adolescents (N=117) were classified for presumed SLC6A4 expression based on 5-HTTLPR-low (SS, SL(G), or L(G)L(G)), intermediate (SL(A) or L(A)L(G)), or high (L(A)L(A)). Participants completed the dot-probe task, measuring attention biases toward or away from angry and happy faces. Biases for angry faces increased with the genotype-predicted neurotransmission levels (low>intermediate>high). The reverse pattern was evident for happy faces. The data indicate a linear relation between 5-HTTLPR allelic status and attention biases to emotion, demonstrating a genetic mechanism for biased attention using ecologically valid stimuli that target socioemotional adaptation. Copyright 2009 Elsevier B.V. All rights reserved.
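    For concreteness, the genotype grouping listed in this abstract and a standard dot-probe bias score can be expressed as below. The genotype notation is simplified (SLg for SL(G), LaLa for L(A)L(A), etc.), the reaction times are invented, and the bias-score convention (congruent = probe replacing the emotional face) is a common one rather than a detail stated in the abstract.

```python
# Hedged sketch: 5-HTTLPR genotype grouping and a dot-probe attention bias score.
EXPRESSION_GROUP = {
    "SS": "low", "SLg": "low", "LgLg": "low",
    "SLa": "intermediate", "LaLg": "intermediate",
    "LaLa": "high",
}

def attention_bias(mean_rt_incongruent_ms: float, mean_rt_congruent_ms: float) -> float:
    """Positive values indicate attention toward the emotional face,
    assuming 'congruent' means the probe replaced the emotional face."""
    return mean_rt_incongruent_ms - mean_rt_congruent_ms

genotype = "SLg"                     # hypothetical participant
print(EXPRESSION_GROUP[genotype])    # low
print(attention_bias(512.0, 498.0))  # 14.0 ms bias toward the emotional face
```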

  5. Gender differences in the relationship between social communication and emotion recognition.

    Science.gov (United States)

    Kothari, Radha; Skuse, David; Wakefield, Justin; Micali, Nadia

    2013-11-01

    To investigate the association between autistic traits and emotion recognition in a large community sample of children using facial and social motion cues, additionally stratifying by gender. A general population sample of 3,666 children from the Avon Longitudinal Study of Parents and Children (ALSPAC) were assessed on their ability to correctly recognize emotions using the faces subtest of the Diagnostic Analysis of Non-Verbal Accuracy, and the Emotional Triangles Task, a novel test assessing recognition of emotion from social motion cues. Children with autistic-like social communication difficulties, as assessed by the Social Communication Disorders Checklist, were compared with children without such difficulties. Autistic-like social communication difficulties were associated with poorer recognition of emotion from social motion cues in both genders, but were associated with poorer facial emotion recognition in boys only (odds ratio = 1.9, 95% CI = 1.4, 2.6, p = .0001). This finding must be considered in light of lower power to detect differences in girls. In this community sample of children, greater deficits in social communication skills are associated with poorer discrimination of emotions, implying there may be an underlying continuum of liability to the association between these characteristics. As a similar degree of association was observed in both genders on a novel test of social motion cues, the relatively good performance of girls on the more familiar task of facial emotion discrimination may be due to compensatory mechanisms. Our study might indicate the existence of a cognitive process by which girls with underlying autistic traits can compensate for their covert deficits in emotion recognition, although this would require further investigation. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  6. [Negative symptoms, emotion and cognition in schizophrenia].

    Science.gov (United States)

    Fakra, E; Belzeaux, R; Azorin, J-M; Adida, M

    2015-12-01

    For a long time, treatment of schizophrenia has been essentially focussed on managing positive symptoms. Yet, even if these symptoms are the most noticeable, negative symptoms are more enduring, resistant to pharmacological treatment and associated with a worse prognosis. In the last two decades, attention has shifted towards cognitive deficits, as these deficits are most robustly associated with functional outcome. But it appears that the modest improvement in cognition obtained in schizophrenia through pharmacological treatment or, more purposely, by cognitive enhancement therapy has led only to limited amelioration of functional outcome. Authors have claimed that pure cognitive processes, such as those evaluated and trained in many of these programs, may be too distant from real-life conditions, as the latter are largely based on social interactions. Consequently, the field of social cognition, at the interface of cognition and emotion, has emerged. In the first part of this article we examine the links, in schizophrenia, between negative symptoms, cognition and emotions from a therapeutic standpoint. Nonetheless, investigation of emotion in schizophrenia may also hold relevant promise for understanding the physiopathology of this disorder. In the second part, we propose to illustrate this research by relying on the heuristic value of an elementary marker of social cognition, facial affect recognition. Facial affect recognition has repeatedly been reported to be impaired in schizophrenia, and some authors have argued that this deficit could constitute an endophenotype of the illness. We here examine how facial affect processing has been used to explore broader emotion dysfunction in schizophrenia, through behavioural and imaging studies. In particular, fMRI paradigms using facial affect have shown particular patterns of amygdala engagement in schizophrenia, suggesting an intact potential to engage the limbic system which may however not be advantageous. Finally, we

  7. Differential Interactions between Identity and Emotional Expression in Own and Other-Race Faces: Effects of Familiarity Revealed through Redundancy Gains

    Science.gov (United States)

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

    We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and…

  8. Oxytocin improves emotion recognition for older males.

    Science.gov (United States)

    Campbell, Anna; Ruffman, Ted; Murray, Janice E; Glue, Paul

    2014-10-01

    Older adults (≥60 years) perform worse than young adults (18-30 years) when recognizing facial expressions of emotion. The hypothesized cause of these changes might be declines in neurotransmitters that could affect information processing within the brain. In the present study, we examined the neuropeptide oxytocin that functions to increase neurotransmission. Research suggests that oxytocin benefits the emotion recognition of less socially able individuals. Men tend to have lower levels of oxytocin and older men tend to have worse emotion recognition than older women; therefore, there is reason to think that older men will be particularly likely to benefit from oxytocin. We examined this idea using a double-blind design, testing 68 older and 68 young adults randomly allocated to receive oxytocin nasal spray (20 international units) or placebo. Forty-five minutes afterward they completed an emotion recognition task assessing labeling accuracy for angry, disgusted, fearful, happy, neutral, and sad faces. Older males receiving oxytocin showed improved emotion recognition relative to those taking placebo. No differences were found for older females or young adults. We hypothesize that oxytocin facilitates emotion recognition by improving neurotransmission in the group with the worst emotion recognition. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Alexithymia and the processing of emotional facial expressions (EFEs): systematic review, unanswered questions and further perspectives.

    Directory of Open Access Journals (Sweden)

    Delphine Grynberg

    Full Text Available Alexithymia is characterized by difficulties in identifying, differentiating and describing feelings. A high prevalence of alexithymia has often been observed in clinical disorders characterized by low social functioning. This review aims to assess the association between alexithymia and the ability to decode emotional facial expressions (EFEs) within clinical and healthy populations. More precisely, this review has four main objectives: (1) to assess whether alexithymia is a better predictor of the ability to decode EFEs than the diagnosis of a clinical disorder; (2) to assess the influence of comorbid factors (depression and anxiety disorder) on the ability to decode EFEs; (3) to investigate whether deficits in decoding EFEs are specific to certain levels of processing or task types; (4) to investigate whether the deficits are specific to particular EFEs. Twenty-four studies (behavioural and neuroimaging) were identified through a computerized literature search of the PsycINFO, PubMed, and Web of Science databases from 1990 to 2010. Data on methodology, clinical characteristics, and possible confounds were analyzed. The review revealed that: (1) alexithymia is associated with deficits in labelling EFEs among clinical disorders; (2) the level of depression and anxiety partially accounts for the decoding deficits; (3) alexithymia is associated with reduced perceptual abilities, and is likely to be associated with impaired semantic representations of emotional concepts; and (4) alexithymia is associated with neither specific EFEs nor a specific valence. These studies are discussed with respect to processes involved in the recognition of EFEs. Future directions for research on emotion perception are also discussed.

  10. Characterization and recognition of mixed emotional expressions in thermal face image

    Science.gov (United States)

    Saha, Priya; Bhattacharjee, Debotosh; De, Barin K.; Nasipuri, Mita

    2016-05-01

    Facial expressions in infrared imaging have been introduced to solve the problem of illumination, which is an integral constituent of visual imagery. The paper investigates facial skin temperature distribution on mixed thermal facial expressions in our created face database, in which six are basic expressions and the remaining 12 are mixtures of those basic expressions. Temperature analysis has been performed on three facial regions of interest (ROIs): periorbital, supraorbital and mouth. Temperature variability of the ROIs in different expressions has been measured using statistical parameters. The temperature variation measurements in the ROIs of a particular expression correspond to a vector, which is later used in the recognition of mixed facial expressions. Investigations show that facial features in mixed facial expressions can be characterized by positive-emotion-induced facial features and negative-emotion-induced facial features. The supraorbital region is a useful facial region that can differentiate basic expressions from mixed expressions. Analysis and interpretation of mixed expressions have been conducted with the help of box-and-whisker plots. A facial region containing a mixture of two expressions generally induces less temperature change than the corresponding facial region containing a basic expression.
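    One plausible reading of "temperature variation measurements in the ROIs correspond to a vector" is a fixed-length feature vector of simple statistics per ROI. The abstract does not say which statistics were used, so the mean, standard deviation, and range below are illustrative choices only.

```python
# Hedged sketch: per-ROI temperature statistics stacked into a feature vector.
# Which statistics the study actually used is not specified in the abstract.
import statistics
from typing import Dict, List

def roi_features(temps_by_roi: Dict[str, List[float]]) -> List[float]:
    vector = []
    for roi in ("periorbital", "supraorbital", "mouth"):   # fixed ROI order
        temps = temps_by_roi[roi]
        vector += [statistics.mean(temps),
                   statistics.pstdev(temps),
                   max(temps) - min(temps)]
    return vector

sample = {
    "periorbital":  [34.1, 34.4, 34.2],
    "supraorbital": [33.2, 33.5, 33.9],
    "mouth":        [32.8, 33.6, 33.1],
}
print(roi_features(sample))   # 9-dimensional vector usable by a classifier
```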

  11. Proactive and reactive control depends on emotional valence: a Stroop study with emotional expressions and words.

    Science.gov (United States)

    Kar, Bhoomika Rastogi; Srinivasan, Narayanan; Nehabala, Yagyima; Nigam, Richa

    2018-03-01

    We examined proactive and reactive control effects in the context of task-relevant happy, sad, and angry facial expressions on a face-word Stroop task. Participants identified the emotion expressed by a face that contained a congruent or incongruent emotional word (happy/sad/angry). Proactive control effects were measured in terms of the reduction in Stroop interference (difference between incongruent and congruent trials) as a function of previous trial emotion and previous trial congruence. Reactive control effects were measured in terms of the reduction in Stroop interference as a function of current trial emotion and previous trial congruence. Previous trial negative emotions exert greater influence on proactive control than the positive emotion. Sad faces in the previous trial resulted in greater reduction in the Stroop interference for happy faces in the current trial. However, current trial angry faces showed stronger adaptation effects compared to happy faces. Thus, both proactive and reactive control mechanisms are dependent on emotional valence of task-relevant stimuli.
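    The interference and sequence measures described above reduce to simple reaction-time differences conditioned on trial history. The sketch below computes them from invented trial records; it ignores the emotion factor for brevity, so it is an illustration of the arithmetic, not of the study's full design.

```python
# Hedged sketch: Stroop interference and congruence-sequence (conflict
# adaptation) effects from trial-level data. Toy reaction times only.
from statistics import mean

# Each trial: (congruent: bool, rt_ms: float), in presentation order.
trials = [(True, 600), (False, 720), (False, 650), (True, 615),
          (True, 605), (False, 725), (True, 620), (False, 660)]

def interference(subset):
    inc = [rt for congruent, rt in subset if not congruent]
    con = [rt for congruent, rt in subset if congruent]
    return mean(inc) - mean(con)   # RT(incongruent) - RT(congruent)

# Split current trials by the congruence of the preceding trial.
after_congruent   = [trials[i] for i in range(1, len(trials)) if trials[i - 1][0]]
after_incongruent = [trials[i] for i in range(1, len(trials)) if not trials[i - 1][0]]

print("overall interference:", round(interference(trials), 1))
print("after congruent trials:", round(interference(after_congruent), 1))
print("after incongruent trials:", round(interference(after_incongruent), 1))
# Smaller interference after incongruent trials is the conflict-adaptation signature.
```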

  12. Patterns of feelings in face to face negotiation: a Sino-Dutch pilot study

    NARCIS (Netherlands)

    Ulijn, J.M.; Rutkowski, A.F.; Kumar, Rajesh; Zhu, Y.

    2005-01-01

    We conducted a pilot study to compare the emotions experienced by Dutch and Chinese students during a face-to-face negotiation role play. Emotions play an important role in negotiations because they influence the behaviour and judgments of negotiators. The Data Printer case developed by Greenhalgh

  13. The human body odor compound androstadienone leads to anger-dependent effects in an emotional Stroop but not dot-probe task using human faces.

    Science.gov (United States)

    Hornung, Jonas; Kogler, Lydia; Wolpert, Stephan; Freiherr, Jessica; Derntl, Birgit

    2017-01-01

    The androgen derivative androstadienone is a substance found in human sweat and thus is a putative human chemosignal. Androstadienone has been studied with respect to effects on mood states, attractiveness ratings, physiological and neural activation. With the current experiment, we aimed to explore in which way androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether effects depend on specific emotions, the participants' sex and individual sensitivity to smell androstadienone. To do so, we investigated 56 healthy individuals (thereof 29 females taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone, once under placebo exposure in pseudorandomized order). With an emotional dot-probe task we measured visuo-spatial cueing while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex, task and emotion-specific manner as a reduction in interference processes in the emotional Stroop task was only apparent for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times for congruent compared to incongruent trials. At the same time also women were slightly affected by smelling androstadienone as they classified angry faces more often correctly under androstadienone. For the emotional dot-probe task no modulation by androstadienone was observed. Furthermore, in both attention paradigms individual sensitivity to androstadienone was neither correlated with reaction times nor error rates in men and women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while processes of visuo-spatial cueing remain unaffected.

  14. Alexithymia is associated with attenuated automatic brain response to facial emotion in clinical depression.

    Science.gov (United States)

    Suslow, Thomas; Kugel, Harald; Rufer, Michael; Redlich, Ronny; Dohm, Katharina; Grotegerd, Dominik; Zaremba, Dario; Dannlowski, Udo

    2016-02-04

    Alexithymia is a clinically relevant personality trait related to difficulties in recognizing and describing emotions. Previous studies examining the neural correlates of alexithymia have shown mainly decreased response of several brain areas during emotion processing in healthy samples and patients suffering from autism or post-traumatic stress disorder. In the present study, we examined the effect of alexithymia on automatic brain reactivity to negative and positive facial expressions in clinical depression. Brain activation in response to sad, happy, neutral, and no facial expression (presented for 33 ms and masked by neutral faces) was measured by functional magnetic resonance imaging at 3 T in 26 alexithymic and 26 non-alexithymic patients with major depression. Alexithymic patients manifested less activation in response to masked sad and happy (compared to neutral) faces in right frontal regions and right caudate nuclei than non-alexithymic patients. Our neuroimaging study provides evidence that the personality trait alexithymia has a modulating effect on automatic emotion processing in clinical depression. Our findings support the idea that alexithymia could be associated with functional deficits of the right hemisphere. Future research on the neural substrates of emotion processing in depression should assess and control alexithymia in their analyses.

  15. Open-label administration of lisdexamfetamine dimesylate improves executive function impairments and symptoms of attention-deficit/hyperactivity disorder in adults.

    Science.gov (United States)

    Brown, Thomas E; Brams, Matthew; Gao, Joseph; Gasior, Maria; Childress, Ann

    2010-09-01

    Executive function (EF) impairment in attention-deficit/hyperactivity disorder (ADHD) may account for behavioral symptoms such as poor concentration, impaired working memory, problems in shifting among tasks, and prioritizing and planning complex sets of tasks or completing long-term projects at work or school. Poor self-regulation and control of emotional behaviors frequently are seen in patients with ADHD. This study assessed EF behaviors in adults with ADHD at baseline and after 4 weeks of treatment with lisdexamfetamine dimesylate (LDX). Executive function behavior was assessed using the Brown Attention-Deficit Disorder Scale (BADDS) during the 4-week open-label dose-optimization phase prior to a 2-period, randomized, double-blind, placebo-controlled crossover study of LDX (30-70 mg/day). The ADHD Rating Scale IV (ADHD-RS-IV) with adult prompts assessed ADHD symptoms. Change in EF behavioral symptoms was evaluated based on week 4 BADDS total and cluster scores; analyses of shifts from baseline among subjects with BADDS scores < 50, 50 to 59, 60 to 69, and ≥ 70; and scores less than or greater than baseline 90% confidence range (eg, reliably improved or worsened, respectively). Treatment-emergent adverse events (TEAEs) were described. At week 4, BADDS total and cluster scores were reduced (ie, improved; all P < 0.0001 vs baseline [n = 127]). The ADHD-RS-IV with adult prompts scores also improved (all P < 0.0001 vs baseline). At week 4, 62.7% of subjects had a BADDS total score of < 50, and 78.9% were reliably improved; 1.4% were reliably worsened. Common TEAEs (≥ 5%) during the dose-optimization phase were decreased appetite (36.6%), dry mouth (30.3%), headache (19.7%), insomnia (18.3%), upper respiratory tract infection (9.9%), irritability (8.5%), nausea (7.7%), anxiety (5.6%), and feeling jittery (5.6%). Clinically optimized doses of LDX (30-70 mg/day) significantly improved EF behaviors in adults with ADHD. Treatment-emergent adverse events with LDX were
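    The shift analysis described above (scores falling below or above a 90% confidence range around baseline) amounts to a simple reliable-change classification. The abstract does not give the width of that range, so the half-width in the sketch below is a named assumption, as is the example subject.

```python
# Hedged sketch: classifying a week-4 BADDS total against a 90% confidence
# range around baseline. The half-width value is an assumption.
def reliable_change(baseline: float, week4: float, half_width: float) -> str:
    lower, upper = baseline - half_width, baseline + half_width
    if week4 < lower:
        return "reliably improved"     # lower BADDS totals mean fewer EF problems
    if week4 > upper:
        return "reliably worsened"
    return "within baseline confidence range"

# Hypothetical subject: baseline 78, week-4 score 49, assumed half-width 12.
print(reliable_change(78, 49, half_width=12))   # reliably improved
```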

  16. Abnormal GABAergic function and face processing in schizophrenia: A pharmacologic-fMRI study.

    Science.gov (United States)

    Tso, Ivy F; Fang, Yu; Phan, K Luan; Welsh, Robert C; Taylor, Stephan F

    2015-10-01

    The involvement of the gamma-aminobutyric acid (GABA) system in schizophrenia is suggested by postmortem studies and the common use of GABA receptor-potentiating agents in treatment. In a recent study, we used a benzodiazepine challenge to demonstrate abnormal GABAergic function during processing of negative visual stimuli in schizophrenia. This study extended this investigation by mapping GABAergic mechanisms associated with face processing and social appraisal in schizophrenia using a benzodiazepine challenge. Fourteen stable, medicated schizophrenia/schizoaffective patients (SZ) and 13 healthy controls (HC) underwent functional MRI using the blood oxygenation level-dependent (BOLD) technique while they performed the Socio-emotional Preference Task (SePT) on emotional face stimuli ("Do you like this face?"). Participants received single-blinded intravenous saline and lorazepam (LRZ) in two separate sessions separated by 1-3 weeks. Both SZ and HC recruited medial prefrontal cortex/anterior cingulate during the SePT, relative to gender identification. A significant drug by group interaction was observed in the medial occipital cortex, such that SZ showed increased BOLD signal to LRZ challenge, while HC showed an expected decrease of signal; the interaction did not vary by task. The altered BOLD response to LRZ challenge in SZ was significantly correlated with increased negative affect across multiple measures. The altered response to LRZ challenge suggests that abnormal face processing and negative affect in SZ are associated with altered GABAergic function in the visual cortex, underscoring the role of impaired visual processing in socio-emotional deficits in schizophrenia. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Speech-based emotion detection in a resource-scarce environment

    CSIR Research Space (South Africa)

    Martirosian, O

    2007-11-01

    Full Text Available ..., happiness and frustration; passive emotion encompasses sadness and disappointment, and neutral encompasses speech with a negligible amount of emotional content. Because a study on the expression of emotion in speech has not been done in the South... seconds long and the segments labelled with the dominant emotion of the speech contained in them. The fine emotional labels used were angry, frustrated, happy, friendly, neutral, sad and depressed. These fine labels were combined into three broad...
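
    A minimal sketch of the label-collapsing scheme this record describes, mapping the fine segment labels onto the three broad classes (active, passive, neutral). The assignment of "friendly" and "depressed" is inferred from the surrounding text and should be read as an assumption rather than the study's exact mapping.

      # Hypothetical mapping from fine segment labels to the three broad classes.
      FINE_TO_BROAD = {
          "angry": "active",
          "frustrated": "active",
          "happy": "active",
          "friendly": "active",    # assumed to pattern with happiness
          "neutral": "neutral",
          "sad": "passive",
          "depressed": "passive",  # assumed to pattern with sadness
      }

      def collapse_labels(fine_labels):
          """Map a sequence of fine-grained segment labels to broad emotion classes."""
          return [FINE_TO_BROAD[label] for label in fine_labels]

      print(collapse_labels(["angry", "sad", "neutral", "happy"]))
      # -> ['active', 'passive', 'neutral', 'active']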

  18. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    Science.gov (United States)

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability of identifying, processing and regulating emotions from social stimuli is generally referred as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and the SMC activity during social behavior is region- and emotion-specific.

  19. Does a single session of electroconvulsive therapy alter the neural response to emotional faces in depression? A randomised sham-controlled functional magnetic resonance imaging study

    DEFF Research Database (Denmark)

    Miskowiak, Kamilla W; Kessing, Lars V; Ott, Caroline V

    2017-01-01

    ... neurocognitive bias in major depressive disorder. Patients with major depressive disorder were randomised to one active (n = 15) or sham electroconvulsive therapy (n = 12). The following day they underwent whole-brain functional magnetic resonance imaging at 3T while viewing emotional faces and performed facial... expression recognition and dot-probe tasks. A single electroconvulsive therapy session had no effect on amygdala response to emotional faces. Whole-brain analysis revealed no effects of electroconvulsive therapy versus sham therapy after family-wise error correction at the cluster level, using a cluster... to faces after a single electroconvulsive therapy session, the observed trend changes after a single electroconvulsive therapy session point to an early shift in emotional processing that may contribute to antidepressant effects of electroconvulsive therapy.

  20. Emotion Regulation in Adolescent Males with Attention-Deficit Hyperactivity Disorder: Testing the Effects of Comorbid Conduct Disorder

    Directory of Open Access Journals (Sweden)

    Clare Northover

    2015-09-01

    Full Text Available Although attention-deficit hyperactivity disorder (ADHD) has been linked to emotion dysregulation, few studies have experimentally investigated this whilst controlling for the effects of comorbid conduct disorder (CD). Economic decision-making games that assess how individuals respond to offers varying in fairness have been used to study emotion regulation. The present study compared adolescent boys with ADHD (n = 90), ADHD + CD (n = 94) and typical controls (n = 47) on the Ultimatum Game and examined the contribution of ADHD and CD symptom scores and callous and unemotional traits to acceptance levels of unfair offers. There were no significant differences in acceptance rates of fair and highly unfair offers between groups, and only boys with ADHD did not significantly differ from the controls. However, the subgroup of boys with ADHD and additional high levels of aggressive CD symptoms rejected significantly more ambiguous (i.e., moderately unfair) offers than any other subgroup, suggesting impaired emotion regulation in those with ADHD and aggressive CD. Correlations within the CD group showed that the rejection rate to moderately unfair offers was predicted by aggressive CD symptom severity, but not callous and unemotional traits. These findings highlight the fact that ADHD is a heterogeneous condition from an emotion regulation point of view.

  1. Emotion Regulation in Adolescent Males with Attention-Deficit Hyperactivity Disorder: Testing the Effects of Comorbid Conduct Disorder.

    Science.gov (United States)

    Northover, Clare; Thapar, Anita; Langley, Kate; van Goozen, Stephanie

    2015-09-07

    Although attention-deficit hyperactivity disorder (ADHD) has been linked to emotion dysregulation, few studies have experimentally investigated this whilst controlling for the effects of comorbid conduct disorder (CD). Economic decision-making games that assess how individuals respond to offers varying in fairness have been used to study emotion regulation. The present study compared adolescent boys with ADHD (n = 90), ADHD + CD (n = 94) and typical controls (n = 47) on the Ultimatum Game and examined the contribution of ADHD and CD symptom scores and callous and unemotional traits to acceptance levels of unfair offers. There were no significant differences in acceptance rates of fair and highly unfair offers between groups, and only boys with ADHD did not significantly differ from the controls. However, the subgroup of boys with ADHD and additional high levels of aggressive CD symptoms rejected significantly more ambiguous (i.e., moderately unfair) offers than any other subgroup, suggesting impaired emotion regulation in those with ADHD and aggressive CD. Correlations within the CD group showed that the rejection rate to moderately unfair offers was predicted by aggressive CD symptom severity, but not callous and unemotional traits. These findings highlight the fact that ADHD is a heterogeneous condition from an emotion regulation point of view.

  2. Emotion Knowledge and Attentional Differences in Preschoolers Showing Context-Inappropriate Anger.

    Science.gov (United States)

    Locke, Robin L; Lang, Nichole J

    2016-08-01

    Some children show anger inappropriate for the situation based on the predominant incentives, which is called context-inappropriate anger. Children need to attend to and interpret situational incentives for appropriate emotional responses. We examined associations of context-inappropriate anger with emotion recognition and attention problems in 43 preschoolers (42% male; M age = 55.1 months, SD = 4.1). Parents rated context-inappropriate anger across situations. Teachers rated attention problems using the Child Behavior Checklist-Teacher Report Form. Emotion recognition was assessed as the ability to recognize emotional faces using the Emotion Matching Test. Anger perception bias was indexed by anger attributed to non-anger situations using an adapted Affect Knowledge Test. Twenty-eight percent of children showed context-inappropriate anger, which correlated with lower emotion recognition (β = -.28) and higher attention problems (β = .36). Higher attention problems correlated with more anger perception bias (β = .32). This cross-sectional, correlational study provides preliminary findings that children with context-inappropriate anger showed more attention problems, which suggests that both "problems" tend to covary and associate with deficits or biases in emotion knowledge. © The Author(s) 2016.

  3. Neurological soft signs, but not theory of mind and emotion recognition deficit distinguished children with ADHD from healthy control.

    Science.gov (United States)

    Pitzianti, Mariabernarda; Grelloni, Clementina; Casarelli, Livia; D'Agati, Elisa; Spiridigliozzi, Simonetta; Curatolo, Paolo; Pasini, Augusto

    2017-10-01

    Attention Deficit Hyperactivity Disorder (ADHD) is associated with social cognition impairment, executive dysfunction and motor abnormalities, consisting of the persistence of neurological soft signs (NSS). The theory of mind (ToM) and emotion recognition (ER) deficits of children with ADHD have been interpreted as a consequence of their executive dysfunction, particularly inhibitory control deficit. To our knowledge, there are no studies that evaluate the possible correlation between ToM and ER deficits and NSS in the population with ADHD, while this association has been studied in other psychiatric disorders, such as schizophrenia. Therefore, the aim of this study was to evaluate ToM, ER and NSS in a sample of 23 drug-naïve children with ADHD and a sample of 20 healthy children, and the possible correlation between social cognition dysfunction and NSS in ADHD. Our findings suggest that ToM and ER dysfunction is not a constant feature in the population with ADHD, while NSS were confirmed as markers of atypical neurodevelopment and predictors of the severity of functional impairment in children with ADHD. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Motor, visual and emotional deficits in mice after closed-head mild traumatic brain injury are alleviated by the novel CB2 inverse agonist SMM-189.

    Science.gov (United States)

    Reiner, Anton; Heldt, Scott A; Presley, Chaela S; Guley, Natalie H; Elberger, Andrea J; Deng, Yunping; D'Surney, Lauren; Rogers, Joshua T; Ferrell, Jessica; Bu, Wei; Del Mar, Nobel; Honig, Marcia G; Gurley, Steven N; Moore, Bob M

    2014-12-31

    We have developed a focal blast model of closed-head mild traumatic brain injury (TBI) in mice. As true for individuals that have experienced mild TBI, mice subjected to 50-60 psi blast show motor, visual and emotional deficits, diffuse axonal injury and microglial activation, but no overt neuron loss. Because microglial activation can worsen brain damage after a concussive event and because microglia can be modulated by their cannabinoid type 2 receptors (CB2), we evaluated the effectiveness of the novel CB2 receptor inverse agonist SMM-189 in altering microglial activation and mitigating deficits after mild TBI. In vitro analysis indicated that SMM-189 converted human microglia from the pro-inflammatory M1 phenotype to the pro-healing M2 phenotype. Studies in mice showed that daily administration of SMM-189 for two weeks beginning shortly after blast greatly reduced the motor, visual, and emotional deficits otherwise evident after 50-60 psi blasts, and prevented brain injury that may contribute to these deficits. Our results suggest that treatment with the CB2 inverse agonist SMM-189 after a mild TBI event can reduce its adverse consequences by beneficially modulating microglial activation. These findings recommend further evaluation of CB2 inverse agonists as a novel therapeutic approach for treating mild TBI.

  5. Linerless label device and method

    KAUST Repository

    Binladen, Abdulkari

    2016-01-14

    This apparatus and method for applying a linerless label to an end user product includes a device with a printer for printing on a face surface of a linerless label, and a release coat applicator for applying a release coat to the face surface of the label; another device including an unwinder unit (103) to unwind a roll of printed linerless label; a belt (108); a glue applicator (102) for applying glue to the belt; a nip roller (106) for contacting and applying pressure to the face surface of the linerless label such that the glue on the belt transfers to the back surface of the linerless label; at least one slitting knife (105) positioned downstream of the belt and a rewinder unit (104) positioned downstream of the slitting knife; and a third device which die-cuts and applies the linerless label to an end user object.

  6. Pleasure Experience and Emotion Expression in Patients with Schizophrenia

    Science.gov (United States)

    CHU, Min-yi; LI, Xu; LV, Qin-yu; YI, Zheng-hui; CHEUNG, Eric F. C.; CHAN, Raymond C. K.

    2017-01-01

    Background Impairments in emotional experience and expression have been observed in patients with schizophrenia. However, most previous studies have been limited to either emotional experience (especially anhedonia) or expression. Few studies have examined both the experience and expression of emotion in schizophrenia patients at the same time. Aims The present study aimed to examine pleasure experience and emotion expression in patients with schizophrenia. In particular, we specifically examined the relationship between emotion impairments (both pleasure experience and expression) and negative symptoms. Methods One hundred and fifty patients completed the Temporal Experience of Pleasure Scale and Emotional Expressivity Scale. Results Schizophrenia patients exhibited deficits in experiencing pleasure, but showed intact reported emotion expression. Patients with prominent negative symptoms showed reduced anticipatory pleasure, especially in abstract anticipatory pleasure. Conclusion The present findings suggest that patients with schizophrenia have deficits in pleasure experience, while their abilities to express emotion appear intact. Such deficits are more severe in patients with prominent negative symptoms. PMID:29276350

  7. The Influence of Music on Facial Emotion Recognition in Children with Autism Spectrum Disorder and Neurotypical Children.

    Science.gov (United States)

    Brown, Laura S

    2017-03-01

    Children with autism spectrum disorder (ASD) often struggle with social skills, including the ability to perceive emotions based on facial expressions. Research evidence suggests that many individuals with ASD can perceive emotion in music. Examining whether music can be used to enhance recognition of facial emotion by children with ASD would inform development of music therapy interventions. The purpose of this study was to investigate the influence of music with a strong emotional valence (happy; sad) on the ability of children with ASD to label emotions depicted in facial photographs, and their response time. Thirty neurotypical children and 20 children with high-functioning ASD rated expressions of happy, neutral, and sad in 30 photographs under two music listening conditions (sad music; happy music). During each music listening condition, participants rated the 30 images using a 7-point scale that ranged from very sad to very happy. Response time data were also collected across both conditions. A significant two-way interaction revealed that participants' ratings of happy and neutral faces were unaffected by music conditions, but sad faces were perceived to be sadder with sad music than with happy music. Across both conditions, neurotypical children rated the happy faces as happier and the sad faces as sadder than did participants with ASD. Response times of the neurotypical children were consistently shorter than response times of the children with ASD; both groups took longer to rate sad faces than happy faces. Response times of neurotypical children were generally unaffected by the valence of the music condition; however, children with ASD took longer to respond when listening to sad music. Music appears to affect perceptions of emotion in children with ASD, and perceptions of sad facial expressions seem to be more affected by emotionally congruent background music than are perceptions of happy or neutral faces. © the American Music Therapy Association 2016

  8. Facial emotional recognition in schizophrenia: preliminary results of the virtual reality program for facial emotional recognition

    Directory of Open Access Journals (Sweden)

    Teresa Souto

    2013-01-01

    Full Text Available BACKGROUND: Significant deficits in emotional recognition and social perception characterize patients with schizophrenia and have a direct negative impact both on interpersonal relationships and on social functioning. Virtual reality, as a methodological resource, may have high potential for the assessment and training of such skills in people suffering from mental illness. OBJECTIVES: To present preliminary results of a facial emotional recognition assessment designed for patients with schizophrenia, using 3D avatars and virtual reality. METHODS: Presentation of 3D avatars which reproduce images developed with the FaceGen® software and integrated in a three-dimensional virtual environment. Each avatar was presented to a group of 12 patients with schizophrenia and a reference group of 12 subjects without psychiatric pathology. RESULTS: The results show that the facial emotions of happiness and anger are better recognized by both groups and that the major difficulties arise in fear and disgust recognition. Frontal alpha electroencephalography variations were found during the presentation of anger and disgust stimuli among patients with schizophrenia. DISCUSSION: The assessment module of the program can be of added value for both patient and therapist, allowing task execution in a non-anxiogenic environment that nevertheless resembles the actual experience.

  9. Extending extant models of the pathogenesis of borderline personality disorder to childhood borderline personality symptoms: the roles of affective dysfunction, disinhibition, and self- and emotion-regulation deficits.

    Science.gov (United States)

    Gratz, Kim L; Tull, Matthew T; Reynolds, Elizabeth K; Bagge, Courtney L; Latzman, Robert D; Daughters, Stacey B; Lejuez, C W

    2009-01-01

    Although research has been conducted on the course, consequences, and correlates of borderline personality disorder (BPD), little is known about its emergence in childhood, and no studies have examined the extent to which theoretical models of the pathogenesis of BPD in adults are applicable to the correlates of borderline personality symptoms in children. The goal of this study was to examine the interrelationships between two BPD-relevant personality traits (affective dysfunction and disinhibition), self- and emotion-regulation deficits, and childhood borderline personality symptoms among 263 children aged 9 to 13. We predicted that affective dysfunction, disinhibition, and their interaction would be associated with childhood borderline personality symptoms, and that self- and emotion-regulation deficits would mediate these relationships. Results provided support for the roles of both affective dysfunction and disinhibition (in the form of sensation seeking) in childhood borderline personality symptoms, as well as their hypothesized interaction. Further, both self- and emotion-regulation deficits partially mediated the relationship between affective dysfunction and childhood borderline personality symptoms. Finally, results provided evidence of different gender-based pathways to childhood borderline personality symptoms, suggesting that models of BPD among adults are more relevant to understanding the factors associated with borderline personality symptoms among girls than boys.

  10. From neural signatures of emotional modulation to social cognition: individual differences in healthy volunteers and psychiatric participants

    Science.gov (United States)

    Aguado, Jaume; Baez, Sandra; Huepe, David; Lopez, Vladimir; Ortega, Rodrigo; Sigman, Mariano; Mikulan, Ezequiel; Lischinsky, Alicia; Torrente, Fernando; Cetkovich, Marcelo; Torralva, Teresa; Bekinschtein, Tristan; Manes, Facundo

    2014-01-01

    It is commonly assumed that early emotional signals provide relevant information for social cognition tasks. The goal of this study was to test the association between (a) cortical markers of face emotional processing and (b) social-cognitive measures, and also to build a model which can predict this association (a and b) in healthy volunteers as well as in different groups of psychiatric patients. Thus, we investigated the early cortical processing of emotional stimuli (N170, using a face and word valence task) and their relationship with the social-cognitive profiles (SCPs, indexed by measures of theory of mind, fluid intelligence, speed processing and executive functions). Group comparisons and individual differences were assessed among schizophrenia (SCZ) patients and their relatives, individuals with attention deficit hyperactivity disorder (ADHD), individuals with euthymic bipolar disorder (BD) and healthy participants (educational level, handedness, age and gender matched). Our results provide evidence of emotional N170 impairments in the affected groups (SCZ and relatives, ADHD and BD) as well as subtle group differences. Importantly, cortical processing of emotional stimuli predicted the SCP, as evidenced by a structural equation model analysis. This is the first study to report an association model of brain markers of emotional processing and SCP. PMID:23685775

  11. Craving Facebook? Behavioral addiction to online social networking and its association with emotion regulation deficits.

    Science.gov (United States)

    Hormes, Julia M; Kearns, Brianna; Timko, C Alix

    2014-12-01

    To assess disordered online social networking use via modified diagnostic criteria for substance dependence, and to examine its association with difficulties with emotion regulation and substance use. Cross-sectional survey study targeting undergraduate students. Associations between disordered online social networking use, internet addiction, deficits in emotion regulation and alcohol use problems were examined using univariate and multivariate analyses of covariance. A large University in the Northeastern United States. Undergraduate students (n = 253, 62.8% female, 60.9% white, age mean = 19.68, standard deviation = 2.85), largely representative of the target population. The response rate was 100%. Disordered online social networking use, determined via modified measures of alcohol abuse and dependence, including DSM-IV-TR diagnostic criteria for alcohol dependence, the Penn Alcohol Craving Scale and the Cut-down, Annoyed, Guilt, Eye-opener (CAGE) screen, along with the Young Internet Addiction Test, Alcohol Use Disorders Identification Test, Acceptance and Action Questionnaire-II, White Bear Suppression Inventory and Difficulties in Emotion Regulation Scale. Disordered online social networking use was present in 9.7% [n = 23; 95% confidence interval (5.9, 13.4)] of the sample surveyed, and significantly and positively associated with scores on the Young Internet Addiction Test (P addictive. Modified measures of substance abuse and dependence are suitable in assessing disordered online social networking use. Disordered online social networking use seems to arise as part of a cluster of symptoms of poor emotion regulation skills and heightened susceptibility to both substance and non-substance addiction. © 2014 Society for the Study of Addiction.

  12. Burden and Expressed Emotion of Caregivers in Cases of Adult Substance Use Disorder with and Without Attention Deficit/Hyperactivity Disorder or Autism Spectrum Disorder

    NARCIS (Netherlands)

    Kronenberg, Linda M.; Goossens, Peter J. J.; van Busschbach, Jooske T.; van Achterberg, Theo; van den Brink, Wim

    Objective To identify and compare caregiver burden and expressed emotion (EE) in adult substance use disorder (SUD) patients with and without co-occurring attention deficit/hyperactivity disorder (ADHD) or autism spectrum disorder (ASD). To examine possible differences in correlations between

  13. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    Directory of Open Access Journals (Sweden)

    Tiziana Quarto

    Full Text Available The human ability of identifying, processing and regulating emotions from social stimuli is generally referred as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and the SMC activity during social behavior is region- and emotion-specific.

  14. A critical appraisal of the role of neuropsychological deficits in preschool ADHD.

    Science.gov (United States)

    Sjöwall, Douglas; Thorell, Lisa B

    2018-03-14

    The present study aimed at improving our understanding of the role of neuropsychological deficits in preschool Attention Deficit Hyperactivity Disorder (ADHD). The study included 52 children in the ADHD group and 72 controls (age 4-6 years). Both laboratory measures and teacher reports of executive deficits (i.e., working memory, inhibition, and shifting), delay-related behaviors (i.e., the preference for minimizing delay), and emotional functions (i.e., emotion recognition and regulation) were included. Variable-oriented analyses were complemented with person-oriented analyses (i.e., identifying the proportion of patients considered impaired). Results showed that the ADHD group differed from controls with regard to all measures of executive functioning and most measures of delay-related behaviors, but few differences were found for emotional functioning. A substantial subgroup (23%) of children with ADHD did not have a neuropsychological deficit in any domain. There were subgroups with executive or delay-related deficits only, but no pure emotional subgroup. The overlap between different neuropsychological deficits was much larger when teacher reports were used as opposed to laboratory measures. Regarding functional impairments, large mean differences were found between the ADHD group and controls. However, neuropsychological deficits were not able to explain individual variations in daily life functioning among children with ADHD. In conclusion, the present study identified some important methodological and theoretical issues regarding the role of neuropsychological functioning in preschool ADHD.
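
    To make the person-oriented analysis mentioned in this record concrete, the sketch below flags a child as impaired in a domain when the score falls more than a fixed number of standard deviations below the control mean, then computes the share of children with no deficit in any domain. The 1.5 SD cutoff and the example data are hypothetical; the study's actual impairment criterion is not given in this record.

      from statistics import mean, stdev

      def impaired(score, control_scores, cutoff_sd=1.5):
          """Impaired = score more than cutoff_sd SDs below the control mean (hypothetical rule)."""
          m, s = mean(control_scores), stdev(control_scores)
          return score < m - cutoff_sd * s

      def proportion_without_deficit(children, controls, domains):
          """Share of children not impaired in any of the listed domains."""
          unimpaired = sum(
              1 for child in children
              if not any(impaired(child[d], controls[d]) for d in domains)
          )
          return unimpaired / len(children)

      # Hypothetical scores for two neuropsychological domains.
      controls = {"working_memory": [10, 11, 9, 12, 10], "inhibition": [8, 9, 7, 10, 9]}
      children = [{"working_memory": 6, "inhibition": 8}, {"working_memory": 10, "inhibition": 9}]
      print(proportion_without_deficit(children, controls, ["working_memory", "inhibition"]))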

  15. Attention-Deficit/Hyperactivity Disorder: A Historical Review (1775 to Present).

    Science.gov (United States)

    Leahy, Laura G

    2017-09-01

    As a new school year approaches, nurses will find themselves faced with students with symptoms of attention-deficit/hyperactivity disorder (ADHD). Navigating the diagnostic label changes and numerous psychopharmacological treatment options can prove time-consuming and confusing. The current article explores the early years of symptom identification, various diagnostic labels, and subsequent psychopharmacological treatments from psychostimulants to non-stimulant alternatives (including a prescription medical food). The current article also serves as a discussion guide for nurses and clinicians when providing education to patients and their loved ones, teachers, coaches, and others who may question the diagnosis and treatment of ADHD. This disorder can have a significant impact on one's ability to function within family, school, work, and social settings. A historical context is provided for the evolution of today's diagnostic criteria and the pharmacotherapy used in the treatment of ADHD. [Journal of Psychosocial Nursing and Mental Health Services, 55(9), 10-16.]. Copyright 2017, SLACK Incorporated.

  16. Oxytocin improves facial emotion recognition in young adults with antisocial personality disorder.

    Science.gov (United States)

    Timmermann, Marion; Jeung, Haang; Schmitt, Ruth; Boll, Sabrina; Freitag, Christine M; Bertsch, Katja; Herpertz, Sabine C

    2017-11-01

    Deficient facial emotion recognition has been suggested to underlie aggression in individuals with antisocial personality disorder (ASPD). As the neuropeptide oxytocin (OT) has been shown to improve facial emotion recognition, it might also exert beneficial effects in individuals who cause considerable harm to society. In a double-blind, randomized, placebo-controlled crossover trial, 22 individuals with ASPD and 29 healthy control (HC) subjects (matched for age, sex, intelligence, and education) were intranasally administered either OT (24 IU) or a placebo 45 min before participating in an emotion classification paradigm with fearful, angry, and happy faces. We assessed the number of correct classifications and reaction times as indicators of emotion recognition ability. Significant group × substance × emotion interactions were found in correct classifications and reaction times. Compared to HC, individuals with ASPD showed deficits in recognizing fearful and happy faces; these group differences were no longer observable under OT. Additionally, reaction times for angry faces differed significantly between the ASPD and HC groups in the placebo condition. This effect was mainly driven by longer reaction times in HC subjects after placebo administration compared to OT administration, whereas individuals with ASPD showed, descriptively, the opposite response pattern. Our data indicate an improvement in the recognition of fearful and happy facial expressions by OT in young adults with ASPD. Particularly the increased recognition of facial fear is of high importance, since the correct perception of distress signals in others is thought to inhibit aggression. Beneficial effects of OT might be further mediated by improved recognition of facial happiness, probably reflecting increased social reward responsiveness. Copyright © 2017. Published by Elsevier Ltd.

  17. Subthalamic nucleus stimulation impairs emotional conflict adaptation in Parkinson's disease.

    Science.gov (United States)

    Irmen, Friederike; Huebl, Julius; Schroll, Henning; Brücke, Christof; Schneider, Gerd-Helge; Hamker, Fred H; Kühn, Andrea A

    2017-10-01

    The subthalamic nucleus (STN) occupies a strategic position in the motor network, slowing down responses in situations with conflicting perceptual input. Recent evidence suggests a role of the STN in emotion processing through strong connections with emotion recognition structures. As deep brain stimulation (DBS) of the STN in patients with Parkinson's disease (PD) inhibits monitoring of perceptual and value-based conflict, STN DBS may also interfere with emotional conflict processing. To assess a possible interference of STN DBS with emotional conflict processing, we used an emotional Stroop paradigm. Subjects categorized face stimuli according to their emotional expression while ignoring emotionally congruent or incongruent superimposed word labels. Eleven PD patients ON and OFF STN DBS and eleven age-matched healthy subjects performed the task. We found conflict-induced response slowing in healthy controls and PD patients OFF DBS, but not ON DBS, suggesting that STN DBS decreases adaptation to within-trial conflict. OFF DBS, patients showed more conflict-induced slowing for negative conflict stimuli, an effect that was diminished by STN DBS. Computational modelling of the STN's influence on conflict adaptation indicated that DBS interferes by increasing baseline activity. © The Author (2017). Published by Oxford University Press.

  18. Emotion recognition in body dysmorphic disorder: application of the Reading the Mind in the Eyes Task.

    Science.gov (United States)

    Buhlmann, Ulrike; Winter, Anna; Kathmann, Norbert

    2013-03-01

    Body dysmorphic disorder (BDD) is characterized by perceived appearance-related defects, often tied to aspects of the face or head (e.g., acne). Deficits in decoding emotional expressions have been examined in several psychological disorders including BDD. Previous research indicates that BDD is associated with impaired facial emotion recognition, particularly in situations that involve the BDD sufferer him/herself. The purpose of this study was to further evaluate the ability to read other people's emotions among 31 individuals with BDD, and 31 mentally healthy controls. We applied the Reading the Mind in the Eyes task, in which participants are presented with a series of pairs of eyes, one at a time, and are asked to identify the emotion that describes the stimulus best. The groups did not differ with respect to decoding other people's emotions by looking into their eyes. Findings are discussed in light of previous research examining emotion recognition in BDD. Copyright © 2013. Published by Elsevier Ltd.

  19. Neural correlates of emotional intelligence in a visual emotional oddball task: an ERP study.

    Science.gov (United States)

    Raz, Sivan; Dan, Orrie; Zysberg, Leehu

    2014-11-01

    The present study was aimed at identifying potential behavioral and neural correlates of Emotional Intelligence (EI) by using scalp-recorded Event-Related Potentials (ERPs). EI levels were defined according to both a self-report questionnaire and a performance-based ability test. We identified ERP correlates of emotional processing by using a visual-emotional oddball paradigm, in which subjects were confronted with one frequent standard stimulus (a neutral face) and two deviant stimuli (a happy and an angry face). The effects of these faces were then compared across groups with low and high EI levels. The ERP results indicate that participants with high EI exhibited significantly greater mean amplitudes of the P1, P2, N2, and P3 ERP components in response to emotional and neutral faces, at frontal, posterior-parietal and occipital scalp locations. P1, P2 and N2 are considered indexes of attention-related processes and have been associated with early attention to emotional stimuli. The later P3 component has been thought to reflect more elaborative, top-down, emotional information processing including emotional evaluation and memory encoding and formation. These results may suggest greater recruitment of resources to process all emotional and non-emotional faces at early and late processing stages among individuals with higher EI. The present study underscores the usefulness of ERP methodology as a sensitive measure for the study of emotional stimuli processing in the research field of EI. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Differences in Facial Emotion Recognition between First Episode Psychosis, Borderline Personality Disorder and Healthy Controls.

    Directory of Open Access Journals (Sweden)

    Ana Catalan

    Full Text Available Facial emotion recognition (FER) is essential to guide social functioning and behaviour in interpersonal communication. FER may be altered in severe mental illness, such as in psychosis and in borderline personality disorder patients. However, it is unclear if these FER alterations are specifically related to psychosis. Awareness of FER alterations may be useful in clinical settings to improve treatment strategies. The aim of our study was to examine FER in patients with severe mental disorder and its relation with psychotic symptomatology. Socio-demographic and clinical variables were collected. Alterations in emotion recognition were assessed in three groups: patients with first-episode psychosis (FEP) (n = 64), borderline personality disorder patients (BPD) (n = 37) and healthy controls (n = 137), using the Degraded Facial Affect Recognition Task. The Positive and Negative Syndrome Scale, the Structured Interview for Schizotypy Revised and the Community Assessment of Psychic Experiences scales were used to assess positive psychotic symptoms. WAIS III subtests were used to assess IQ. Kruskal-Wallis analysis showed a significant difference in FER of neutral faces between FEP patients, BPD patients and controls, and a difference between FEP patients and controls in angry face recognition. No significant differences were found between groups in the fear or happy conditions. There was a significant difference between groups in the attribution of negative emotion to happy faces: BPD and FEP groups had a much higher tendency to recognize happy faces as negative. There was no association with the different symptom domains in either group. FEP and BPD patients have problems in recognizing neutral faces more frequently than controls. Moreover, patients tend to over-report negative emotions in the recognition of happy faces. Although no relation between psychotic symptoms and FER alterations was found, these deficits could contribute to patients' misinterpretations in daily life.