WorldWideScience

Sample records for impaired emotion recognition

  1. Impaired emotion recognition in music in Parkinson's disease.

    Science.gov (United States)

    van Tricht, Mirjam J; Smeding, Harriet M M; Speelman, Johannes D; Schmand, Ben A

    2010-10-01

    Music has the potential to evoke strong emotions and plays a significant role in the lives of many people. Music might therefore be an ideal medium to assess emotion recognition. We investigated emotion recognition in music in 20 patients with idiopathic Parkinson's disease (PD) and 20 matched healthy volunteers. The role of cognitive dysfunction and other disease characteristics in emotion recognition was also evaluated. We used 32 musical excerpts that expressed happiness, sadness, fear or anger. PD patients were impaired in recognizing fear and anger in music. Fear recognition was associated with executive functions in PD patients and in healthy controls, but the emotion recognition impairments of PD patients persisted after adjusting for executive functioning. We found no differences in the recognition of happy or sad music. Emotion recognition was not related to depressive symptoms, disease duration or severity of motor symptoms. We conclude that PD patients are impaired in recognizing complex emotions in music. Although this impairment is related to executive dysfunction, our findings most likely reflect an additional primary deficit in emotional processing.

  2. Bihippocampal damage with emotional dysfunction: impaired auditory recognition of fear.

    Science.gov (United States)

    Ghika-Schmid, F; Ghika, J; Vuilleumier, P; Assal, G; Vuadens, P; Scherer, K; Maeder, P; Uske, A; Bogousslavsky, J

    1997-01-01

    A right-handed man developed a sudden, transient amnestic syndrome associated with bilateral hemorrhage of the hippocampi, probably due to Urbach-Wiethe disease. In the 3rd month, despite significant hippocampal structural damage on imaging, only a milder degree of retrograde and anterograde amnesia persisted on detailed neuropsychological examination. On systematic testing of recognition of facial and vocal expression of emotion, we found an impairment of the vocal perception of fear, but not of other emotions such as joy, sadness and anger. This selective impairment of fear perception was not present in the recognition of facial expression of emotion. Thus emotional perception varies according to the different aspects of emotions and the different modality of presentation (faces versus voices). This is consistent with the idea that there may be multiple emotion systems. The study of emotional perception in this unique case of bilateral involvement of the hippocampus suggests that this structure may play a critical role in the recognition of fear in vocal expression, possibly dissociated from that of other emotions and from that of fear in facial expression. In view of recent data suggesting that the amygdala plays a role in the recognition of fear in the auditory as well as the visual modality, this could suggest that the hippocampus may be part of the auditory pathway of fear recognition.

  3. Recovery from emotion recognition impairment after temporal lobectomy

    Directory of Open Access Journals (Sweden)

    Francesca Benuzzi

    2014-06-01

    Mesial temporal lobe epilepsy (MTLE) can be associated with emotion recognition impairment that can be particularly severe in patients with early onset seizures (1-3). Whereas there is growing evidence that memory and language can improve in seizure-free patients after anterior temporal lobectomy (ATL) (4), the effects of surgery on emotional processing are still unknown. We used functional magnetic resonance imaging (fMRI) to investigate short-term reorganization of networks engaged in facial emotion recognition in MTLE patients. Behavioral and fMRI data were collected from six patients before and after ATL. During the fMRI scan, patients were asked to make a gender decision on fearful and neutral faces. Behavioral data demonstrated that two patients with early-onset right MTLE were impaired in fear recognition, while fMRI results showed they lacked specific activations for fearful faces. Post-ATL behavioral data showed improved emotion recognition ability, while fMRI demonstrated the recruitment of a functional network for fearful face processing. Our results suggest that ATL elicited brain plasticity mechanisms allowing behavioral and fMRI improvement in emotion recognition.

  4. Degraded Impairment of Emotion Recognition in Parkinson's Disease Extends from Negative to Positive Emotions.

    Science.gov (United States)

    Lin, Chia-Yao; Tien, Yi-Min; Huang, Jong-Tsun; Tsai, Chon-Haw; Hsu, Li-Chuan

    2016-01-01

    Because of dopaminergic neurodegeneration, patients with Parkinson's disease (PD) show impairment in the recognition of negative facial expressions. In the present study, we aimed to determine whether PD patients with more advanced motor problems would show a much greater deficit in recognition of emotional facial expressions than a control group and whether impairment of emotion recognition would extend to positive emotions. Twenty-nine PD patients and 29 age-matched healthy controls were recruited. Participants were asked to discriminate emotions in Experiment 1 and identify gender in Experiment 2. In Experiment 1, PD patients demonstrated a recognition deficit for negative (sadness and anger) and positive faces. Further analysis showed that only PD patients with high motor dysfunction performed poorly in recognition of happy faces. In Experiment 2, PD patients showed an intact ability for gender identification, ruling out deficits in the functions measured in Experiment 2 as alternative explanations for the results of Experiment 1. We concluded that patients' ability to recognize emotions deteriorated as the disease progressed. Recognition of negative emotions was impaired first, and then the impairment extended to positive emotions.

  9. Facial Emotion Recognition Impairments are Associated with Brain Volume Abnormalities in Individuals with HIV

    Science.gov (United States)

    Clark, Uraina S.; Walker, Keenan A.; Cohen, Ronald A.; Devlin, Kathryn N.; Folkers, Anna M.; Pina, Mathew M.; Tashima, Karen T.

    2015-01-01

    Impaired facial emotion recognition abilities in HIV+ patients are well documented, but little is known about the neural etiology of these difficulties. We examined the relation of facial emotion recognition abilities to regional brain volumes in 44 HIV-positive (HIV+) and 44 HIV-negative control (HC) adults. Volumes of structures implicated in HIV-associated neuropathology and emotion recognition were measured on MRI using an automated segmentation tool. Relative to HC, HIV+ patients demonstrated emotion recognition impairments for fearful expressions, reduced anterior cingulate cortex (ACC) volumes, and increased amygdala volumes. In the HIV+ group, fear recognition impairments correlated significantly with ACC, but not amygdala volumes. ACC reductions were also associated with lower nadir CD4 levels (i.e., greater HIV-disease severity). These findings extend our understanding of the neurobiological substrates underlying an essential social function, facial emotion recognition, in HIV+ individuals and implicate HIV-related ACC atrophy in the impairment of these abilities. PMID:25744868

  10. Emotion-induced impairments in speeded word recognition tasks.

    Science.gov (United States)

    Zeelenberg, René; Bocanegra, Bruno R; Pecher, Diane

    2011-01-01

    Recent studies show that emotional stimuli impair the identification of subsequently presented, briefly flashed stimuli. In the present study, we investigated whether emotional distractors (primes) impaired target processing when presentation of the target stimulus was not impoverished. In lexical decision, animacy decision, rhyme decision, and nonword naming, targets were presented in such a manner that they were clearly visible (i.e., targets were not masked and presented until participants responded). In all tasks taboo-sexual distractors caused a slowdown in responding to the subsequent neutral target. Our results indicate that the detrimental effects of emotional distractors are not confined to paradigms in which visibility of the target is limited. Moreover, impairments were obtained even when semantic processing of stimuli was not required.

  11. Impaired emotion recognition is linked to alexithymia in heroin addicts

    Directory of Open Access Journals (Sweden)

    Giuseppe Craparo

    2016-04-01

    Several investigations document altered emotion processing in opiate addiction; nevertheless, the origin of this phenomenon remains unclear. Here we examined the role of alexithymia in the ability (i.e., accuracy, measured as number of errors, and reaction times, RTs) of thirty-one heroin addicts and thirty-one healthy controls to detect several affective expressions. Results show generally lower accuracy and higher RTs in the recognition of facial expressions of emotions for patients compared to controls. A hierarchical multivariate regression analysis shows that alexithymia might be responsible for the between-group difference in RTs for emotion detection. Overall, we provide new insights into the clinical interpretation of affective deficits in heroin addicts, suggesting a role of alexithymia in their ability to recognize emotions.

  12. Joint recognition-expression impairment of facial emotions in Huntington's disease despite intact understanding of feelings.

    Science.gov (United States)

    Trinkler, Iris; Cleret de Langavant, Laurent; Bachoud-Lévi, Anne-Catherine

    2013-02-01

    Patients with Huntington's disease (HD), a neurodegenerative disorder that causes major motor impairments, also show cognitive and emotional deficits. While their deficit in recognising emotions has been explored in depth, little is known about their ability to express emotions and understand their feelings. If these faculties were impaired, patients might not only mis-read emotion expressions in others, but their own emotions might also be mis-interpreted by others, or they might have difficulties understanding and describing their feelings. We compared the performance of recognition and expression of facial emotions in 13 HD patients with mild motor impairments but without significant bucco-facial abnormalities, and 13 controls matched for age and education. Emotion recognition was investigated in a forced-choice recognition test (FCR), and emotion expression by filming participants while they mimed the six basic emotional facial expressions (anger, disgust, fear, surprise, sadness and joy) to the experimenter. The films were then segmented into 60 stimuli per participant and four external raters performed a FCR on this material. Further, we tested understanding of feelings in self (alexithymia) and others (empathy) using questionnaires. Both recognition and expression were impaired across different emotions in HD compared to controls, and recognition and expression scores were correlated. By contrast, alexithymia and empathy scores were very similar in HD and controls. This suggests that emotion deficits in HD may be tied to the expression itself. Because similar emotion recognition-expression deficits are also found in Parkinson's disease and vascular lesions of the striatum, our results further confirm the importance of the striatum for emotion recognition and expression, while access to the meaning of feelings relies on a different brain network and is spared in HD.

  13. Common impairments of emotional facial expression recognition in schizophrenia across French and Japanese cultures

    Directory of Open Access Journals (Sweden)

    Takashi Okada

    2015-07-01

    To address whether the recognition of emotional facial expressions is impaired in schizophrenia across different cultures, patients with schizophrenia and age-matched normal controls in France and Japan were tested with a labeling task of emotional facial expressions and a matching task of unfamiliar faces. Schizophrenia patients in both France and Japan were less accurate in labeling fearful facial expressions. There was no correlation between the scores of facial emotion labeling and face matching. These results suggest that the impaired recognition of emotional facial expressions in schizophrenia is common across different cultures.

  14. Emotion recognition impairment in traumatic brain injury compared with schizophrenia spectrum: similar deficits with different origins.

    Science.gov (United States)

    Mancuso, Mauro; Magnani, Nadia; Cantagallo, Anna; Rossi, Giulia; Capitani, Donatella; Galletti, Vania; Cardamone, Giuseppe; Robertson, Ian Hamilton

    2015-02-01

    The aim of our study was to identify the common and separate mechanisms that might underpin emotion recognition impairment in patients with traumatic brain injury (TBI) and schizophrenia (Sz) compared with healthy controls (HCs). We recruited 21 Sz outpatients, 24 severe TBI outpatients, and 38 HCs, and we used eye-tracking to compare facial emotion processing performance. Both Sz and TBI patients were significantly poorer at recognizing facial emotions than HCs. Sz patients explored the Pictures of Facial Affect stimuli differently and were significantly worse at recognizing neutral expressions. Selective or sustained attention deficits in TBI may reduce efficient emotion recognition, whereas in Sz a more strategic deficit underlies the observed problem. These findings suggest scope for rehabilitative training focused on emotion recognition, adjusted to each group's underlying deficit.

  15. Impairment in the recognition of emotion across different media following traumatic brain injury.

    Science.gov (United States)

    Williams, Claire; Wood, Rodger Ll

    2010-02-01

    The current study examined emotion recognition following traumatic brain injury (TBI) and examined whether performance differed according to the affective valence and type of media presentation of the stimuli. A total of 64 patients with TBI and matched controls completed the Emotion Evaluation Test (EET) and Ekman 60 Faces Test (E-60-FT). Patients with TBI also completed measures of information processing and verbal ability. Results revealed that the TBI group were significantly impaired compared to controls when recognizing emotion on the EET and E-60-FT. A significant main effect of valence was found in both groups, with poor recognition of negative emotions. However, the difference between the recognition of positive and negative emotions was larger in the TBI group. The TBI group were also more accurate recognizing emotion displayed in audiovisual media (EET) than that displayed in still media (E-60-FT). No significant relationship was obtained between emotion recognition tasks and information-processing speed. A significant positive relationship was found between the E-60-FT and one measure of verbal ability. These findings support models of emotion that specify separate neurological pathways for certain emotions and different media and confirm that patients with TBI are vulnerable to experiencing emotion recognition difficulties.

  16. Emotion recognition in mild cognitive impairment: relationship to psychosocial disability and caregiver burden.

    Science.gov (United States)

    McCade, Donna; Savage, Greg; Guastella, Adam; Hickie, Ian B; Lewis, Simon J G; Naismith, Sharon L

    2013-09-01

    Impaired emotion recognition in dementia is associated with increased patient agitation, behavior management difficulties, and caregiver burden. Emerging evidence supports the presence of very early emotion recognition difficulties in mild cognitive impairment (MCI); however, the relationship between these impairments and psychosocial measures has not yet been explored. Emotion recognition abilities of 27 patients with nonamnestic MCI (naMCI), 29 patients with amnestic MCI (aMCI), and 22 control participants were assessed. Self-report measures assessed patient functional disability, while informants rated the degree of burden they experienced. Difficulties in recognizing anger were evident in the amnestic subtype. Although both patient groups reported greater social functioning disability compared with controls, a relationship between social dysfunction and anger recognition was evident only for patients with naMCI. A significant association was found between burden and anger recognition in patients with aMCI. Impaired emotion recognition abilities impact MCI subtypes differentially. Interventions targeted at patients with MCI and their caregivers are warranted.

  17. [Measuring impairment of facial affect recognition in schizophrenia. Preliminary study of the facial emotions recognition task (TREF)].

    Science.gov (United States)

    Gaudelus, B; Virgile, J; Peyroux, E; Leleu, A; Baudouin, J-Y; Franck, N

    2015-06-01

    The impairment of social cognition, including facial affect recognition, is a well-established trait in schizophrenia, and specific cognitive remediation programs focusing on facial affect recognition have been developed by different teams worldwide. However, even though social cognitive impairments have been confirmed, previous studies have also shown heterogeneity of results between subjects. Therefore, individual abilities should be assessed before proposing such programs. Most research teams apply tasks based on the facial affect sets of Ekman et al. or Gur et al. However, these tasks are not easily applicable in routine clinical practice. Here, we present the Facial Emotions Recognition Test (TREF), which is designed to identify facial affect recognition impairments in clinical practice. The test is composed of 54 photos and evaluates abilities in the recognition of six universal emotions (joy, anger, sadness, fear, disgust and contempt). Each of these emotions is represented in colored photos of 4 different models (two men and two women) at nine intensity levels from 20 to 100%. Each photo is presented for 10 seconds; no time limit for responding is applied. The present study compared scores on the TREF in a sample of healthy controls (64 subjects) and people with stabilized schizophrenia (45 subjects) diagnosed according to DSM-IV-TR criteria. We analysed global scores for all emotions, as well as subscores for each emotion, between these two groups, taking gender differences into account. Our results were coherent with previous findings. Applying the TREF, we confirmed an impairment in facial affect recognition in schizophrenia by showing significant differences between the two groups in global results (76.45% for healthy controls versus 61.28% for people with schizophrenia), as well as in subscores for each emotion except joy. Scores for women were significantly higher than for men in the population

  18. Neuroanatomical correlates of impaired decision-making and facial emotion recognition in early Parkinson's disease.

    Science.gov (United States)

    Ibarretxe-Bilbao, Naroa; Junque, Carme; Tolosa, Eduardo; Marti, Maria-Jose; Valldeoriola, Francesc; Bargallo, Nuria; Zarei, Mojtaba

    2009-09-01

    Decision-making and recognition of emotions are often impaired in patients with Parkinson's disease (PD). The orbitofrontal cortex (OFC) and the amygdala are critical structures subserving these functions. This study was designed to test whether there are any structural changes in these areas that might explain the impairment of decision-making and recognition of facial emotions in early PD. We used the Iowa Gambling Task (IGT) and the Ekman 60 faces test, which are sensitive to OFC and amygdala dysfunction, in 24 early PD patients and 24 controls. High-resolution structural magnetic resonance images (MRI) were also obtained. Group analysis using voxel-based morphometry (VBM) showed significant, corrected gray matter differences. These findings indicate that (i) impairment of decision-making and recognition of facial emotions occurs at the early stages of PD, (ii) these neuropsychological deficits are accompanied by degeneration of OFC and amygdala, and (iii) bilateral OFC reductions are associated with impaired recognition of emotions, and GM volume loss in left lateral OFC is related to decision-making impairment in PD.

  19. Emotion recognition impairment and apathy after subthalamic nucleus stimulation in Parkinson's disease have separate neural substrates.

    Science.gov (United States)

    Drapier, D; Péron, J; Leray, E; Sauleau, P; Biseul, I; Drapier, S; Le Jeune, F; Travers, D; Bourguignon, A; Haegelen, C; Millet, B; Vérin, M

    2008-09-01

    To test the hypothesis that emotion recognition and apathy share the same functional circuit involving the subthalamic nucleus (STN), a consecutive series of 17 patients with advanced Parkinson's disease (PD) was assessed 3 months before (M-3) and 3 months after (M+3) STN deep brain stimulation (DBS). Mean (+/-S.D.) age at surgery was 56.9 (8.7) years. Mean disease duration at surgery was 11.8 (2.6) years. Apathy was measured using the Apathy Evaluation Scale (AES) at both M-3 and M+3. Patients were also assessed using a computerised paradigm of facial emotion recognition [Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto: Consulting Psychologist Press] before and after STN DBS. Prior to this, the Benton Facial Recognition Test was used to check that the ability to perceive faces was intact. Apathy had significantly worsened at M+3 (42.5+/-8.9, p=0.006) relative to the preoperative assessment (37.2+/-5.5). There was also a significant reduction in recognition percentages for facial expressions of fear (43.1%+/-22.9 vs. 61.6%+/-21.4, p=0.022) and sadness (52.7%+/-19.1 vs. 67.6%+/-22.8, p=0.031) after STN DBS. However, the postoperative worsening of apathy and emotion recognition impairment were not correlated. Our results confirm that the STN is involved in both the apathy and emotion recognition networks. However, the absence of any correlation between apathy and emotion recognition impairment suggests that the worsening of apathy following surgery could not be explained by a lack of facial emotion recognition, and that its behavioural and cognitive components should therefore also be taken into consideration.

  20. Does aging impair first impression accuracy? Differentiating emotion recognition from complex social inferences.

    Science.gov (United States)

    Krendl, Anne C; Rule, Nicholas O; Ambady, Nalini

    2014-09-01

    Young adults can be surprisingly accurate at making inferences about people from their faces. Although these first impressions have important consequences for both the perceiver and the target, it remains an open question whether first impression accuracy is preserved with age. Specifically, could age differences in impressions toward others stem from age-related deficits in accurately detecting complex social cues? Research on aging and impression formation suggests that young and older adults show relative consensus in their first impressions, but it is unknown whether they differ in accuracy. It has been widely shown that aging disrupts emotion recognition accuracy, and that these impairments may predict deficits in other social judgments, such as detecting deceit. However, it is unclear whether general impression formation accuracy (e.g., emotion recognition accuracy, detecting complex social cues) relies on similar or distinct mechanisms. It is important to examine this question to evaluate how, if at all, aging might affect overall accuracy. Here, we examined whether aging impaired first impression accuracy in predicting real-world outcomes and categorizing social group membership. Specifically, we studied whether emotion recognition accuracy and age-related cognitive decline (which has been implicated in exacerbating deficits in emotion recognition) predict first impression accuracy. Our results revealed that emotion recognition accuracy did not predict first impression accuracy, nor did age-related cognitive decline impair it. These findings suggest that domains of social perception outside of emotion recognition may rely on mechanisms that are relatively unimpaired by aging.

  1. Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions.

    Science.gov (United States)

    Oberman, Lindsay M; Winkielman, Piotr; Ramachandran, Vilayanur S

    2007-01-01

    People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations: (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated the assessed muscles, whereas the chew manipulation activated them only intermittently. Further, expressing happiness generated the most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.

  2. Impaired recognition of facial emotions from low-spatial frequencies in Asperger syndrome.

    Science.gov (United States)

    Kätsyri, Jari; Saalasti, Satu; Tiippana, Kaisa; von Wendt, Lennart; Sams, Mikko

    2008-01-01

    The theory of 'weak central coherence' [Happe, F., & Frith, U. (2006). The weak coherence account: Detail-focused cognitive style in autism spectrum disorders. Journal of Autism and Developmental Disorders, 36(1), 5-25] implies that persons with autism spectrum disorders (ASDs) have a perceptual bias for local but not for global stimulus features. The recognition of emotional facial expressions representing various levels of detail has not been studied previously in ASDs. We analyzed the recognition of four basic emotional facial expressions (anger, disgust, fear and happiness) from low spatial frequencies (overall global shapes without local features) in adults with an ASD. A group of 20 participants with Asperger syndrome (AS) was compared to a group of non-autistic age- and sex-matched controls. Emotion recognition was tested from static and dynamic facial expressions whose spatial frequency contents had been manipulated by low-pass filtering at two levels. The two groups recognized emotions similarly from non-filtered faces and from dynamic vs. static facial expressions. In contrast, the participants with AS were less accurate than controls in recognizing facial emotions from very low spatial frequencies. The results suggest intact recognition of basic facial emotions and dynamic facial information, but impaired visual processing of global features in ASDs.

  3. Emotional face recognition deficit in amnestic patients with mild cognitive impairment: behavioral and electrophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yang L

    2015-08-01

    Linlin Yang, Xiaochuan Zhao, Lan Wang, Lulu Yu, Mei Song, Xueyi Wang (Department of Mental Health, The First Hospital of Hebei Medical University, Hebei Medical University Institute of Mental Health, Shijiazhuang, People's Republic of China). Abstract: Amnestic mild cognitive impairment (MCI) has been conceptualized as a transitional stage between healthy aging and Alzheimer's disease. Thus, understanding emotional face recognition deficit in patients with amnestic MCI could be useful in determining progression of amnestic MCI. The purpose of this study was to investigate the features of emotional face processing in amnestic MCI by using event-related potentials (ERPs). Patients with amnestic MCI and healthy controls performed a face recognition task, giving old/new responses to previously studied and novel faces with different emotional messages as the stimulus material. Using the learning-recognition paradigm, the experiments were divided into two steps, i.e., a learning phase and a test phase. ERPs were analyzed on electroencephalographic recordings. The behavior data indicated high emotion classification accuracy for patients with amnestic MCI and for healthy controls. The mean percentage of correct classifications was 81.19% for patients with amnestic MCI and 96.46% for controls. Our ERP data suggest that patients with amnestic MCI were still able to undertake personalizing processing for negative faces, but not for neutral or positive faces, in the early frontal processing stage. In the early time window, no differences in frontal old/new effect were found between patients with amnestic MCI and normal controls. However, in the late time window, the three types of stimuli did not elicit any old/new parietal effects in patients with amnestic MCI, suggesting their recollection was impaired. This impairment may be closely associated with amnestic MCI disease. We conclude from our data that face recognition processing and emotional memory is

  4. Facial Emotion Recognition Impairment in Patients with Parkinson's Disease and Isolated Apathy

    Directory of Open Access Journals (Sweden)

    Mercè Martínez-Corral

    2010-01-01

    Full Text Available Apathy is a frequent feature of Parkinson's disease (PD), usually related with executive dysfunction. However, in a subgroup of PD patients apathy may represent the only or predominant neuropsychiatric feature. To understand the mechanisms underlying apathy in PD, we investigated emotional processing in PD patients with and without apathy and in healthy controls (HC), assessed by a facial emotion recognition task (FERT). We excluded PD patients with cognitive impairment, depression, other affective disturbances and previous surgery for PD. PD patients with apathy scored significantly worse in the FERT, performing worse in fear, anger, and sadness recognition. No differences, however, were found between nonapathetic PD patients and HC. These findings suggest the existence of a disruption of emotional-affective processing in cognitively preserved PD patients with apathy. Identifying specific dysfunction of limbic structures in PD patients with isolated apathy may have therapeutic and prognostic implications.

  5. LSD Acutely Impairs Fear Recognition and Enhances Emotional Empathy and Sociality.

    Science.gov (United States)

    Dolder, Patrick C; Schmid, Yasmin; Müller, Felix; Borgwardt, Stefan; Liechti, Matthias E

    2016-10-01

    Lysergic acid diethylamide (LSD) is used recreationally and has been evaluated as an adjunct to psychotherapy to treat anxiety in patients with life-threatening illness. LSD is well known to induce perceptual alterations, but it is unknown whether LSD alters emotional processing in ways that can support psychotherapy. We investigated the acute effects of LSD on emotional processing using the Face Emotion Recognition Task (FERT) and Multifaceted Empathy Test (MET). The effects of LSD on social behavior were tested using the Social Value Orientation (SVO) test. Two similar placebo-controlled, double-blind, random-order, crossover studies were conducted using 100 μg LSD in 24 subjects and 200 μg LSD in 16 subjects. All of the subjects were healthy and mostly hallucinogen-naive 25- to 65-year-old volunteers (20 men, 20 women). LSD produced feelings of happiness, trust, closeness to others, enhanced explicit and implicit emotional empathy on the MET, and impaired the recognition of sad and fearful faces on the FERT. LSD enhanced the participants' desire to be with other people and increased their prosocial behavior on the SVO test. These effects of LSD on emotion processing and sociality may be useful for LSD-assisted psychotherapy.

  6. Scanning patterns of faces do not explain impaired emotion recognition in Huntington Disease: Evidence for a high level mechanism

    Directory of Open Access Journals (Sweden)

    Marieke van Asselen

    2012-02-01

    Full Text Available Previous studies in patients with amygdala lesions suggested that deficits in emotion recognition might be mediated by impaired scanning patterns of faces. Here we investigated whether scanning patterns also contribute to the selective impairment in recognition of disgust in Huntington disease (HD). To achieve this goal, we recorded eye movements during a two-alternative forced choice emotion recognition task. HD patients in presymptomatic (n=16) and symptomatic (n=9) disease stages were tested and their performance was compared to a control group (n=22). In our emotion recognition task, participants had to indicate whether a face reflected one of six basic emotions. In addition, and in order to define whether emotion recognition was altered when the participants were forced to look at a specific component of the face, we used a second task where only limited facial information was provided (eyes/mouth in partially masked faces). Behavioural results showed no differences in the ability to recognize emotions between presymptomatic gene carriers and controls. However, an emotion recognition deficit was found for all 6 basic emotion categories in early stage HD. Analysis of eye movement patterns showed that patients and controls used similar scanning strategies. Patterns of deficits were similar regardless of whether parts of the faces were masked or not, thereby confirming that selective attention to particular face parts is not underlying the deficits. These results suggest that the emotion recognition deficits in symptomatic HD patients cannot be explained by impaired scanning patterns of faces. Furthermore, no selective deficit for recognition of disgust was found in presymptomatic HD patients.

  7. Deficits in facial emotion recognition indicate behavioral changes and impaired self-awareness after moderate to severe traumatic brain injury.

    Science.gov (United States)

    Spikman, Jacoba M; Milders, Maarten V; Visser-Keizer, Annemarie C; Westerhof-Evers, Herma J; Herben-Dekker, Meike; van der Naalt, Joukje

    2013-01-01

    Traumatic brain injury (TBI) is a leading cause of disability, specifically among younger adults. Behavioral changes are common after moderate to severe TBI and have adverse consequences for social and vocational functioning. It is hypothesized that deficits in social cognition, including facial affect recognition, might underlie these behavioral changes. Measurement of behavioral deficits is complicated, because the rating scales used rely on subjective judgement, often lack specificity and many patients provide unrealistically positive reports of their functioning due to impaired self-awareness. Accordingly, it is important to find performance-based tests that allow objective and early identification of these problems. In the present study 51 moderate to severe TBI patients in the sub-acute and chronic stage were assessed with a test for emotion recognition (FEEST) and a questionnaire for behavioral problems (DEX) with a self and proxy rated version. Patients performed worse on the total score and on the negative emotion subscores of the FEEST than a matched group of 31 healthy controls. Patients also exhibited significantly more behavioral problems on both the DEX self and proxy rated version, but proxy ratings revealed more severe problems. No significant correlation was found between FEEST scores and DEX self ratings. However, impaired emotion recognition in the patients, and in particular of Sadness and Anger, was significantly correlated with behavioral problems as rated by proxies and with impaired self-awareness. This is the first study to find these associations, strengthening the proposed recognition of social signals as a condition for adequate social functioning. Hence, deficits in emotion recognition can be conceived as markers for behavioral problems and lack of insight in TBI patients. This finding is also of clinical importance since, unlike behavioral problems, emotion recognition can be objectively measured early after injury, allowing for early

  9. Facing the Problem: Impaired Emotion Recognition During Multimodal Social Information Processing in Borderline Personality Disorder.

    Science.gov (United States)

    Niedtfeld, Inga; Defiebre, Nadine; Regenbogen, Christina; Mier, Daniela; Fenske, Sabrina; Kirsch, Peter; Lis, Stefanie; Schmahl, Christian

    2017-04-01

    Previous research has revealed alterations and deficits in facial emotion recognition in patients with borderline personality disorder (BPD). During interpersonal communication in daily life, social signals such as speech content, variation in prosody, and facial expression need to be considered simultaneously. We hypothesized that deficits in higher level integration of social stimuli contribute to difficulties in emotion recognition in BPD, and heightened arousal might explain this effect. Thirty-one patients with BPD and thirty-one healthy controls were asked to identify emotions in short video clips, which were designed to represent different combinations of the three communication channels: facial expression, speech content, and prosody. Skin conductance was recorded as a measure of sympathetic arousal, while controlling for state dissociation. Patients with BPD showed lower mean accuracy scores than healthy control subjects in all conditions comprising emotional facial expressions. This was true for the condition with facial expression only, and for the combination of all three communication channels. Electrodermal responses were enhanced in BPD only in response to auditory stimuli. In line with the major body of facial emotion recognition studies, we conclude that deficits in the interpretation of facial expressions lead to the difficulties observed in multimodal emotion processing in BPD.

  10. In the face of threat: neural and endocrine correlates of impaired facial emotion recognition in cocaine dependence.

    Science.gov (United States)

    Ersche, K D; Hagan, C C; Smith, D G; Jones, P S; Calder, A J; Williams, G B

    2015-05-26

    The ability to recognize facial expressions of emotion in others is a cornerstone of human interaction. Selective impairments in the recognition of facial expressions of fear have frequently been reported in chronic cocaine users, but the nature of these impairments remains poorly understood. We used the multivariate method of partial least squares and structural magnetic resonance imaging to identify gray matter brain networks that underlie facial affect processing in both cocaine-dependent (n = 29) and healthy male volunteers (n = 29). We hypothesized that disruptions in neuroendocrine function in cocaine-dependent individuals would explain their impairments in fear recognition by modulating the relationship with the underlying gray matter networks. We found that cocaine-dependent individuals not only exhibited significant impairments in the recognition of fear, but also for facial expressions of anger. Although recognition accuracy of threatening expressions co-varied in all participants with distinctive gray matter networks implicated in fear and anger processing, in cocaine users it was less well predicted by these networks than in controls. The weaker brain-behavior relationships for threat processing were also mediated by distinctly different factors. Fear recognition impairments were influenced by variations in intelligence levels, whereas anger recognition impairments were associated with comorbid opiate dependence and related reduction in testosterone levels. We also observed an inverse relationship between testosterone levels and the duration of crack and opiate use. Our data provide novel insight into the neurobiological basis of abnormal threat processing in cocaine dependence, which may shed light on new opportunities facilitating the psychosocial integration of these patients.
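The partial least squares analysis mentioned above relates a gray matter pattern to a behavioral score by finding weights that maximize their covariance. As a rough single-component illustration of the idea (not the authors' pipeline; all data below are invented), the first PLS weight vector is simply proportional to XᵀY after centering:

```python
def centre(v):
    """Subtract the mean from a list of numbers."""
    m = sum(v) / len(v)
    return [x - m for x in v]

def pearson(a, b):
    """Pearson correlation between two equal-length lists."""
    ac, bc = centre(a), centre(b)
    num = sum(x * y for x, y in zip(ac, bc))
    den = (sum(x * x for x in ac) * sum(y * y for y in bc)) ** 0.5
    return num / den

def pls_first_component(X, y):
    """First PLS component: weights w ∝ Xᵀy (covariance with behavior),
    per-subject scores t = Xw.  X is a list of rows (subjects x voxels)."""
    n, p = len(X), len(X[0])
    cols = [centre([X[i][j] for i in range(n)]) for j in range(p)]
    yc = centre(y)
    w = [sum(c[i] * yc[i] for i in range(n)) for c in cols]
    norm = sum(wi * wi for wi in w) ** 0.5
    w = [wi / norm for wi in w]
    t = [sum(cols[j][i] * w[j] for j in range(p)) for i in range(n)]
    return w, t

# Toy data: 6 "subjects" x 4 "voxels"; the first two voxels carry the
# behavior-related signal, the last two carry the opposite pattern.
X = [
    [1.0, 0.9, 0.1, 0.0],
    [0.8, 1.1, 0.0, 0.2],
    [0.2, 0.1, 1.0, 0.9],
    [0.1, 0.3, 0.8, 1.0],
    [0.9, 1.0, 0.2, 0.1],
    [0.0, 0.2, 0.9, 1.1],
]
y = [1.0, 1.0, -1.0, -1.0, 1.0, -1.0]  # centered "fear recognition accuracy"

w, t = pls_first_component(X, y)
r = pearson(t, y)  # how well the latent brain score tracks behavior
```

A weaker brain-behavior relationship, as the study reports for cocaine users, would correspond here to a lower correlation between the latent scores t and the behavioral measure y.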

  11. Impaired recognition of social emotion in patients with complex regional pain syndrome.

    Science.gov (United States)

    Shin, Na Young; Kang, Do-Hyung; Jang, Joon Hwan; Park, Soo Young; Hwang, Jae Yeon; Kim, Sung Nyun; Byun, Min Soo; Park, Hye Youn; Kim, Yong Chul

    2013-11-01

    Multiple brain areas involved in nociceptive, autonomic, and social-emotional processing are disproportionately changed in patients with complex regional pain syndrome (CRPS). Little empirical evidence is available involving social cognitive functioning in patients with chronic pain conditions. We investigated the ability of patients with CRPS to recognize the mental/emotional states of other people. Forty-three patients with CRPS and 30 healthy controls performed the Reading the Mind in the Eyes Test, which consists of photos in which human eyes express various emotional and mental states. Neuropsychological tests, including the Wisconsin Card Sorting Test, the stop-signal test, and the reaction time test, were administered to evaluate other cognitive functions. Patients with CRPS were significantly less accurate at recognizing emotional states in other persons, but not on other cognitive tests, compared with control subjects. We found a significant association between the deficit in social-emotion recognition and the affective dimension of pain, whereas this deficit was not related to the sensory dimension of pain. Our findings suggest a disrupted ability to recognize others' mental/emotional states in patients with CRPS. This article demonstrated a deficit in inferring mental/emotional states of others in patients with CRPS that was related to pain affect. Our study suggests that additional interventions directed toward reducing distressful affective pain may be helpful to restore social cognitive processing in patients with CRPS. Copyright © 2013 American Pain Society. Published by Elsevier Inc. All rights reserved.

  12. Recognition memory of neutral words can be impaired by task-irrelevant emotional encoding contexts: behavioral and electrophysiological evidence.

    Science.gov (United States)

    Zhang, Qin; Liu, Xuan; An, Wei; Yang, Yang; Wang, Yinan

    2015-01-01

    Previous studies on the effects of emotional context on memory for centrally presented neutral items have obtained inconsistent results. In most of those studies, subjects were asked either to make a connection between the item and the context at study or to retrieve both the item and the context. When no response for the contexts is required, how emotional contexts influence memory for neutral items is still unclear. Thus, the present study attempted to investigate the influences of four types of emotional picture contexts on recognition memory of neutral words using both behavioral and event-related potential (ERP) measurements. During study, words were superimposed centrally onto emotional contexts, and subjects were asked to just remember the words. During test, both studied and new words were presented without the emotional contexts and subjects had to make "old/new" judgments for those words. The results revealed that, compared with the neutral context, the negative contexts and positive high-arousing context impaired recognition of words. ERP results at encoding demonstrated that, compared with items presented in the neutral context, items in the positive and negative high-arousing contexts elicited more positive ERPs, which probably reflects an automatic process of attention capturing of high-arousing context as well as a conscious and effortful process of overcoming the interference of high-arousing context. During retrieval, significant FN400 old/new effects occurred in conditions of the negative low-arousing, positive, and neutral contexts but not in the negative high-arousing condition. Significant LPC old/new effects occurred in all conditions of context. However, the LPC old/new effect in the negative high-arousing condition was smaller than that in the positive high-arousing and low-arousing conditions. These results suggest that emotional context might influence both the familiarity and recollection processes.

  13. Evaluating music emotion recognition

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2013-01-01

    A fundamental problem with nearly all work in music genre recognition (MGR) is that evaluation lacks validity with respect to the principal goals of MGR. This problem also occurs in the evaluation of music emotion recognition (MER). Standard approaches to evaluation, though easy to implement, do not reliably differentiate between recognizing genre or emotion from music, or by virtue of confounding factors in signals (e.g., equalization). We demonstrate such problems for evaluating an MER system, and conclude with recommendations.

  14. Behavioral and Neuroimaging Evidence for Facial Emotion Recognition in Elderly Korean Adults with Mild Cognitive Impairment, Alzheimer's Disease, and Frontotemporal Dementia.

    Science.gov (United States)

    Park, Soowon; Kim, Taehoon; Shin, Seong A; Kim, Yu Kyeong; Sohn, Bo Kyung; Park, Hyeon-Ju; Youn, Jung-Hae; Lee, Jun-Young

    2017-01-01

    Background: Facial emotion recognition (FER) is impaired in individuals with frontotemporal dementia (FTD) and Alzheimer's disease (AD) when compared to healthy older adults. Since deficits in emotion recognition are closely related to caregiver burden or social interactions, researchers have fundamental interest in FER performance in patients with dementia. Purpose: The purpose of this study was to identify the performance profiles of six facial emotions (i.e., fear, anger, disgust, sadness, surprise, and happiness) and neutral faces measured among Korean healthy controls (HCs), and those with mild cognitive impairment (MCI), AD, and FTD. Additionally, the neuroanatomical correlates of facial emotions were investigated. Methods: A total of 110 (33 HC, 32 MCI, 32 AD, 13 FTD) older adult participants were recruited from two different medical centers in metropolitan areas of South Korea. These individuals underwent an FER test that was used to assess the recognition of emotions or absence of emotion (neutral) in 35 facial stimuli. Repeated measures two-way analyses of variance were used to examine the distinct profiles of emotional recognition among the four groups. We also performed brain imaging and voxel-based morphometry (VBM) on the participants to examine the associations between FER scores and gray matter volume. Results: The mean score of negative emotion recognition (i.e., fear, anger, disgust, and sadness) clearly discriminated FTD participants from individuals with MCI and AD and HC [F(3,106) = 10.829, p < 0.001, η² = 0.235], whereas the mean score of positive emotion recognition (i.e., surprise and happiness) did not. A VBM analysis showed negative emotions were correlated with gray matter volume of anterior temporal regions, whereas positive emotions were related to gray matter volume of fronto-parietal regions. Conclusion: Impairment of negative FER in patients with FTD is cross-cultural. The discrete neural correlates of FER indicate that emotional
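The group comparison above reports an F statistic with the effect size η² (the share of total variance explained by group membership). As a hedged illustration of how these quantities relate, here is a simpler one-way ANOVA on invented scores, not the study's repeated-measures design:

```python
def one_way_anova(groups):
    """F statistic and eta-squared (SS_between / SS_total) for k groups,
    each given as a list of scores."""
    all_scores = [x for g in groups for x in g]
    grand = sum(all_scores) / len(all_scores)
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    df_b = len(groups) - 1                 # between-groups degrees of freedom
    df_w = len(all_scores) - len(groups)   # within-groups degrees of freedom
    F = (ss_between / df_b) / (ss_within / df_w)
    eta_sq = ss_between / (ss_between + ss_within)
    return F, eta_sq

# Hypothetical negative-emotion recognition scores for four groups
# (HC, MCI, AD, FTD); the FTD group is markedly lower, as in the study.
scores = [
    [8.0, 7.5, 8.5, 7.0],   # HC
    [7.0, 6.5, 7.5, 7.0],   # MCI
    [6.5, 6.0, 7.0, 6.5],   # AD
    [4.0, 4.5, 3.5, 4.0],   # FTD
]
F, eta_sq = one_way_anova(scores)
```

With the FTD scores well below the other three groups, most of the variance is between groups, so η² is large and F is far above its null expectation of about 1.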

  17. Stereotype Associations and Emotion Recognition

    NARCIS (Netherlands)

    Bijlstra, Gijsbert; Holland, Rob W.; Dotsch, Ron; Hugenberg, Kurt; Wigboldus, Daniel H. J.

    We investigated whether stereotype associations between specific emotional expressions and social categories underlie stereotypic emotion recognition biases. Across two studies, we replicated previously documented stereotype biases in emotion recognition using both dynamic (Study 1) and static

  18. Impaired recognition and expression of emotional prosody in schizophrenia : Review and meta-analysis

    NARCIS (Netherlands)

    Hoekert, Marjolijn; Kahn, Rene S.; Pijnenborg, Marieke; Aleman, Andre

    Background: Deficits in emotion processing may be one of the most pervasive disturbances in schizophrenia that may contribute to social isolation. In this report we focus on vocal emotion processing. This function bears upon two cornerstones of social functioning, language and emotion, which have

  19. Impaired Recognition of Facially Expressed Emotions in Different Groups of Patients with Sleep Disorders.

    Science.gov (United States)

    Crönlein, Tatjana; Langguth, Berthold; Eichhammer, Peter; Busch, Volker

    2016-01-01

    Recently it has been shown that acute sleep loss has a direct impact on emotional processing in healthy individuals. Here we studied the effect of chronically disturbed sleep on emotional processing by investigating two samples of patients with sleep disorders. 25 patients with psychophysiologic insomnia (23 women and 2 men, mean age: 51.6; SD 10.9 years), 19 patients with sleep apnea syndrome (4 women and 15 men, mean age: 51.9; SD 11.1) and a control sample of 24 subjects with normal sleep (15 women and 9 men, mean age 45.3; SD 8.8) completed a Facial Expressed Emotion Labelling (FEEL) task, requiring participants to categorize and rate the intensity of six emotional expression categories: anger, anxiety, fear, happiness, disgust and sadness. Differences in FEEL score and its subscales among the three samples were analysed using ANOVA with gender as a covariate. Both patients with psychophysiologic insomnia and patients with sleep apnea showed significantly lower performance in the FEEL test as compared to the control group. Differences were seen in the scales happiness and sadness. Patient groups did not differ from each other. By demonstrating that previously known effects of acute sleep deprivation on emotional processing can be extended to persons experiencing chronically disturbed sleep, our data contribute to a deeper understanding of the relationship between sleep loss and emotions.

  1. Embodied emotion impairment in Huntington's Disease.

    Science.gov (United States)

    Trinkler, Iris; Devignevielle, Sévérine; Achaibou, Amal; Ligneul, Romain V; Brugières, Pierre; Cleret de Langavant, Laurent; De Gelder, Beatrice; Scahill, Rachael; Schwartz, Sophie; Bachoud-Lévi, Anne-Catherine

    2017-07-01

    Theories of embodied cognition suggest that perceiving an emotion involves somatovisceral and motoric re-experiencing. Here we suggest taking such an embodied stance when looking at emotion processing deficits in patients with Huntington's Disease (HD), a neurodegenerative motor disorder. The literature on these patients' emotion recognition deficit has recently been enriched by some reports of impaired emotion expression. The goal of the study was to find out if expression deficits might be linked to a more motoric level of impairment. We used electromyography (EMG) to compare voluntary emotion expression from words to emotion imitation from static face images, and spontaneous emotion mimicry in 28 HD patients and 24 matched controls. For the latter two imitation conditions, an underlying emotion understanding is not imperative (even though performance might be helped by it). EMG measures were compared to emotion recognition and to the capacity to identify and describe emotions using alexithymia questionnaires. Alexithymia questionnaires tap into the more somato-visceral or interoceptive aspects of emotion perception. Furthermore, we correlated patients' expression and recognition scores to cerebral grey matter volume using voxel-based morphometry (VBM). EMG results replicated impaired voluntary emotion expression in HD. Critically, voluntary imitation and spontaneous mimicry were equally impaired and correlated with impaired recognition. By contrast, alexithymia scores were normal, suggesting that emotion representations on the level of internal experience might be spared. Recognition correlated with brain volume in the caudate as well as in areas previously associated with shared action representations, namely somatosensory, posterior parietal, posterior superior temporal sulcus (pSTS) and subcentral sulcus. Together, these findings indicate that in these patients emotion deficits might be tied to the "motoric level" of emotion expression. Such a double

  2. Impaired recognition of happy facial expressions in bipolar disorder.

    Science.gov (United States)

    Lawlor-Savage, Linette; Sponheim, Scott R; Goghari, Vina M

    2014-08-01

    The ability to accurately judge facial expressions is important in social interactions. Individuals with bipolar disorder have been found to be impaired in emotion recognition; however, the specifics of the impairment are unclear. This study investigated whether facial emotion recognition difficulties in bipolar disorder reflect general cognitive, or emotion-specific, impairments. Impairment in the recognition of particular emotions and the role of processing speed in facial emotion recognition were also investigated. Clinically stable bipolar patients (n = 17) and healthy controls (n = 50) judged five facial expressions in two presentation types, time-limited and self-paced. An age recognition condition was used as an experimental control. Bipolar patients' overall facial recognition ability was unimpaired. However, patients' specific ability to judge happy expressions under time constraints was impaired. Findings suggest a deficit in happy emotion recognition impacted by processing speed. Given the limited sample size, further investigation with a larger patient sample is warranted.

  3. Impaired Attribution of Emotion to Facial Expressions in Anxiety and Major Depression

    NARCIS (Netherlands)

    Demenescu, Liliana R.; Kortekaas, Rudie; den Boer, Johan A.; Aleman, Andre

    2010-01-01

    Background: Recognition of others' emotions is an important aspect of interpersonal communication. In major depression, a significant emotion recognition impairment has been reported. It remains unclear whether the ability to recognize emotion from facial expressions is also impaired in anxiety

  4. Emotion recognition in Chinese people with schizophrenia.

    Science.gov (United States)

    Chan, Chetwyn C H; Wong, Raymond; Wang, Kai; Lee, Tatia M C

    2008-01-15

    This study examined whether people with paranoid or nonparanoid schizophrenia would show emotion-recognition deficits, both facial and prosodic. Furthermore, this study examined the neuropsychological predictors of emotion-recognition ability in people with schizophrenia. Participants comprised 86 people: 43 diagnosed with schizophrenia and 43 controls. The 43 clinical participants were placed in either the paranoid group (n=19) or the nonparanoid group (n=24). Each participant was administered the Facial Emotion Recognition task and the Prosodic Recognition task, together with other neuropsychological measures of attention and visual perception. People with nonparanoid schizophrenia were found to have deficits in both facial and prosodic emotion recognition, after correction for the differences in intelligence and depression scores between the two groups. Furthermore, spatial perception was observed to be the best predictor of facial emotion identification in individuals with nonparanoid schizophrenia, whereas attentional processing control predicted both prosodic emotion identification and discrimination in nonparanoid schizophrenia patients. Our findings suggest that patients with schizophrenia in remission may still suffer from impairment of certain aspects of emotion recognition.

  5. FILTWAM and Voice Emotion Recognition

    NARCIS (Netherlands)

    Bahreini, Kiavash; Nadolski, Rob; Westera, Wim

    2014-01-01

    This paper introduces the voice emotion recognition part of our framework for improving learning through webcams and microphones (FILTWAM). This framework enables multimodal emotion recognition of learners during game-based learning. The main goal of this study is to validate the use of microphone

  6. A quasi-randomized feasibility pilot study of specific treatments to improve emotion recognition and mental-state reasoning impairments in schizophrenia.

    Science.gov (United States)

    Marsh, Pamela Jane; Polito, Vince; Singh, Subba; Coltheart, Max; Langdon, Robyn; Harris, Anthony W

    2016-10-24

    Impaired ability to make inferences about what another person might think or feel (i.e., social cognition impairment) is recognised as a core feature of schizophrenia and a key determinant of the poor social functioning that characterizes this illness. The development of treatments to target social cognitive impairments as a causal factor of impaired functioning in schizophrenia is of high priority. In this study, we investigated the acceptability, feasibility, and limited efficacy of 2 programs targeted at specific domains of social cognition in schizophrenia: "SoCog" Mental-State Reasoning Training (SoCog-MSRT) and "SoCog" Emotion Recognition Training (SoCog-ERT). Thirty-one participants with schizophrenia or schizoaffective disorder were allocated to either SoCog-MSRT (n = 19) or SoCog-ERT (n = 12). Treatment comprised 12 twice-weekly sessions over 6 weeks. Participants underwent assessments of social cognition, neurocognition and symptoms at baseline, post-training and 3 months after completing training. Attendance at training sessions was high, with an average of 89.29% in the SoCog-MSRT group and 85.42% in the SoCog-ERT group. Participants also reported the 2 programs as enjoyable and beneficial. Both SoCog-MSRT and SoCog-ERT groups showed increased scores on a false belief reasoning task and the Reading the Mind in the Eyes test. The SoCog-MSRT group also showed reduced personalising attributional biases in a small number of participants, while the SoCog-ERT group showed improved emotion recognition. The results are promising and support the feasibility and acceptability of the 2 SoCog programs, as well as limited efficacy in improving social cognitive abilities in schizophrenia. There is also some evidence that skills for the recognition of basic facial expressions need specific training. Australian New Zealand Clinical Trials Registry ACTRN12613000978763. Retrospectively registered 3/09/2013.

  7. The recognition of facial emotion expressions in Parkinson's disease.

    Science.gov (United States)

    Assogna, Francesca; Pontieri, Francesco E; Caltagirone, Carlo; Spalletta, Gianfranco

    2008-11-01

    A limited number of studies in Parkinson's Disease (PD) suggest a disturbance of recognition of facial emotion expressions. In particular, disgust recognition impairment has been reported in unmedicated and medicated PD patients. However, the results are rather inconclusive regarding the degree and selectivity of the emotion recognition impairment, and an associated impairment of almost all basic facial emotions in PD has also been described. Few studies have investigated the relationship with neuropsychiatric and neuropsychological symptoms, with mainly negative results. This inconsistency may be due to many different problems, such as emotion assessment, perception deficits, cognitive impairment, behavioral symptoms, illness severity and antiparkinsonian therapy. Here we review the clinical characteristics and neural structures involved in the recognition of specific facial emotion expressions, and the plausible role of dopamine transmission and dopamine replacement therapy in these processes. It is clear that future studies should be directed to clarify all these issues.

  8. Facial emotion recognition in paranoid schizophrenia and autism spectrum disorder.

    Science.gov (United States)

    Sachse, Michael; Schlitt, Sabine; Hainz, Daniela; Ciaramidaro, Angela; Walter, Henrik; Poustka, Fritz; Bölte, Sven; Freitag, Christine M

    2014-11-01

    Schizophrenia (SZ) and autism spectrum disorder (ASD) share deficits in emotion processing. In order to identify convergent and divergent mechanisms, we investigated facial emotion recognition in SZ, high-functioning ASD (HFASD), and typically developed controls (TD). Different degrees of task difficulty and emotion complexity (face, eyes; basic emotions, complex emotions) were used. Two Benton tests were administered to assess potentially confounding visuo-perceptual functioning and face processing. Nineteen participants with paranoid SZ, 22 with HFASD and 20 TD were included, aged between 14 and 33 years. Individuals with SZ were comparable to TD in all obtained emotion recognition measures, but showed reduced basic visuo-perceptual abilities. The HFASD group was impaired in the recognition of basic and complex emotions compared to both SZ and TD. When facial identity recognition was adjusted for, group differences remained for the recognition of complex emotions only. Our results suggest that there is a SZ subgroup with predominantly paranoid symptoms that does not show problems in face processing and emotion recognition, but visuo-perceptual impairments. They also confirm the notion of a general face and emotion recognition deficit in HFASD. No shared emotion recognition deficit was found for paranoid SZ and HFASD, emphasizing the differential cognitive underpinnings of both disorders. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Acoustic modeling for emotion recognition

    CERN Document Server

    Anne, Koteswara Rao; Vankayalapati, Hima Deepthi

    2015-01-01

    This book presents state-of-the-art research in speech emotion recognition. Readers are first presented with basic research and applications; progressively more advanced information follows, giving readers comprehensive guidance for classifying emotions from speech. Simulated databases are used and results extensively compared, with the features and algorithms implemented in MATLAB. Various emotion recognition models, such as Linear Discriminant Analysis (LDA), Regularized Discriminant Analysis (RDA), Support Vector Machines (SVM) and K-Nearest Neighbor (KNN), are explored in detail using prosody and spectral features, as well as feature fusion techniques.
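    Of the models listed, KNN is the simplest to sketch. The toy example below (pure Python rather than the book's MATLAB; the prosody-style features and their values are invented for illustration, not taken from the book's databases) classifies an utterance-level feature vector by majority vote among its nearest neighbors:

```python
from collections import Counter
from math import dist  # Euclidean distance (Python 3.8+)

def knn_predict(train_X, train_y, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training examples (Euclidean distance) - the KNN model."""
    neighbors = sorted(zip(train_X, train_y), key=lambda p: dist(p[0], x))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy prosody-style features: (mean pitch in Hz, energy, speaking rate).
train_X = [(220.0, 0.9, 5.1), (230.0, 0.8, 5.3),  # anger: high pitch/energy
           (110.0, 0.2, 2.9), (105.0, 0.3, 3.1),  # sadness: low pitch/energy
           (200.0, 0.7, 4.0), (190.0, 0.6, 4.2)]  # happiness: in between
train_y = ["anger", "anger", "sadness", "sadness", "happiness", "happiness"]

print(knn_predict(train_X, train_y, (215.0, 0.85, 5.0)))  # → anger
```

    The same scaffold works for the other classifiers the book covers; only the decision rule changes.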

  10. An audiovisual emotion recognition system

    Science.gov (United States)

    Han, Yi; Wang, Guoyin; Yang, Yong; He, Kun

    2007-12-01

    Human emotions can be expressed through many biological signals; speech and facial expression are two of them. Both are regarded as emotional information, which plays an important role in human-computer interaction. Based on our previous studies on emotion recognition, an audiovisual emotion recognition system is developed and presented in this paper. The system is designed for real-time practice and is supported by several integrated modules: speech enhancement for eliminating noise, rapid face detection for locating the face in the background image, example-based shape learning for facial feature alignment, and an optical flow-based tracking algorithm for facial feature tracking. It is known that irrelevant features and high dimensionality of the data can hurt the performance of a classifier, and rough set-based feature selection is a good method for dimension reduction. Accordingly, 13 of 37 speech features and 10 of 33 facial features are selected to represent emotional information, and 52 audiovisual features are selected when speech and video are fused and synchronized. The experimental results demonstrate that this system performs well in real-time practice and has a high recognition rate. Our results also show that multimodal fused recognition will become the trend of emotion recognition in the future.
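    The abstract's point that irrelevant, high-dimensional features hurt a classifier motivates the selection step. The paper uses rough set-based selection; as a simpler stand-in, the sketch below ranks candidate features by a Fisher-style separability score and keeps the top k (the two-class toy data and all values are invented):

```python
def fisher_score(values_by_class):
    """Fisher criterion for one feature: variance of the class means
    divided by the mean within-class variance (higher = more discriminative)."""
    means = [sum(v) / len(v) for v in values_by_class]
    overall = sum(means) / len(means)
    between = sum((m - overall) ** 2 for m in means) / len(means)
    within = sum(
        sum((x - m) ** 2 for x in v) / len(v)
        for v, m in zip(values_by_class, means)
    ) / len(values_by_class)
    return between / within if within else float("inf")

def select_top_k(X_by_class, k):
    """Return the indices of the k highest-scoring features, in order."""
    n_feats = len(X_by_class[0][0])
    scores = []
    for j in range(n_feats):
        per_class = [[row[j] for row in rows] for rows in X_by_class]
        scores.append((fisher_score(per_class), j))
    return sorted(j for _, j in sorted(scores, reverse=True)[:k])

# Toy data: two emotion classes, three candidate features. Features 0 and 2
# separate the classes; feature 1 is noise.
X_by_class = [
    [(1.0, 5.0, 10.0), (1.2, 4.0, 11.0)],  # class A samples
    [(3.0, 5.1, 2.0), (3.2, 4.2, 3.0)],    # class B samples
]
print(select_top_k(X_by_class, 2))  # → [0, 2]
```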

  11. The automaticity of emotion recognition.

    Science.gov (United States)

    Tracy, Jessica L; Robins, Richard W

    2008-02-01

    Evolutionary accounts of emotion typically assume that humans evolved to quickly and efficiently recognize emotion expressions because these expressions convey fitness-enhancing messages. The present research tested this assumption in 2 studies. Specifically, the authors examined (a) how quickly perceivers could recognize expressions of anger, contempt, disgust, embarrassment, fear, happiness, pride, sadness, shame, and surprise; (b) whether accuracy is improved when perceivers deliberate about each expression's meaning (vs. respond as quickly as possible); and (c) whether accurate recognition can occur under cognitive load. Across both studies, perceivers quickly and efficiently (i.e., under cognitive load) recognized most emotion expressions, including the self-conscious emotions of pride, embarrassment, and shame. Deliberation improved accuracy in some cases, but these improvements were relatively small. Discussion focuses on the implications of these findings for the cognitive processes underlying emotion recognition.

  12. Recognition of Face and Emotional Facial Expressions in Autism

    Directory of Open Access Journals (Sweden)

    Muhammed Tayyib Kadak

    2013-03-01

    Autism is a genetically transmitted neurodevelopmental disorder characterized by severe and permanent deficits in many areas of interpersonal relations, such as communication, social interaction and emotional responsiveness. Patients with autism have deficits in face recognition, eye contact and recognition of emotional expressions. Both face recognition and the recognition of facial emotion rely on face processing. Structural and functional impairment in the fusiform gyrus, amygdala, superior temporal sulcus and other brain regions leads to deficits in the recognition of faces and facial emotion. Studies therefore suggest that face processing deficits result in problems in the areas of social interaction and emotion in autism. Studies revealed that children with autism have problems recognizing facial expressions and rely on the mouth region more than the eye region. It was also shown that autistic patients interpret ambiguous expressions as negative emotions. Deficits at various stages of face processing, such as gaze detection, face identity recognition and recognition of emotional expression, have been identified in autism so far. Social interaction impairments in autism spectrum disorders originate from face processing deficits during infancy, childhood and adolescence. Recognition of faces and facial emotional expressions could be shaped either automatically, by orienting towards faces after birth, or by "learning" processes during development, such as identity and emotion processing. This article reviews the neurobiological basis of face processing and the recognition of emotional facial expressions during normal development and in autism.

  13. Brain Structural Correlates of Emotion Recognition in Psychopaths.

    Directory of Open Access Journals (Sweden)

    Vanessa Pera-Guardiola

    Individuals with psychopathy present deficits in the recognition of facial emotional expressions. However, the nature and extent of these alterations are not fully understood. Furthermore, available data on the functional neural correlates of emotional face recognition deficits in adult psychopaths have provided mixed results. In this context, emotional face morphing tasks may be suitable for clarifying mild and emotion-specific impairments in psychopaths. Likewise, studies exploring corresponding anatomical correlates may be useful for disentangling available neurofunctional evidence based on the alleged neurodevelopmental roots of psychopathic traits. We used Voxel-Based Morphometry and a morphed emotional face expression recognition task to evaluate the relationship between regional gray matter (GM) volumes and facial emotion recognition deficits in male psychopaths. In comparison to male healthy controls, psychopaths showed deficits in the recognition of sad, happy and fearful emotional expressions. In subsequent brain imaging analyses, psychopaths with better recognition of facial emotional expressions showed higher volume in the prefrontal cortex (orbitofrontal, inferior frontal and dorsomedial prefrontal cortices), somatosensory cortex, anterior insula, cingulate cortex and the posterior lobe of the cerebellum. Amygdala and temporal lobe volumes contributed to better emotional face recognition in controls only. These findings provide evidence that variability in brain morphometry plays a role in accounting for psychopaths' impaired ability to recognize emotional face expressions, and may have implications for comprehensively characterizing the empathy and social cognition dysfunctions typically observed in this population.

  14. Textual emotion recognition for enhancing enterprise computing

    Science.gov (United States)

    Quan, Changqin; Ren, Fuji

    2016-05-01

    The growing interest in affective computing (AC) brings many valuable research topics that can meet different application demands in enterprise systems. The present study explores a subarea of AC techniques: textual emotion recognition for enhancing enterprise computing. Multi-label emotion recognition in text provides a more comprehensive understanding of emotions than single-label emotion recognition. A representation of 'emotion state in text' is proposed to encompass the multidimensional emotions in text; it formally describes the configurations of basic emotions as well as the relations between them. Our method allows recognition of emotions for words that bear indirect emotions, emotion ambiguity and multiple emotions. We further investigate the effect of word order on emotional expression by comparing the performance of a bag-of-words model and a sequence model for multi-label sentence emotion recognition. The experiments show that classification results under the sequence model are better than under the bag-of-words model, and a homogeneous Markov model showed promising results for multi-label sentence emotion recognition. This emotion recognition system provides a convenient way to acquire valuable emotion information and to improve enterprise competitive ability in many respects.
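    A minimal sketch of the multi-label idea, assuming a hypothetical word-level emotion lexicon (the paper's representation and its Markov sequence model are far more elaborate): word scores are pooled into a sentence-level "emotion state", and every label above a threshold is returned, so one sentence can carry several emotions at once.

```python
# Hypothetical toy lexicon: each word maps to scores over basic emotions.
# A real system would learn these associations from annotated corpora.
LEXICON = {
    "won": {"joy": 0.9},
    "ill": {"sadness": 0.8, "fear": 0.4},
}

def emotion_state(words, threshold=0.4):
    """Pool word-level scores into a sentence-level 'emotion state'
    (max score per label) and return every label at or above the
    threshold - a multi-label rather than single-label output."""
    state = {}
    for w in words:
        for emo, score in LEXICON.get(w.lower(), {}).items():
            state[emo] = max(state.get(emo, 0.0), score)
    return sorted(e for e, s in state.items() if s >= threshold)

print(emotion_state("We won but father is ill".split()))  # → ['fear', 'joy', 'sadness']
```

    This bag-of-words pooling ignores word order; the paper's finding is precisely that a sequence model over such states does better.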

  15. Deficits in Facial Emotion Recognition Indicate Behavioral Changes and Impaired Self-Awareness after Moderate to Severe Traumatic Brain Injury

    OpenAIRE

    Spikman, Jacoba M.; Milders, Maarten V.; Visser-Keizer, Annemarie C.; Westerhof-Evers, Herma J.; Herben-Dekker, Meike; van der Naalt, Joukje

    2013-01-01

    Traumatic brain injury (TBI) is a leading cause of disability, specifically among younger adults. Behavioral changes are common after moderate to severe TBI and have adverse consequences for social and vocational functioning. It is hypothesized that deficits in social cognition, including facial affect recognition, might underlie these behavioral changes. Measurement of behavioral deficits is complicated, because the rating scales used rely on subjective judgement, often lack specificity and ...

  16. Serotonergic modulation of face-emotion recognition

    Directory of Open Access Journals (Sweden)

    C.M. Del-Ben

    2008-04-01

    Facial expressions of basic emotions have been widely used to investigate the neural substrates of emotion processing, but little is known about the exact meaning of subjective changes provoked by perceiving facial expressions. Our assumption was that fearful faces would be related to the processing of potential threats, whereas angry faces would be related to the processing of proximal threats. Experimental studies have suggested that serotonin modulates the brain processes underlying defensive responses to environmental threats, facilitating risk assessment behavior elicited by potential threats and inhibiting fight or flight responses to proximal threats. In order to test these predictions about the relationship between fearful and angry faces and defensive behaviors, we carried out a review of the literature about the effects of pharmacological probes that affect 5-HT-mediated neurotransmission on the perception of emotional faces. The hypothesis that angry faces would be processed as a proximal threat and that, as a consequence, their recognition would be impaired by an increase in 5-HT function was not supported by the results reviewed. In contrast, most of the studies that evaluated the behavioral effects of serotonin challenges showed that increased 5-HT neurotransmission facilitates the recognition of fearful faces, whereas its decrease impairs the same performance. These results agree with the hypothesis that fearful faces are processed as potential threats and that 5-HT enhances this brain processing.

  17. Does comorbid anxiety counteract emotion recognition deficits in conduct disorder?

    Science.gov (United States)

    Short, Roxanna M L; Sonuga-Barke, Edmund J S; Adams, Wendy J; Fairchild, Graeme

    2016-08-01

    Previous research has reported altered emotion recognition in both conduct disorder (CD) and anxiety disorders (ADs) - but these effects appear to be of different kinds. Adolescents with CD often show a generalised pattern of deficits, while those with ADs show hypersensitivity to specific negative emotions. Although these conditions often co-occur, little is known regarding emotion recognition performance in comorbid CD+ADs. Here, we test the hypothesis that in the comorbid case, anxiety-related emotion hypersensitivity counteracts the emotion recognition deficits typically observed in CD. We compared facial emotion recognition across four groups of adolescents aged 12-18 years: those with CD alone (n = 28), ADs alone (n = 23), co-occurring CD+ADs (n = 20) and typically developing controls (n = 28). The emotion recognition task we used systematically manipulated the emotional intensity of facial expressions as well as fixation location (eye, nose or mouth region). Conduct disorder was associated with a generalised impairment in emotion recognition; however, this may have been modulated by group differences in IQ. AD was associated with increased sensitivity to low-intensity happiness, disgust and sadness. In general, the comorbid CD+ADs group performed similarly to typically developing controls. Although CD alone was associated with emotion recognition impairments, ADs and comorbid CD+ADs were associated with normal or enhanced emotion recognition performance. The presence of comorbid ADs appeared to counteract the effects of CD, suggesting a potentially protective role, although future research should examine the contribution of IQ and gender to these effects. © 2016 Association for Child and Adolescent Mental Health.

  18. Recognition of facial and musical emotions in Parkinson's disease.

    Science.gov (United States)

    Saenz, A; Doé de Maindreville, A; Henry, A; de Labbey, S; Bakchine, S; Ehrlé, N

    2013-03-01

    Patients with amygdala lesions were found to be impaired in recognizing the fear emotion both from face and from music. In patients with Parkinson's disease (PD), impairment in recognition of emotions from facial expressions was reported for disgust, fear, sadness and anger, but no studies had yet investigated this population for the recognition of emotions from both face and music. The ability to recognize basic universal emotions (fear, happiness and sadness) from both face and music was investigated in 24 medicated patients with PD and 24 healthy controls. The patient group was tested for language (verbal fluency tasks), memory (digit and spatial span), executive functions (Similarities and Picture Completion subtests of the WAIS III, Brixton and Stroop tests), and visual attention (Bells test), and completed self-assessment scales for anxiety and depression. Results showed that the PD group was significantly impaired in the recognition of both fear and sadness from facial expressions, whereas their performance in recognition of emotions from musical excerpts was not different from that of the control group. The scores for fear and sadness recognition from faces were neither correlated with scores in tests of executive and cognitive functions, nor with scores on the self-assessment scales. We attributed the observed dissociation to the modality (visual vs. auditory) of presentation and to the ecological value of the musical stimuli that we used. We discuss the relevance of our findings for the care of patients with PD. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.

  19. The Differential Effects of Thalamus and Basal Ganglia on Facial Emotion Recognition

    Science.gov (United States)

    Cheung, Crystal C. Y.; Lee, Tatia M. C.; Yip, James T. H.; King, Kristin E.; Li, Leonard S. W.

    2006-01-01

    This study examined if subcortical stroke was associated with impaired facial emotion recognition. Furthermore, the lateralization of the impairment and the differential profiles of facial emotion recognition deficits with localized thalamic or basal ganglia damage were also studied. Thirty-eight patients with subcortical strokes and 19 matched…

  20. Food-Induced Emotional Resonance Improves Emotion Recognition

    OpenAIRE

    Pandolfi, Elisa; Sacripante, Riccardo; Cardini, Flavia

    2016-01-01

    The effect of food substances on emotional states has been widely investigated, showing, for example, that eating chocolate is able to reduce negative mood. Here, for the first time, we have shown that the consumption of specific food substances is not only able to induce particular emotional states, but more importantly, to facilitate recognition of corresponding emotional facial expressions in others. Participants were asked to perform an emotion recognition task before and after eating eit...

  1. Emotion recognition in borderline personality disorder: effects of emotional information on negative bias.

    Science.gov (United States)

    Fenske, Sabrina; Lis, Stefanie; Liebke, Lisa; Niedtfeld, Inga; Kirsch, Peter; Mier, Daniela

    2015-01-01

    Borderline Personality Disorder (BPD) is characterized by severe deficits in social interactions, which might be linked to deficits in emotion recognition. Research on emotion recognition abilities in BPD has revealed heterogeneous results, ranging from deficits to heightened sensitivity. The most stable findings point to an impairment in the evaluation of neutral facial expressions as neutral, as well as to a negative bias in emotion recognition; that is, the tendency to attribute negative emotions to neutral expressions, or, in a broader sense, to report a more negative emotion category than the one depicted. However, it remains unclear which contextual factors influence the occurrence of this negative bias. Previous studies suggest that priming by preceding emotional information and constrained processing time might augment the emotion recognition deficit in BPD. To test these assumptions, 32 female BPD patients and 31 healthy females, matched for age and education, participated in an emotion recognition study in which every facial expression was preceded by either a positive, neutral or negative scene. Furthermore, time constraints for processing were varied by presenting the facial expressions with short (100 ms) or long duration (up to 3000 ms) in two separate blocks. BPD patients showed a significant deficit in emotion recognition for neutral and positive facial expressions, associated with a significant negative bias. In BPD patients, this emotion recognition deficit was differentially affected by preceding emotional information and time constraints, with a greater influence of emotional information during long face presentations and a greater influence of neutral information during short face presentations. Our results are in line with previous findings supporting the existence of a negative bias in emotion recognition in BPD patients, and provide further insights into biased social perception in BPD patients.

  2. Does cortisol modulate emotion recognition and empathy?

    Science.gov (United States)

    Duesenberg, Moritz; Weber, Juliane; Schulze, Lars; Schaeuffele, Carmen; Roepke, Stefan; Hellmann-Regen, Julian; Otte, Christian; Wingenfeld, Katja

    2016-04-01

    Emotion recognition and empathy are important aspects of interacting with and understanding other people's behaviors and feelings. The human environment comprises stressful situations that affect social interactions on a daily basis. The aim of the study was to examine the effects of the stress hormone cortisol on emotion recognition and empathy. In this placebo-controlled study, 40 healthy men and 40 healthy women (mean age 24.5 years) received either 10 mg of hydrocortisone or placebo. We used the Multifaceted Empathy Test to measure emotional and cognitive empathy. Furthermore, we examined emotion recognition from facial expressions, which contained two emotions (anger and sadness) and two emotion intensities (40% and 80%). We did not find a main effect of treatment or sex on either empathy or emotion recognition, but we did find a sex × emotion interaction on emotion recognition. The main result was a four-way interaction on emotion recognition including treatment, sex, emotion and task difficulty. At 40% task difficulty, women recognized angry faces better than men in the placebo condition. Furthermore, in the placebo condition, men recognized sadness better than anger. At 80% task difficulty, men and women performed equally well in recognizing sad faces, but men performed worse than women with regard to angry faces. Our results therefore did not support the hypothesis that increases in cortisol concentration alone influence empathy and emotion recognition in healthy young individuals. However, sex and task difficulty appear to be important variables in emotion recognition from facial expressions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Novel acoustic features for speech emotion recognition

    Institute of Scientific and Technical Information of China (English)

    ROH Yong-Wan; KIM Dong-Ju; LEE Woo-Seok; HONG Kwang-Seok

    2009-01-01

    This paper focuses on acoustic features that effectively improve the recognition of emotion in human speech. The novel features in this paper are based on spectral-based entropy parameters such as fast Fourier transform (FFT) spectral entropy, delta FFT spectral entropy, Mel-frequency filter bank (MFB) spectral entropy, and delta MFB spectral entropy. Spectral-based entropy features are simple; they reflect the frequency characteristics of speech and how those characteristics change over time. We implement an emotion rejection module using the probability distributions of recognized-scores and rejected-scores, which reduces the false recognition rate and improves overall performance. Recognized-scores and rejected-scores refer to the probabilities of recognized and rejected emotion recognition results, respectively; these scores are first obtained from a pattern recognition procedure. The pattern recognition phase uses a Gaussian mixture model (GMM). We classify four emotional states: anger, sadness, happiness and neutrality. The proposed method is evaluated using 45 sentences per emotion for 30 subjects, 15 males and 15 females. Experimental results show that the proposed method is superior to existing emotion recognition methods based on GMM using energy, zero crossing rate (ZCR), linear prediction coefficient (LPC), and pitch parameters. One of the proposed features, the combined MFB and delta MFB spectral entropy, improves performance by approximately 10% compared to the existing feature parameters for speech emotion recognition. The applied emotion rejection of low-confidence scores yields a further 4% performance improvement.
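    FFT spectral entropy as described here can be sketched directly: treat the normalized power spectrum of a frame as a probability distribution and take its Shannon entropy; the delta feature is the frame-to-frame change. The example below uses a naive DFT in pure Python (the paper works with FFTs and Mel filter banks; the frame length and test signals are invented):

```python
from math import cos, sin, log2, pi
import random

def dft_power(frame):
    """Power spectrum of a real-valued frame via a naive DFT
    (fine for a sketch; a real system would use an FFT)."""
    n = len(frame)
    spec = []
    for k in range(n // 2):
        re = sum(x * cos(2 * pi * k * t / n) for t, x in enumerate(frame))
        im = sum(-x * sin(2 * pi * k * t / n) for t, x in enumerate(frame))
        spec.append(re * re + im * im)
    return spec

def spectral_entropy(frame):
    """Shannon entropy (bits) of the normalized power spectrum: peaky,
    tone-like spectra score low; flat, noise-like spectra score high."""
    power = dft_power(frame)
    total = sum(power)
    probs = [p / total for p in power if p > 0]
    return -sum(p * log2(p) for p in probs)

def delta_entropy(frames):
    """Frame-to-frame change in spectral entropy (a 'delta' feature)."""
    ents = [spectral_entropy(f) for f in frames]
    return [b - a for a, b in zip(ents, ents[1:])]

tone = [sin(2 * pi * 4 * t / 64) for t in range(64)]  # pure tone, 4 cycles
random.seed(0)
noise = [random.uniform(-1, 1) for _ in range(64)]    # noise-like frame
print(spectral_entropy(tone) < spectral_entropy(noise))  # True
```

    Such per-frame entropies (and their deltas) would then feed the GMM classifier as feature vectors.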

  4. Effects of cue modality and emotional category on recognition of nonverbal emotional signals in schizophrenia.

    Science.gov (United States)

    Vogel, Bastian D; Brück, Carolin; Jacob, Heike; Eberle, Mark; Wildgruber, Dirk

    2016-07-07

    Impaired interpretation of nonverbal emotional cues in patients with schizophrenia has been reported in several studies, and a clinical relevance of these deficits for social functioning has been assumed. However, it is unclear to what extent the impairments depend on specific emotions or specific channels of nonverbal communication. Here, the effect of cue modality and emotional category on the accuracy of emotion recognition was evaluated in 21 patients with schizophrenia and compared to a healthy control group (n = 21). To this end, dynamic stimuli comprising speakers of both genders in three different sensory modalities (auditory, visual and audiovisual) and five emotional categories (happy, alluring, neutral, angry and disgusted) were used. Patients with schizophrenia were found to be impaired in emotion recognition in comparison to the control group across all stimuli. Considering specific emotions, more severe deficits were revealed in the recognition of alluring stimuli and less severe deficits in the recognition of disgusted stimuli as compared to all other emotions. Regarding cue modality, the extent of the impairment in emotion recognition did not significantly differ between auditory and visual cues across all emotional categories. However, patients with schizophrenia showed significantly more severe disturbances for vocal as compared to facial cues when sexual interest is expressed (alluring stimuli), whereas more severe disturbances for facial as compared to vocal cues were observed when happiness or anger is expressed. Our results confirmed that perceptual impairments can be observed for vocal as well as facial cues conveying various social and emotional connotations. The observed differences in severity of impairments, with the most severe deficits for alluring expressions, might be related to specific difficulties in recognizing the complex social emotional information of interpersonal intentions as compared to "basic" emotional states. Therefore

  5. Evaluating music emotion recognition: Lessons from music genre recognition?

    OpenAIRE

    Sturm, Bob L.

    2013-01-01

    A fundamental problem with nearly all work in music genre recognition (MGR) is that evaluation lacks validity with respect to the principal goals of MGR. This problem also occurs in the evaluation of music emotion recognition (MER). Standard approaches to evaluation, though easy to implement, do not reliably differentiate between recognizing genre or emotion from music, or by virtue of confounding factors in signals (e.g., equalization). We demonstrate such problems for evaluating an MER syste...

  6. Oxytocin improves emotion recognition for older males.

    Science.gov (United States)

    Campbell, Anna; Ruffman, Ted; Murray, Janice E; Glue, Paul

    2014-10-01

    Older adults (≥60 years) perform worse than young adults (18-30 years) when recognizing facial expressions of emotion. One hypothesized cause of these changes is a decline in neurotransmitters that could affect information processing within the brain. In the present study, we examined the neuropeptide oxytocin, which functions to increase neurotransmission. Research suggests that oxytocin benefits the emotion recognition of less socially able individuals. Men tend to have lower levels of oxytocin, and older men tend to have worse emotion recognition than older women; therefore, there is reason to think that older men will be particularly likely to benefit from oxytocin. We examined this idea using a double-blind design, testing 68 older and 68 young adults randomly allocated to receive oxytocin nasal spray (20 international units) or placebo. Forty-five minutes afterward they completed an emotion recognition task assessing labeling accuracy for angry, disgusted, fearful, happy, neutral, and sad faces. Older males receiving oxytocin showed improved emotion recognition relative to those taking placebo. No differences were found for older females or young adults. We hypothesize that oxytocin facilitates emotion recognition by improving neurotransmission in the group with the worst emotion recognition. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Temporal lobe structures and facial emotion recognition in schizophrenia patients and nonpsychotic relatives.

    Science.gov (United States)

    Goghari, Vina M; Macdonald, Angus W; Sponheim, Scott R

    2011-11-01

    Temporal lobe abnormalities and emotion recognition deficits are prominent features of schizophrenia and appear related to the diathesis of the disorder. This study investigated whether temporal lobe structural abnormalities were associated with facial emotion recognition deficits in schizophrenia and related to genetic liability for the disorder. Twenty-seven schizophrenia patients, 23 biological family members, and 36 controls participated. Several temporal lobe regions (fusiform, superior temporal, middle temporal, amygdala, and hippocampus) previously associated with face recognition in normative samples and found to be abnormal in schizophrenia were evaluated using volumetric analyses. Participants completed a facial emotion recognition task and an age recognition control task under time-limited and self-paced conditions. Temporal lobe volumes were tested for associations with task performance. Group status explained 23% of the variance in temporal lobe volume. Left fusiform gray matter volume was decreased by 11% in patients and 7% in relatives compared with controls. Schizophrenia patients additionally exhibited smaller hippocampal and middle temporal volumes. Patients were unable to improve facial emotion recognition performance with unlimited time to make a judgment but were able to improve age recognition performance. Patients additionally showed a relationship between reduced temporal lobe gray matter and poor facial emotion recognition. For the middle temporal lobe region, the relationship between greater volume and better task performance was specific to facial emotion recognition and not age recognition. Because schizophrenia patients exhibited a specific deficit in emotion recognition not attributable to a generalized impairment in face perception, impaired emotion recognition may serve as a target for interventions.

  8. Mapping correspondence between facial mimicry and emotion recognition in healthy subjects.

    Science.gov (United States)

    Ponari, Marta; Conson, Massimiliano; D'Amico, Nunzia Pina; Grossi, Dario; Trojano, Luigi

    2012-12-01

    We aimed at verifying the hypothesis that facial mimicry is causally and selectively involved in emotion recognition. For this purpose, in Experiment 1, we explored the effect of tonic contraction of muscles in upper or lower half of participants' face on their ability to recognize emotional facial expressions. We found that the "lower" manipulation specifically impaired recognition of happiness and disgust, the "upper" manipulation impaired recognition of anger, while both manipulations affected recognition of fear; recognition of surprise and sadness were not affected by either blocking manipulations. In Experiment 2, we verified whether emotion recognition is hampered by stimuli in which an upper or lower half-face showing an emotional expression is combined with a neutral half-face. We found that the neutral lower half-face interfered with recognition of happiness and disgust, whereas the neutral upper half impaired recognition of anger; recognition of fear and sadness was impaired by both manipulations, whereas recognition of surprise was not affected by either manipulation. Taken together, the present findings support simulation models of emotion recognition and provide insight into the role of mimicry in comprehension of others' emotional facial expressions. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  9. Progesterone impairs social recognition in male rats.

    Science.gov (United States)

    Bychowski, Meaghan E; Auger, Catherine J

    2012-04-01

    The influence of progesterone in the brain and on the behavior of females is fairly well understood. However, less is known about the effect of progesterone in the male system. In male rats, receptors for progesterone are present in virtually all vasopressin (AVP) immunoreactive cells in the bed nucleus of the stria terminalis (BST) and the medial amygdala (MeA). This colocalization functions to regulate AVP expression, as progesterone and/or progestin receptors (PRs) suppress AVP expression in these same extrahypothalamic regions in the brain. These data suggest that progesterone may influence AVP-dependent behavior. While AVP is implicated in numerous behavioral and physiological functions in rodents, AVP appears essential for social recognition of conspecifics. Therefore, we examined the effects of progesterone on social recognition. We report that progesterone plays an important role in modulating social recognition in the male brain, as progesterone treatment leads to a significant impairment of social recognition in male rats. Moreover, progesterone appears to act on PRs to impair social recognition, as progesterone impairment of social recognition is blocked by a PR antagonist, RU-486. Social recognition is also impaired by a specific progestin agonist, R5020. Interestingly, we show that progesterone does not interfere with either general memory or olfactory processes, suggesting that progesterone seems critically important to social recognition memory. These data provide strong evidence that physiological levels of progesterone can have an important impact on social behavior in male rats. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Speech emotion recognition methods: A literature review

    Science.gov (United States)

    Basharirad, Babak; Moradhaseli, Mohammadreza

    2017-10-01

    Recently, attention to emotional speech signals in human-machine interface research has grown, driven by the availability of high computational capability. Many systems have been proposed in the literature to identify the emotional state through speech. Selecting suitable feature sets, designing proper classification methods, and preparing an appropriate dataset are the main key issues of speech emotion recognition systems. This paper critically analyzes the currently available approaches to speech emotion recognition based on three evaluation parameters (feature set, classification of features, and accuracy). In addition, this paper also evaluates the performance and limitations of available methods. Furthermore, it highlights the current promising directions for improvement of speech emotion recognition systems.
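    The pipeline this review surveys (feature extraction, then a classifier trained on a labeled dataset) can be sketched minimally. The features (frame energy, zero-crossing rate), the nearest-centroid classifier, and the synthetic "dataset" below are illustrative stand-ins, not the methods evaluated in the paper:

```python
import numpy as np

def extract_features(signal, frame_len=256):
    """Utterance-level features: mean frame energy and mean zero-crossing rate."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, frame_len)]
    energy = np.mean([np.sum(f ** 2) for f in frames])
    zcr = np.mean([np.mean(np.abs(np.diff(np.sign(f)))) / 2 for f in frames])
    return np.array([energy, zcr])

class NearestCentroidClassifier:
    """Toy stand-in for the classification stage (GMMs, SVMs, etc. in the literature)."""
    def fit(self, X, y):
        self.classes_ = sorted(set(y))
        self.centroids_ = {c: np.mean([x for x, lab in zip(X, y) if lab == c], axis=0)
                           for c in self.classes_}
        return self

    def predict(self, X):
        return [min(self.classes_, key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in X]

# Synthetic "dataset": loud noisy signals stand in for anger, soft tonal ones for sadness.
rng = np.random.default_rng(0)
t = np.arange(4096)
angry = [rng.normal(scale=1.0, size=4096) for _ in range(5)]
sad = [0.1 * np.sin(2 * np.pi * 0.01 * t + rng.uniform(0, np.pi)) for _ in range(5)]
X = [extract_features(s) for s in angry + sad]
y = ["anger"] * 5 + ["sadness"] * 5
clf = NearestCentroidClassifier().fit(X, y)
print(clf.predict([extract_features(rng.normal(scale=1.0, size=4096))]))  # → ['anger']
```

    Real systems replace these two features with rich acoustic feature sets and the centroid rule with a trained statistical classifier, but the three "key issues" named in the abstract (features, classifier, dataset) map onto exactly these three parts of the sketch.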

  11. Food-Induced Emotional Resonance Improves Emotion Recognition.

    Science.gov (United States)

    Pandolfi, Elisa; Sacripante, Riccardo; Cardini, Flavia

    2016-01-01

    The effect of food substances on emotional states has been widely investigated, showing, for example, that eating chocolate is able to reduce negative mood. Here, for the first time, we have shown that the consumption of specific food substances is not only able to induce particular emotional states, but more importantly, to facilitate recognition of corresponding emotional facial expressions in others. Participants were asked to perform an emotion recognition task before and after eating either a piece of chocolate or a small amount of fish sauce, which we expected to induce happiness or disgust, respectively. Our results showed that being in a specific emotional state improves recognition of the corresponding emotional facial expression. Indeed, eating chocolate improved recognition of happy faces, while disgusted expressions were more readily recognized after eating fish sauce. In line with the embodied account of emotion understanding, we suggest that people are better at inferring the emotional state of others when their own emotional state resonates with the observed one.

  13. Emotion recognition from speech: tools and challenges

    Science.gov (United States)

    Al-Talabani, Abdulbasit; Sellahewa, Harin; Jassim, Sabah A.

    2015-05-01

    Human emotion recognition from speech is studied frequently for its importance in many applications, e.g. human-computer interaction. There is wide diversity and non-agreement about the basic emotions or emotion-related states on the one hand, and about where the emotion-related information lies in the speech signal on the other. These diversities motivate our investigations into extracting Meta-features using the PCA approach, or using a non-adaptive random projection (RP), which significantly reduce the high-dimensional speech feature vectors that may contain a wide range of emotion-related information. Subsets of Meta-features are fused to increase the performance of the recognition model, which adopts the score-based LDC classifier. We shall demonstrate that our scheme outperforms state-of-the-art results when tested on non-prompted databases or acted databases (i.e. when subjects act specific emotions while uttering a sentence). However, the huge gap between accuracy rates achieved on the different types of speech datasets raises questions about the way emotions modulate speech. In particular, we shall argue that emotion recognition from speech should not be dealt with as a classification problem. We shall demonstrate the presence of a spectrum of different emotions in the same speech portion, especially in the non-prompted datasets, which tend to be more "natural" than the acted datasets, where the subjects attempt to suppress all but one emotion.
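    The non-adaptive random projection mentioned in the abstract can be sketched in a few lines. The dimensions and random data below are arbitrary stand-ins, chosen only to show the property such projections rely on: pairwise distances between feature vectors are approximately preserved after the reduction, even though the projection matrix is drawn once without looking at the data.

```python
import numpy as np

rng = np.random.default_rng(42)
n, d_high, d_low = 30, 2000, 128

# High-dimensional vectors standing in for large speech feature vectors.
X = rng.normal(size=(n, d_high))

# Non-adaptive random projection: entries drawn once, independent of the data.
R = rng.normal(size=(d_high, d_low)) / np.sqrt(d_low)
X_low = X @ R

def pairwise_dists(A):
    diff = A[:, None, :] - A[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

orig, proj = pairwise_dists(X), pairwise_dists(X_low)
mask = ~np.eye(n, dtype=bool)
rel_err = np.abs(proj[mask] - orig[mask]) / orig[mask]
print(f"mean relative distance distortion: {rel_err.mean():.3f}")
```

    Because distances survive the projection, a distance-based score classifier trained on the reduced vectors behaves much like one trained on the originals, at a fraction of the dimensionality.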

  14. Music Education Intervention Improves Vocal Emotion Recognition

    Science.gov (United States)

    Mualem, Orit; Lavidor, Michal

    2015-01-01

    The current study is an interdisciplinary examination of the interplay among music, language, and emotions. It consisted of two experiments designed to investigate the relationship between musical abilities and vocal emotional recognition. In experiment 1 (N = 24), we compared the influence of two short-term intervention programs--music and…

  15. Novel acoustic features for speech emotion recognition

    Institute of Scientific and Technical Information of China (English)

    ROH Yong-Wan; KIM Dong-Ju; LEE Woo-Seok; HONG Kwang-Seok

    2009-01-01

    This paper focuses on acoustic features that effectively improve the recognition of emotion in human speech. The novel features in this paper are based on spectral entropy parameters such as fast Fourier transform (FFT) spectral entropy, delta FFT spectral entropy, Mel-frequency filter bank (MFB) spectral entropy, and delta MFB spectral entropy. Spectral entropy features are simple. They reflect the frequency characteristics of speech and how those characteristics change over time. We implement an emotion rejection module using the probability distributions of recognized-scores and rejected-scores. This reduces the false recognition rate and improves overall performance. Recognized-scores and rejected-scores refer to the probabilities of recognized and rejected emotion recognition results, respectively. These scores are first obtained from a pattern recognition procedure, which uses the Gaussian mixture model (GMM). We classify four emotional states: anger, sadness, happiness and neutrality. The proposed method is evaluated using 45 sentences per emotion for 30 subjects, 15 males and 15 females. Experimental results show that the proposed method is superior to existing emotion recognition methods based on GMM using energy, zero crossing rate (ZCR), linear prediction coefficients (LPC), and pitch parameters. One of the proposed features, the combined MFB and delta MFB spectral entropy, improves performance by approximately 10% compared to the existing feature parameters for speech emotion recognition. We also demonstrate a 4% performance improvement from applying emotion rejection to low-confidence scores.
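    The FFT spectral entropy and its delta variant follow directly from their definitions: the Shannon entropy of the normalized power spectrum of a frame, and the frame-to-frame difference of that value. The frame length and test signals below are illustrative; the MFB variants (not shown) would apply the same entropy to mel filter-bank energies instead of raw FFT bins.

```python
import numpy as np

def fft_spectral_entropy(frame):
    """Shannon entropy (bits) of the normalized FFT power spectrum of one frame."""
    power = np.abs(np.fft.rfft(frame)) ** 2
    p = power / power.sum()
    p = p[p > 0]                    # drop empty bins so log2 is defined
    return -np.sum(p * np.log2(p))

def delta(track):
    """First-order difference of a per-frame feature track (the 'delta' feature)."""
    return np.diff(track)

n = 256
tone = np.sin(2 * np.pi * 16 * np.arange(n) / n)   # energy in a single FFT bin
noise = np.random.default_rng(0).normal(size=n)    # energy spread over all bins
print(fft_spectral_entropy(tone), fft_spectral_entropy(noise))
```

    A pure tone concentrates its power in one bin and yields entropy near zero, while noise spreads power across bins and yields high entropy; voiced, harmonic speech sits between these extremes, which is what makes the feature discriminative.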

  16. Domain Adversarial for Acoustic Emotion Recognition

    OpenAIRE

    Abdelwahab, Mohammed; Busso, Carlos

    2018-01-01

    The performance of speech emotion recognition is affected by the differences in data distributions between the train (source domain) and test (target domain) sets used to build and evaluate the models. This is a common problem, as multiple studies have shown that the performance of emotional classifiers drops when they are exposed to data that does not match the distribution used to build the emotion classifiers. The difference in data distributions becomes very clear when the training and testing...

  17. Discrete Neural Correlates for the Recognition of Negative Emotions: Insights from Frontotemporal Dementia

    Science.gov (United States)

    Kumfor, Fiona; Irish, Muireann; Hodges, John R.; Piguet, Olivier

    2013-01-01

    Patients with frontotemporal dementia have pervasive changes in emotion recognition and social cognition, yet the neural changes underlying these emotion processing deficits remain unclear. The multimodal system model of emotion proposes that basic emotions are dependent on distinct brain regions, which undergo significant pathological changes in frontotemporal dementia. As such, this syndrome may provide important insight into the impact of neural network degeneration upon the innate ability to recognise emotions. This study used voxel-based morphometry to identify discrete neural correlates involved in the recognition of basic emotions (anger, disgust, fear, sadness, surprise and happiness) in frontotemporal dementia. Forty frontotemporal dementia patients (18 behavioural-variant, 11 semantic dementia, 11 progressive nonfluent aphasia) and 27 healthy controls were tested on two facial emotion recognition tasks: The Ekman 60 and Ekman Caricatures. Although each frontotemporal dementia group showed impaired recognition of negative emotions, distinct associations between emotion-specific task performance and changes in grey matter intensity emerged. Fear recognition was associated with the right amygdala; disgust recognition with the left insula; anger recognition with the left middle and superior temporal gyrus; and sadness recognition with the left subcallosal cingulate, indicating that discrete neural substrates are necessary for emotion recognition in frontotemporal dementia. The erosion of emotion-specific neural networks in neurodegenerative disorders may produce distinct profiles of performance that are relevant to understanding the neurobiological basis of emotion processing. PMID:23805313

  18. Emotion recognition and oxytocin in patients with schizophrenia

    Science.gov (United States)

    Averbeck, B. B.; Bobin, T.; Evans, S.; Shergill, S. S.

    2012-01-01

    Background Studies have suggested that patients with schizophrenia are impaired at recognizing emotions. Recently, it has been shown that the neuropeptide oxytocin can have beneficial effects on social behaviors. Method To examine emotion recognition deficits in patients and see whether oxytocin could improve these deficits, we carried out two experiments. In the first experiment we recruited 30 patients with schizophrenia and 29 age- and IQ-matched control subjects, and gave them an emotion recognition task. Following this, we carried out a second experiment in which we recruited 21 patients with schizophrenia for a double-blind, placebo-controlled cross-over study of the effects of oxytocin on the same emotion recognition task. Results In the first experiment we found that patients with schizophrenia had a deficit relative to controls in recognizing emotions. In the second experiment we found that administration of oxytocin improved the ability of patients to recognize emotions. The improvement was consistent and occurred for most emotions, and was present whether patients were identifying morphed or non-morphed faces. Conclusions These data add to a growing literature showing beneficial effects of oxytocin on social–behavioral tasks, as well as clinical symptoms. PMID:21835090

  19. Social appraisal influences recognition of emotions.

    Science.gov (United States)

    Mumenthaler, Christian; Sander, David

    2012-06-01

    The notion of social appraisal emphasizes the importance of a social dimension in appraisal theories of emotion by proposing that the way an individual appraises an event is influenced by the way other individuals appraise and feel about the same event. This study directly tested this proposal by asking participants to recognize dynamic facial expressions of emotion (fear, happiness, or anger in Experiment 1; fear, happiness, anger, or neutral in Experiment 2) in a target face presented at the center of a screen while a contextual face, which appeared simultaneously in the periphery of the screen, expressed an emotion (fear, happiness, anger) or not (neutral) and either looked at the target face or not. We manipulated gaze direction to be able to distinguish between a mere contextual effect (gaze away from both the target face and the participant) and a specific social appraisal effect (gaze toward the target face). Results of both experiments provided evidence for a social appraisal effect in emotion recognition, which differed from the mere effect of contextual information: Whereas facial expressions were identical in both conditions, the direction of the gaze of the contextual face influenced emotion recognition. Social appraisal facilitated the recognition of anger, happiness, and fear when the contextual face expressed the same emotion. This facilitation was stronger than the mere contextual effect. Social appraisal also allowed better recognition of fear when the contextual face expressed anger and better recognition of anger when the contextual face expressed fear. 2012 APA, all rights reserved

  20. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    Science.gov (United States)

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus, because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks, such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed under a promotion focus than under a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed, and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation durations on the face, which reflect a pattern of attention allocation matched to the eager strategy of a promotion focus (i.e., striving to make hits). A prevention focus had an impact neither on perceptual processing nor on facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  1. Recognition of Facial Expressions of Different Emotional Intensities in Patients with Frontotemporal Lobar Degeneration

    Directory of Open Access Journals (Sweden)

    Roy P. C. Kessels

    2007-01-01

    Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD). Also, FTLD patients show impairments in emotion processing. Specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more difficult to recognize than positive ones, which may thus have been a confounding factor in previous studies. Also, ceiling effects are often present on emotion recognition tasks using full-blown emotional facial expressions. In the present study with FTLD patients, we examined the perception of sadness, anger, fear, happiness, surprise and disgust at different emotional intensities on morphed facial expressions to take task difficulty into account. Results showed that our FTLD patients were specifically impaired at recognition of the emotion anger. The patients also performed worse than the controls on recognition of surprise, but performed at control levels on disgust, happiness, sadness and fear. These findings corroborate and extend previous results showing deficits in emotion perception in FTLD.

  2. Neural Substrates of Auditory Emotion Recognition Deficits in Schizophrenia.

    Science.gov (United States)

    Kantrowitz, Joshua T; Hoptman, Matthew J; Leitman, David I; Moreno-Ortega, Marta; Lehrfeld, Jonathan M; Dias, Elisa; Sehatpour, Pejman; Laukka, Petri; Silipo, Gail; Javitt, Daniel C

    2015-11-04

    Deficits in auditory emotion recognition (AER) are a core feature of schizophrenia and a key component of social cognitive impairment. AER deficits are tied behaviorally to impaired ability to interpret tonal ("prosodic") features of speech that normally convey emotion, such as modulations in base pitch (F0M) and pitch variability (F0SD). These modulations can be recreated using synthetic frequency modulated (FM) tones that mimic the prosodic contours of specific emotional stimuli. The present study investigates neural mechanisms underlying impaired AER using a combined event-related potential/resting-state functional connectivity (rsfMRI) approach in 84 schizophrenia/schizoaffective disorder patients and 66 healthy comparison subjects. Mismatch negativity (MMN) to FM tones was assessed in 43 patients/36 controls. rsfMRI between auditory cortex and medial temporal (insula) regions was assessed in 55 patients/51 controls. The relationship between AER, MMN to FM tones, and rsfMRI was assessed in the subset who performed all assessments (14 patients, 21 controls). As predicted, patients showed robust reductions in MMN across FM stimulus type (p = 0.005), particularly to modulations in F0M, along with impairments in AER and FM tone discrimination. MMN source analysis indicated dipoles in both auditory cortex and anterior insula, whereas rsfMRI analyses showed reduced auditory-insula connectivity. MMN to FM tones and functional connectivity together accounted for ∼50% of the variance in AER performance across individuals. These findings demonstrate that impaired preattentive processing of tonal information and reduced auditory-insula connectivity are critical determinants of social cognitive dysfunction in schizophrenia, and thus represent key targets for future research and clinical intervention. 

  3. Multimodal approaches for emotion recognition: a survey

    Science.gov (United States)

    Sebe, Nicu; Cohen, Ira; Gevers, Theo; Huang, Thomas S.

    2005-01-01

    Recent technological advances have enabled human users to interact with computers in ways previously unimaginable. Beyond the confines of the keyboard and mouse, new modalities for human-computer interaction such as voice, gesture, and force-feedback are emerging. Despite important advances, one necessary ingredient for natural interaction is still missing: emotions. Emotions play an important role in human-to-human communication and interaction, allowing people to express themselves beyond the verbal domain. The ability to understand human emotions is desirable for the computer in several applications. This paper explores new ways of human-computer interaction that enable the computer to be more aware of the user's emotional and attentional expressions. We present the basic research in the field and the recent advances in emotion recognition from facial, voice, and physiological signals, where the different modalities are treated independently. We then describe the challenging problem of multimodal emotion recognition and we advocate the use of probabilistic graphical models when fusing the different modalities. We also discuss the difficult issues of obtaining reliable affective data, obtaining ground truth for emotion recognition, and the use of unlabeled data.
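    The simplest probabilistic fusion subsumed by the graphical-model view the authors advocate is naive-Bayes (product) fusion: per-modality class posteriors are combined under an assumption of conditional independence given the emotion. The emotion labels and score values below are hypothetical, chosen only to show the mechanics.

```python
from math import prod

def fuse_modalities(modality_posteriors, prior=None):
    """Combine per-modality class posteriors assuming the modalities are
    conditionally independent given the emotion (the naive-Bayes special
    case of probabilistic graphical-model fusion)."""
    classes = list(modality_posteriors[0].keys())
    prior = prior or {c: 1.0 / len(classes) for c in classes}
    # Posterior ∝ prior × Π (per-modality likelihood ratio).
    scores = {c: prior[c] * prod(m[c] / prior[c] for m in modality_posteriors)
              for c in classes}
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

# Hypothetical per-modality outputs for one observation.
face = {"happy": 0.7, "angry": 0.2, "neutral": 0.1}
voice = {"happy": 0.6, "angry": 0.1, "neutral": 0.3}
print(fuse_modalities([face, voice]))
```

    When both modalities weakly agree, the fused posterior is sharper than either alone; richer graphical models relax the independence assumption that this sketch bakes in.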

  4. Facial expression recognition and emotional regulation in narcolepsy with cataplexy.

    Science.gov (United States)

    Bayard, Sophie; Croisier Langenier, Muriel; Dauvilliers, Yves

    2013-04-01

    Cataplexy is pathognomonic of narcolepsy with cataplexy, and defined by a transient loss of muscle tone triggered by strong emotions. Recent research suggests abnormal amygdala function in narcolepsy with cataplexy. Emotional processing and emotional regulation strategies are complex functions involving cortical and limbic structures, like the amygdala. As the amygdala has been shown to play a role in facial emotion recognition, we tested the hypothesis that patients with narcolepsy with cataplexy would have impaired recognition of facial emotional expressions compared with patients affected with central hypersomnia without cataplexy and healthy controls. We also aimed to determine whether cataplexy modulates emotional regulation strategies. Emotional intensity, arousal and valence ratings on Ekman faces displaying happiness, surprise, fear, anger, disgust, sadness and neutral expressions of 21 drug-free patients with narcolepsy with cataplexy were compared with 23 drug-free sex-, age- and intellectual level-matched adult patients with hypersomnia without cataplexy and 21 healthy controls. All participants underwent polysomnography recording and multiple sleep latency tests, and completed depression, anxiety and emotional regulation questionnaires. Performance of patients with narcolepsy with cataplexy did not differ from patients with hypersomnia without cataplexy or healthy controls on either intensity rating of each emotion on its prototypical label or mean ratings for valence and arousal. Moreover, patients with narcolepsy with cataplexy did not use different emotional regulation strategies. The level of depressive and anxious symptoms in narcolepsy with cataplexy did not differ from the other groups. Our results demonstrate that narcolepsy with cataplexy accurately perceives and discriminates facial emotions, and regulates emotions normally. The absence of alteration of perceived affective valence remains of major clinical interest in narcolepsy with cataplexy.

  5. Biologically inspired emotion recognition from speech

    Directory of Open Access Journals (Sweden)

    Buscicchio Cosimo

    2011-01-01

    Emotion recognition has become a fundamental task in human-computer interaction systems. In this article, we propose an emotion recognition approach based on biologically inspired methods. Specifically, emotion classification is performed using a long short-term memory (LSTM) recurrent neural network, which is able to recognize long-range dependencies between successive temporal patterns. We propose to represent data using features derived from two different models: mel-frequency cepstral coefficients (MFCC) and the Lyon cochlear model. In the experimental phase, results obtained from the LSTM network with the two different feature sets are compared, showing that features derived from the Lyon cochlear model give better recognition results than those obtained with the traditional MFCC representation.
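    The long-range memory the abstract attributes to the LSTM lives in its cell state, which is updated multiplicatively by gates at each time step. A single forward step can be sketched as below; the dimensions are illustrative (e.g. 13 MFCCs per frame) and the weights are random rather than trained.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W: (4H, D) input weights, U: (4H, H) recurrent
    weights, b: (4H,) biases; the four blocks are the i/f/o gates and
    the candidate cell update."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = 1 / (1 + np.exp(-z[:H]))          # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))       # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))     # output gate
    g = np.tanh(z[3*H:])                  # candidate cell state
    c_new = f * c + i * g                 # cell state carries far-back context
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
D, H, T = 13, 8, 20                       # e.g. 13 MFCCs per frame, 20 frames
W = rng.normal(scale=0.1, size=(4 * H, D))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for _ in range(T):                        # run over a sequence of feature frames
    h, c = lstm_step(rng.normal(size=D), h, c, W, U, b)
print(h.shape)  # (8,)
```

    The final hidden state h summarizes the whole sequence and would feed a softmax layer over emotion classes in a full classifier; swapping MFCC frames for Lyon cochlear-model outputs changes only the input x, not this recurrence.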

  6. Facial emotion recognition in adolescents with personality pathology

    NARCIS (Netherlands)

    Berenschot, Fleur; Van Aken, Marcel A G; Hessels, Christel; De Castro, Bram Orobio; Pijl, Ysbrand; Montagne, Barbara; Van Voorst, Guus

    2014-01-01

    It has been argued that a heightened emotional sensitivity interferes with the cognitive processing of facial emotion recognition and may explain the intensified emotional reactions to external emotional stimuli of adults with personality pathology, such as borderline personality disorder (BPD).

  7. A Fuzzy Approach for Facial Emotion Recognition

    Science.gov (United States)

    Gîlcă, Gheorghe; Bîzdoacă, Nicu-George

    2015-09-01

    This article deals with an emotion recognition system based on fuzzy sets. Human faces are detected in images with the Viola-Jones algorithm, and tracked in video sequences using the Camshift algorithm. The detected faces are passed to the fuzzy decision system, which is based on fuzzification of measurements of facial variables: the eyebrows, eyelids and mouth. The system can easily determine the emotional state of a person.
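    The fuzzification-and-rules idea can be sketched with triangular membership functions over normalized facial measurements. The variable names, thresholds, and rules below are illustrative assumptions, not those of the cited system:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def emotion_degrees(mouth_open, eyebrow_raise):
    """Mamdani-style rules with min as fuzzy AND; inputs normalized to [0, 1].
    Illustrative rule base: wide mouth AND raised brows -> surprise, etc."""
    high = lambda v: tri(v, 0.5, 1.0, 1.5)    # membership in 'high'
    low = lambda v: tri(v, -0.5, 0.0, 0.5)    # membership in 'low'
    return {
        "surprise": min(high(mouth_open), high(eyebrow_raise)),
        "neutral": min(low(mouth_open), low(eyebrow_raise)),
    }

d = emotion_degrees(mouth_open=0.9, eyebrow_raise=0.8)
print(max(d, key=d.get))  # surprise
```

    A full system would measure these variables from the tracked face region and carry a larger rule base covering more emotions, but the fuzzification step and min/max rule evaluation look the same.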

  8. Emotion recognition and social skills in child and adolescent offspring of parents with schizophrenia.

    Science.gov (United States)

    Horton, Leslie E; Bridgwater, Miranda A; Haas, Gretchen L

    2017-05-01

    Emotion recognition, a social cognition domain, is impaired in people with schizophrenia and contributes to social dysfunction. Whether impaired emotion recognition emerges as a manifestation of illness or predates symptoms is unclear. Findings from studies of emotion recognition impairments in first-degree relatives of people with schizophrenia are mixed and, to our knowledge, no studies have investigated the link between emotion recognition and social functioning in that population. This study examined facial affect recognition and social skills in 16 offspring of parents with schizophrenia (familial high-risk/FHR) compared to 34 age- and sex-matched healthy controls (HC), ages 7-19. As hypothesised, FHR children exhibited impaired overall accuracy, accuracy in identifying fearful faces, and overall recognition speed relative to controls. Age-adjusted facial affect recognition accuracy scores predicted parent's overall rating of their child's social skills for both groups. This study supports the presence of facial affect recognition deficits in FHR children. Importantly, as the first known study to suggest the presence of these deficits in young, asymptomatic FHR children, it extends findings to a developmental stage predating symptoms. Further, findings point to a relationship between early emotion recognition and social skills. Improved characterisation of deficits in FHR children could inform early intervention.

  9. Disturbed emotion recognition in patients with narcissistic personality disorder.

    Science.gov (United States)

    Marissen, Marlies A E; Deen, Mathijs L; Franken, Ingmar H A

    2012-07-30

    Although a lack of empathy is theoretically a key symptom of narcissistic personality disorder (NPD), empirical studies examining empathy in NPD are scarce. The present study examined whether patients with NPD differ from healthy controls and a psychiatric control group in their empathic abilities. To examine this question, 20 patients with NPD, 20 patients with a personality disorder in the Cluster C spectrum and 20 healthy control participants were presented with a questionnaire and a facial recognition task designed to measure empathic abilities. Patients with NPD did not differ from the two control groups on a self-report questionnaire, indicating that these patients regard themselves as sensitive to the feelings of others. In contrast, NPD patients generally performed worse on a facial emotion recognition task compared to both control groups. In addition to this general deficit in emotion recognition, patients with NPD showed a specific deficit for emotions representing fear and disgust. These results provide the first empirical evidence for impaired emotion recognition in patients with NPD. Crown Copyright © 2012. Published by Elsevier Ireland Ltd. All rights reserved.

  10. Communication Skills Training Exploiting Multimodal Emotion Recognition

    Science.gov (United States)

    Bahreini, Kiavash; Nadolski, Rob; Westera, Wim

    2017-01-01

    The teaching of communication skills is a labour-intensive task because of the detailed feedback that should be given to learners during their prolonged practice. This study investigates to what extent our FILTWAM facial and vocal emotion recognition software can be used for improving a serious game (the Communication Advisor) that delivers a…

  11. Emotion recognition a pattern analysis approach

    CERN Document Server

    Konar, Amit

    2014-01-01

    Offers both foundations and advances on emotion recognition in a single volume.
    Provides a thorough and insightful introduction to the subject by utilizing computational tools of diverse domains.
    Inspires young researchers to prepare themselves for their own research.
    Demonstrates directions of future research through new technologies, such as Microsoft Kinect, EEG systems, etc.

  12. What is the relationship between the recognition of emotions and core beliefs: Associations between the recognition of emotions in facial expressions and the maladaptive schemas in depressed patients.

    Science.gov (United States)

    Csukly, Gábor; Telek, Rita; Filipovits, Dóra; Takács, Barnabás; Unoka, Zsolt; Simon, Lajos

    2011-03-01

    Depressed patients are characterized both by reality-distorting maladaptive schemas and by impairments in facial expression recognition. The aim of the present study was to identify specific associations among symptom severity of depression, early maladaptive schemas and recognition patterns of facially expressed emotions. The subjects were inpatients diagnosed with depression. We used 2 virtual humans presenting the basic emotions to assess emotion recognition. The Symptom Check List 90 (SCL-90) was used as a self-report measure of psychiatric symptoms, and the Beck Depression Inventory (BDI) was applied to assess symptoms of depression. The Young Schema Questionnaire Long Form (YSQ-L) was used to assess the presence of early maladaptive schemas. The recognition rate for happiness showed significant associations with both the BDI and the depression subscale of the SCL-90. After performing a second-order factor analysis of the YSQ-L, we found statistically significant associations between the recognition indices of specific emotions and the main factors of the YSQ-L. In this study we found correlations between maladaptive schemas and emotion recognition impairments. While both domains likely contribute to the symptoms of depression, we believe that the results will help us to better understand the social cognitive deficits of depressed patients at the schema level and at the emotion recognition level. Copyright © 2010 Elsevier Ltd. All rights reserved.

  13. Optimization Methods in Emotion Recognition System

    Directory of Open Access Journals (Sweden)

    L. Povoda

    2016-09-01

    Emotions play a big role in our everyday communication and contain important information. This work describes a novel method of automatic emotion recognition from textual data. The method is based on well-known data mining techniques, a novel approach based on a parallel run of SVM (Support Vector Machine) classifiers, text preprocessing, and 3 optimization methods: sequential elimination of attributes, parameter optimization based on token groups, and a method of extending training data sets during practical testing and final tuning for the production release. We outperformed current state-of-the-art methods, and the results were validated on bigger data sets (3346 manually labelled samples), which is less prone to overfitting compared to related works. The accuracy achieved in this work is 86.89% for recognition of 5 emotional classes. The experiments were performed in a real-world helpdesk environment processing the Czech language, but the proposed methodology is general and can be applied to many different languages.
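The first optimization method named above, sequential elimination of attributes, amounts to backward feature selection: repeatedly drop any attribute whose removal does not hurt accuracy. The sketch below demonstrates the idea on toy data, using a nearest-centroid classifier as a lightweight stand-in for the SVM; the data, the helper names and the greedy acceptance rule (`acc >= best`) are assumptions for illustration, not the paper's exact procedure.

```python
def distance(row, centroid, feats):
    # Squared Euclidean distance restricted to the kept attributes.
    return sum((row[f] - centroid[f]) ** 2 for f in feats)

def centroids(rows, labels):
    out = {}
    for lab in set(labels):
        members = [r for r, l in zip(rows, labels) if l == lab]
        out[lab] = [sum(m[f] for m in members) / len(members)
                    for f in range(len(rows[0]))]
    return out

def accuracy(rows, labels, feats):
    cents = centroids(rows, labels)
    correct = sum(
        min(cents, key=lambda lab: distance(r, cents[lab], feats)) == l
        for r, l in zip(rows, labels))
    return correct / len(rows)

def sequential_elimination(rows, labels):
    feats = set(range(len(rows[0])))
    best = accuracy(rows, labels, feats)
    improved = True
    while improved and len(feats) > 1:
        improved = False
        for f in sorted(feats):
            trial = feats - {f}
            if accuracy(rows, labels, trial) >= best:   # dropping f does not hurt
                feats, best, improved = trial, accuracy(rows, labels, trial), True
                break
    return feats, best

# Toy data: attribute 0 separates the classes; attribute 1 is noise.
rows = [(0.0, 0.9), (0.1, 0.1), (0.2, 0.8), (1.0, 0.2), (0.9, 0.9), (0.8, 0.1)]
labels = ["neg", "neg", "neg", "pos", "pos", "pos"]
kept, score = sequential_elimination(rows, labels)
```

On this toy set the procedure discards the noise attribute and keeps only the informative one, while accuracy stays at its starting level.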

  14. A Multidimensional Approach to the Study of Emotion Recognition in Autism Spectrum Disorders

    Directory of Open Access Journals (Sweden)

    Jean eXavier

    2015-12-01

    Although deficits in emotion recognition have been widely reported in Autism Spectrum Disorder (ASD), experiments have been restricted to either facial or vocal expressions. Here, we explored multimodal emotion processing in children with ASD (N = 19) and with typical development (TD, N = 19), considering unimodal (faces or voices) and multimodal (faces and voices simultaneously) stimuli and developmental comorbidities (neuro-visual, language and motor impairments). Compared to TD controls, children with ASD had rather high and heterogeneous emotion recognition scores but also showed several significant differences: lower emotion recognition scores for visual stimuli and for the neutral emotion, and a greater number of saccades during the visual task. Multivariate analyses showed that: (1) the difficulties they experienced with visual stimuli were partially alleviated by multimodal stimuli; (2) developmental age was significantly associated with emotion recognition in TD children, whereas in children with ASD this was the case only for the multimodal task; (3) language impairments tended to be associated with the emotion recognition scores of ASD children in the auditory modality. Conversely, in the visual or bimodal (visuo-auditory) tasks, no impact of developmental coordination disorder or neuro-visual impairments was found. We conclude that impaired emotion processing constitutes a dimension to explore in the field of ASD, as research has the potential to define more homogeneous subgroups and tailored interventions. However, it is clear that developmental age, the nature of the stimuli, and other developmental comorbidities must also be taken into account when studying this dimension.

  15. Brain correlates of musical and facial emotion recognition: evidence from the dementias.

    Science.gov (United States)

    Hsieh, S; Hornberger, M; Piguet, O; Hodges, J R

    2012-07-01

    The recognition of facial expressions of emotion is impaired in semantic dementia (SD) and is associated with right-sided brain atrophy in areas known to be involved in emotion processing, notably the amygdala. Whether patients with SD also experience difficulty recognizing emotions conveyed by other media, such as music, is unclear. Prior studies have used excerpts of known music from classical or film repertoire but not unfamiliar melodies designed to convey distinct emotions. Patients with SD (n = 11), Alzheimer's disease (n = 12) and healthy control participants (n = 20) underwent tests of emotion recognition in two modalities: unfamiliar musical tunes and unknown faces as well as volumetric MRI. Patients with SD were most impaired with the recognition of facial and musical emotions, particularly for negative emotions. Voxel-based morphometry showed that the labelling of emotions, regardless of modality, correlated with the degree of atrophy in the right temporal pole, amygdala and insula. The recognition of musical (but not facial) emotions was also associated with atrophy of the left anterior and inferior temporal lobe, which overlapped with regions correlating with standardized measures of verbal semantic memory. These findings highlight the common neural substrates supporting the processing of emotions by facial and musical stimuli but also indicate that the recognition of emotions from music draws upon brain regions that are associated with semantics in language. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Facial Emotion Recognition and Expression in Parkinson’s Disease: An Emotional Mirror Mechanism?

    Science.gov (United States)

    Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J.; Kilner, James

    2017-01-01

    Background and aim: Parkinson's disease (PD) patients have impaired facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants, in order to explore the relationship between these two abilities and any differences between the two groups of participants. Methods: Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer screen in pseudo-random fashion and to identify the emotional label in a six-forced-choice response format (emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial, participants rated their confidence in the perceived accuracy of their response. Results: For emotion recognition, PD patients scored lower than HC on the Ekman total score and on the sub-scores for happiness, fear, anger and sadness; in the facial emotion expressivity task, PD and HC differed significantly in the total score (p = 0.05) and in the sub-scores for happiness, sadness and anger. There was a significant positive correlation between emotion facial recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). Conclusions: PD


  18. Negative Emotional Arousal Impairs Associative Memory Performance for Emotionally Neutral Content in Healthy Participants.

    Directory of Open Access Journals (Sweden)

    Jonathan Guez

    The effect of emotional arousal on memory presents a complex pattern, with previous studies reporting conflicting results of both improved and reduced memory performance following arousal manipulations. In this study we further tested the effect of negative emotional arousal (NEA) on individual-item recognition and associative recognition of neutral stimuli in healthy participants, and hypothesized that NEA would particularly impair associative memory performance. The current study consists of two experiments; in both, participants studied a list of word-pairs and were then tested for items (item recognition test) and for associations (associative recognition test). In the first experiment, the arousal manipulation was induced by flashing emotionally negative or neutral pictures between study pairs, while in the second experiment arousal was induced by presenting emotionally negative or neutral pictures between lists. The results of the two experiments converged and supported an associative memory deficit under NEA conditions. We suggest that NEA is associated with an altered ability to bind one stimulus to another as a result of impaired recollection, resulting in poorer associative memory performance. The current study's findings may contribute to the understanding of the mechanism underlying memory impairments reported in disorders associated with traumatic stress.

  19. Emotional face recognition deficits and medication effects in pre-manifest through stage-II Huntington's disease.

    Science.gov (United States)

    Labuschagne, Izelle; Jones, Rebecca; Callaghan, Jenny; Whitehead, Daisy; Dumas, Eve M; Say, Miranda J; Hart, Ellen P; Justo, Damian; Coleman, Allison; Dar Santos, Rachelle C; Frost, Chris; Craufurd, David; Tabrizi, Sarah J; Stout, Julie C

    2013-05-15

    Facial emotion recognition impairments have been reported in Huntington's disease (HD). However, the nature of the impairments across the spectrum of HD remains unclear. We report on emotion recognition data from 344 participants comprising premanifest HD (PreHD) and early HD patients, and controls. In a test of recognition of facial emotions, we examined responses to six basic emotional expressions and neutral expressions. In addition, within the early HD sample, we tested for differences in emotion recognition performance between those 'on' vs. 'off' neuroleptic or selective serotonin reuptake inhibitor (SSRI) medications. The PreHD groups showed significantly impaired recognition of fearful, angry and surprised faces compared to controls, whereas the early HD groups were significantly impaired across all emotions, including neutral expressions. In early HD, neuroleptic use was associated with worse facial emotion recognition, whereas SSRI use was associated with better facial emotion recognition. The findings suggest that emotion recognition impairments exist across the HD spectrum, but are relatively more widespread in manifest HD than in the premanifest period. Commonly prescribed medications to treat HD-related symptoms also appear to affect emotion recognition. These findings have important implications for interpersonal communication and medication usage in HD. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  20. The familial basis of facial emotion recognition deficits in adolescents with conduct disorder and their unaffected relatives.

    Science.gov (United States)

    Sully, K; Sonuga-Barke, E J S; Fairchild, G

    2015-07-01

    There is accumulating evidence of impairments in facial emotion recognition in adolescents with conduct disorder (CD). However, the majority of studies in this area have only been able to demonstrate an association, rather than a causal link, between emotion recognition deficits and CD. To move closer towards understanding the causal pathways linking emotion recognition problems with CD, we studied emotion recognition in the unaffected first-degree relatives of CD probands, as well as those with a diagnosis of CD. Using a family-based design, we investigated facial emotion recognition in probands with CD (n = 43), their unaffected relatives (n = 21), and healthy controls (n = 38). We used the Emotion Hexagon task, an alternative forced-choice task using morphed facial expressions depicting the six primary emotions, to assess facial emotion recognition accuracy. Relative to controls, the CD group showed impaired recognition of anger, fear, happiness, sadness and surprise. These findings suggest that emotion recognition deficits are present in adolescents who are at increased familial risk for developing antisocial behaviour, as well as in those who have already developed CD. Consequently, impaired emotion recognition appears to be a viable familial risk marker or candidate endophenotype for CD.

  1. Association of impaired facial affect recognition with basic facial and visual processing deficits in schizophrenia.

    Science.gov (United States)

    Norton, Daniel; McBain, Ryan; Holt, Daphne J; Ongur, Dost; Chen, Yue

    2009-06-15

    Impaired emotion recognition has been reported in schizophrenia, yet the nature of this impairment is not completely understood. Recognition of facial emotion depends on processing affective and nonaffective facial signals, as well as basic visual attributes. We examined whether and how poor facial emotion recognition in schizophrenia is related to basic visual processing and nonaffective face recognition. Schizophrenia patients (n = 32) and healthy control subjects (n = 29) performed emotion discrimination, identity discrimination, and visual contrast detection tasks, where the emotionality, distinctiveness of identity, or visual contrast was systematically manipulated. Subjects determined which of two presentations in a trial contained the target: the emotional face for emotion discrimination, a specific individual for identity discrimination, and a sinusoidal grating for contrast detection. Patients had significantly higher thresholds (worse performance) than control subjects for discriminating both fearful and happy faces. Furthermore, patients' poor performance in fear discrimination was predicted by performance in visual detection and face identity discrimination. Schizophrenia patients require greater emotional signal strength to discriminate fearful or happy face images from neutral ones. Deficient emotion recognition in schizophrenia does not appear to be determined solely by affective processing but is also linked to the processing of basic visual and facial information.

  2. Audio-based deep music emotion recognition

    Science.gov (United States)

    Liu, Tong; Han, Li; Ma, Liangkai; Guo, Dongwei

    2018-05-01

    With the rapid development of multimedia networking, more and more songs are issued through the Internet and stored in large digital music libraries. However, music information retrieval on these libraries can be really hard, and the recognition of musical emotion is especially challenging. In this paper, we report a strategy to recognize the emotion contained in songs by classifying their spectrograms, which contain both the time and frequency information, with a convolutional neural network (CNN). The experiments conducted on the 1000-song dataset indicate that the proposed model outperforms traditional machine learning methods.
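A spectrogram of the kind fed to such a CNN is built by slicing the waveform into overlapping frames and taking the magnitude of a discrete Fourier transform of each frame. The sketch below is a naive, dependency-free illustration; the frame length, hop size and test tone are arbitrary choices, and real systems use an FFT, a window function, and usually a log-mel scale.

```python
import math

def spectrogram(signal, frame_len, hop):
    # Magnitude spectrogram via a naive DFT over overlapping frames.
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        mags = []
        for k in range(frame_len // 2):   # keep non-redundant bins only
            re = sum(frame[n] * math.cos(2 * math.pi * k * n / frame_len)
                     for n in range(frame_len))
            im = -sum(frame[n] * math.sin(2 * math.pi * k * n / frame_len)
                      for n in range(frame_len))
            mags.append(math.hypot(re, im))
        frames.append(mags)
    return frames

# A pure tone at 4 cycles per 32-sample frame should concentrate in bin 4.
tone = [math.sin(2 * math.pi * 4 * n / 32) for n in range(64)]
spec = spectrogram(tone, frame_len=32, hop=16)
```

The resulting time-frequency grid (frames on one axis, frequency bins on the other) is exactly the 2-D structure a CNN's convolutional filters slide over.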

  3. Stereotypes and prejudice affect the recognition of emotional body postures.

    Science.gov (United States)

    Bijlstra, Gijsbert; Holland, Rob W; Dotsch, Ron; Wigboldus, Daniel H J

    2018-03-26

    Most research on emotion recognition focuses on facial expressions. However, people communicate emotional information through bodily cues as well. Prior research on facial expressions has demonstrated that emotion recognition is modulated by top-down processes. Here, we tested whether this top-down modulation generalizes to the recognition of emotions from body postures. We report three studies demonstrating that stereotypes and prejudice about men and women may affect how fast people classify various emotional body postures. Our results suggest that gender cues activate gender associations, which affect the recognition of emotions from body postures in a top-down fashion. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. Emotion recognition in girls with conduct problems.

    Science.gov (United States)

    Schwenck, Christina; Gensthaler, Angelika; Romanos, Marcel; Freitag, Christine M; Schneider, Wolfgang; Taurines, Regina

    2014-01-01

    A deficit in emotion recognition has been suggested to underlie conduct problems. Although several studies have been conducted on this topic so far, most concentrated on male participants. The aim of the current study was to compare recognition of morphed emotional faces in girls with conduct problems (CP) with elevated or low callous-unemotional (CU+ vs. CU-) traits and a matched healthy developing control group (CG). Sixteen girls with CP-CU+, 16 girls with CP-CU- and 32 controls (mean age: 13.23 years, SD=2.33 years) were included. Video clips with morphed faces were presented in two runs to assess emotion recognition. Multivariate analysis of variance with the factors group and run was performed. Girls with CP-CU- needed more time than the CG to encode sad, fearful, and happy faces and they correctly identified sadness less often. Girls with CP-CU+ outperformed the other groups in the identification of fear. Learning effects throughout runs were the same for all groups except that girls with CP-CU- correctly identified fear less often in the second run compared to the first run. Results need to be replicated with comparable tasks, which might result in subgroup-specific therapeutic recommendations.

  5. Emotional recognition from dynamic facial, vocal and musical expressions following traumatic brain injury.

    Science.gov (United States)

    Drapeau, Joanie; Gosselin, Nathalie; Peretz, Isabelle; McKerral, Michelle

    2017-01-01

    To assess emotion recognition from dynamic facial, vocal and musical expressions in sub-groups of adults with traumatic brain injuries (TBI) of different severities and identify possible common underlying mechanisms across domains. Forty-one adults participated in this study: 10 with moderate-severe TBI, nine with complicated mild TBI, 11 with uncomplicated mild TBI and 11 healthy controls, who were administered experimental (emotional recognition, valence-arousal) and control tasks (emotional and structural discrimination) for each domain. Recognition of fearful faces was significantly impaired in moderate-severe and in complicated mild TBI sub-groups, as compared to those with uncomplicated mild TBI and controls. Effect sizes were medium-large. Participants with lower GCS scores performed more poorly when recognizing fearful dynamic facial expressions. Emotion recognition from auditory domains was preserved following TBI, irrespective of severity. All groups performed equally on control tasks, indicating no perceptual disorders. Although emotional recognition from vocal and musical expressions was preserved, no correlation was found across auditory domains. This preliminary study may contribute to improving comprehension of emotional recognition following TBI. Future studies of larger samples could usefully include measures of functional impacts of recognition deficits for fearful facial expressions. These could help refine interventions for emotional recognition following a brain injury.

  6. Theory of mind and its relationship with executive functions and emotion recognition in borderline personality disorder.

    Science.gov (United States)

    Baez, Sandra; Marengo, Juan; Perez, Ana; Huepe, David; Font, Fernanda Giralt; Rial, Veronica; Gonzalez-Gadea, María Luz; Manes, Facundo; Ibanez, Agustin

    2015-09-01

    Impaired social cognition has been claimed to be a mechanism underlying the development and maintenance of borderline personality disorder (BPD). One important aspect of social cognition is the theory of mind (ToM), a complex skill that seems to be influenced by more basic processes, such as executive functions (EF) and emotion recognition. Previous ToM studies in BPD have yielded inconsistent results. This study assessed the performance of BPD adults on ToM, emotion recognition, and EF tasks. We also examined whether EF and emotion recognition could predict the performance on ToM tasks. We evaluated 15 adults with BPD and 15 matched healthy controls using different tasks of EF, emotion recognition, and ToM. The results showed that BPD adults exhibited deficits in the three domains, which seem to be task-dependent. Furthermore, we found that EF and emotion recognition predicted the performance on ToM. Our results suggest that tasks that involve real-life social scenarios and contextual cues are more sensitive to detect ToM and emotion recognition deficits in BPD individuals. Our findings also indicate that (a) ToM variability in BPD is partially explained by individual differences on EF and emotion recognition; and (b) ToM deficits of BPD patients are partially explained by the capacity to integrate cues from face, prosody, gesture, and social context to identify the emotions and others' beliefs. © 2014 The British Psychological Society.

  7. Recognition of emotional facial expressions in adolescents with anorexia nervosa and adolescents with major depression.

    Science.gov (United States)

    Sfärlea, Anca; Greimel, Ellen; Platt, Belinda; Dieler, Alica C; Schulte-Körne, Gerd

    2018-04-01

    Anorexia nervosa (AN) has been suggested to be associated with abnormalities in facial emotion recognition. Most prior studies on facial emotion recognition in AN have investigated adult samples, despite the onset of AN being particularly often during adolescence. In addition, few studies have examined whether impairments in facial emotion recognition are specific to AN or might be explained by frequent comorbid conditions that are also associated with deficits in emotion recognition, such as depression. The present study addressed these gaps by investigating recognition of emotional facial expressions in adolescent girls with AN (n = 26) compared to girls with major depression (MD; n = 26) and healthy girls (HC; n = 37). Participants completed one task requiring identification of emotions (happy, sad, afraid, angry, neutral) in faces and two control tasks. Neither of the clinical groups showed impairments. The AN group was more accurate than the HC group in recognising afraid facial expressions and more accurate than the MD group in recognising happy, sad, and afraid expressions. Misclassification analyses identified subtle group differences in the types of errors made. The results suggest that the deficits in facial emotion recognition found in adult AN samples are not present in adolescent patients. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Theory of mind and emotion recognition skills in children with specific language impairment, autism spectrum disorder and typical development: group differences and connection to knowledge of grammatical morphology, word-finding abilities and verbal working memory.

    Science.gov (United States)

    Loukusa, Soile; Mäkinen, Leena; Kuusikko-Gauffin, Sanna; Ebeling, Hanna; Moilanen, Irma

    2014-01-01

    Social perception skills, such as understanding the mind and emotions of others, affect children's communication abilities in real-life situations. In addition to autism spectrum disorder (ASD), there is growing evidence that children with specific language impairment (SLI) also demonstrate difficulties in their social perception abilities. The aims were to compare the performance of children with SLI, ASD and typical development (TD) on social perception tasks measuring Theory of Mind (ToM) and emotion recognition, and to evaluate the association between social perception tasks and language tests measuring word-finding abilities, knowledge of grammatical morphology and verbal working memory. Children with SLI (n = 18), ASD (n = 14) and TD (n = 25) completed two NEPSY-II subtests measuring social perception abilities: (1) Affect Recognition and (2) ToM (which includes Verbal and non-verbal Contextual tasks). In addition, children's word-finding abilities were measured with the TWF-2, grammatical morphology with the Grammatical Closure subtest of the ITPA, and verbal working memory with the Sentence Repetition or Word List Interference subtests of the NEPSY-II (chosen according to the child's age). Children with ASD scored significantly lower than children with SLI or TD on the NEPSY-II Affect Recognition subtest. Both the SLI and ASD groups scored significantly lower than TD children on the Verbal tasks of the ToM subtest. However, there were no significant group differences on the non-verbal Contextual tasks of the ToM subtest. The Verbal tasks of the ToM subtest correlated with the Grammatical Closure subtest and the TWF-2 in children with SLI. In children with ASD, the correlation between the TWF-2 and the ToM Verbal tasks was moderate and fell just short of statistical significance, but no other correlations were found. Both the SLI and ASD groups showed difficulties on tasks measuring verbal ToM, but no differences were found on tasks measuring non-verbal Contextual ToM.

  9. Social Approach and Emotion Recognition in Fragile X Syndrome

    Science.gov (United States)

    Williams, Tracey A.; Porter, Melanie A.; Langdon, Robyn

    2014-01-01

    Evidence is emerging that individuals with Fragile X syndrome (FXS) display emotion recognition deficits, which may contribute to their significant social difficulties. The current study investigated the emotion recognition abilities, and social approachability judgments, of FXS individuals when processing emotional stimuli. Relative to…

  10. Body Emotion Recognition Disproportionately Depends on Vertical Orientations during Childhood

    Science.gov (United States)

    Balas, Benjamin; Auen, Amanda; Saville, Alyson; Schmidt, Jamie

    2018-01-01

    Children's ability to recognize emotional expressions from faces and bodies develops during childhood. However, the low-level features that support accurate body emotion recognition during development have not been well characterized. This is in marked contrast to facial emotion recognition, which is known to depend upon specific spatial frequency…

  11. The effects of a distracting N-back task on recognition memory are reduced by negative emotional intensity.

    Directory of Open Access Journals (Sweden)

    Luciano G Buratto

    Full Text Available Memory performance is usually impaired when participants have to encode information while performing a concurrent task. Recent studies using recall tasks have found that emotional items are more resistant to such cognitive depletion effects than non-emotional items. However, when recognition tasks are used, the same effect is more elusive as recent recognition studies have obtained contradictory results. In two experiments, we provide evidence that negative emotional content can reliably reduce the effects of cognitive depletion on recognition memory only if stimuli with high levels of emotional intensity are used. In particular, we found that recognition performance for realistic pictures was impaired by a secondary 3-back working memory task during encoding if stimuli were emotionally neutral or had moderate levels of negative emotionality. In contrast, when negative pictures with high levels of emotional intensity were used, the detrimental effects of the secondary task were significantly attenuated.

  12. The Effects of a Distracting N-Back Task on Recognition Memory Are Reduced by Negative Emotional Intensity

    Science.gov (United States)

    Buratto, Luciano G.; Pottage, Claire L.; Brown, Charity; Morrison, Catriona M.; Schaefer, Alexandre

    2014-01-01

    Memory performance is usually impaired when participants have to encode information while performing a concurrent task. Recent studies using recall tasks have found that emotional items are more resistant to such cognitive depletion effects than non-emotional items. However, when recognition tasks are used, the same effect is more elusive as recent recognition studies have obtained contradictory results. In two experiments, we provide evidence that negative emotional content can reliably reduce the effects of cognitive depletion on recognition memory only if stimuli with high levels of emotional intensity are used. In particular, we found that recognition performance for realistic pictures was impaired by a secondary 3-back working memory task during encoding if stimuli were emotionally neutral or had moderate levels of negative emotionality. In contrast, when negative pictures with high levels of emotional intensity were used, the detrimental effects of the secondary task were significantly attenuated. PMID:25330251

  13. Emotional recognition in depressed epilepsy patients.

    Science.gov (United States)

    Brand, Jesse G; Burton, Leslie A; Schaffer, Sarah G; Alper, Kenneth R; Devinsky, Orrin; Barr, William B

    2009-07-01

    The current study examined the relationship between emotional recognition and depression using the Minnesota Multiphasic Personality Inventory, Second Edition (MMPI-2), in a population with epilepsy. Participants were a mixture of surgical candidates and patients receiving neuropsychological testing as part of a comprehensive evaluation. Results suggested that patients with epilepsy reporting increased levels of depression (Scale D) performed better on an index of simple facial recognition than patients reporting low levels of depression, whereas depression was associated with poor prosody discrimination. Notably, more than half of the present sample had significantly elevated Scale D scores. The potential effects of a mood-congruent bias and implications for social functioning in depressed patients with epilepsy are discussed.

  14. Neutral and emotional episodic memory: global impairment after lorazepam or scopolamine.

    Science.gov (United States)

    Kamboj, Sunjeev K; Curran, H Valerie

    2006-11-01

    Benzodiazepines and anticholinergic drugs have repeatedly been shown to impair episodic memory for emotionally neutral material in humans. However, their effect on memory for emotionally laden stimuli has been relatively neglected. We sought to investigate the effects of the benzodiazepine, lorazepam, and the anticholinergic, scopolamine, on incidental episodic memory for neutral and emotional components of a narrative memory task in humans. A double-blind, placebo-controlled independent group design was used with 48 healthy volunteers to examine the effects of these drugs on emotional and neutral episodic memory. As expected, the emotional memory advantage was retained for recall and recognition memory under placebo conditions. However, lorazepam and scopolamine produced anterograde recognition memory impairments on both the neutral and emotional components of the narrative, although floor effects were obtained for recall memory. Furthermore, compared with placebo, recognition memory for both central (gist) and peripheral (detail) aspects of neutral and emotional elements of the narrative was poorer after either drug. Benzodiazepine-induced GABAergic enhancement or scopolamine-induced cholinergic hypofunction results in a loss of the enhancing effect of emotional arousal on memory. Furthermore, lorazepam- and scopolamine-induced memory impairment for both gist (which is amygdala dependent) and detail raises the possibility that their effects on emotional memory do not depend only on the amygdala. We discuss the results with reference to potential clinical/forensic implications of processing emotional memories under conditions of globally impaired episodic memory.

  15. Prosody recognition and audiovisual emotion matching in schizophrenia: the contribution of cognition and psychopathology.

    Science.gov (United States)

    Castagna, Filomena; Montemagni, Cristiana; Maria Milani, Anna; Rocca, Giuseppe; Rocca, Paola; Casacchia, Massimo; Bogetto, Filippo

    2013-02-28

    This study aimed to evaluate the ability to decode emotion in the auditory and audiovisual modality in a group of patients with schizophrenia, and to explore the role of cognition and psychopathology in affecting these emotion recognition abilities. Ninety-four outpatients in a stable phase and 51 healthy subjects were recruited. Patients were assessed through a psychiatric evaluation and a wide neuropsychological battery. All subjects completed the Comprehensive Affect Testing System (CATS), a group of computerized tests designed to evaluate emotion perception abilities. With respect to the controls, patients were not impaired in the CATS tasks involving discrimination of nonemotional prosody, naming of emotional stimuli expressed by voice and judging the emotional content of a sentence, whereas they showed a specific impairment in decoding emotion in a conflicting auditory condition and in the multichannel modality. Prosody impairment was affected by executive functions, attention and negative symptoms, while the deficit in multisensory emotion recognition was affected by executive functions and negative symptoms. These emotion recognition deficits, rather than being associated purely with emotion perception disturbances in schizophrenia, are affected by core symptoms of the illness. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  16. Emotion recognition pattern in adolescent boys with attention-deficit/hyperactivity disorder.

    Science.gov (United States)

    Aspan, Nikoletta; Bozsik, Csilla; Gadoros, Julia; Nagy, Peter; Inantsy-Pap, Judit; Vida, Peter; Halasz, Jozsef

    2014-01-01

    Social and emotional deficits were recently considered as inherent features of individuals with attention-deficit hyperactivity disorder (ADHD), but only sporadic literature data exist on emotion recognition in adolescents with ADHD. The aim of the present study was to establish the emotion recognition profile of adolescent boys with ADHD in comparison with control adolescents. Forty-four adolescent boys (13-16 years) participated in the study after informed consent; 22 boys had a clinical diagnosis of ADHD, while data were also assessed from 22 adolescent control boys matched for age and Raven IQ. Parent- and self-reported behavioral characteristics were assessed by means of the Strengths and Difficulties Questionnaire. The recognition of six basic emotions was evaluated with the "Facial Expressions of Emotion-Stimuli and Tests." Compared to controls, adolescents with ADHD were more sensitive in recognizing disgust, worse in recognizing fear, and showed a tendency toward impaired recognition of sadness. Hyperactivity measures showed an inverse correlation with fear recognition. Our data suggest that adolescent boys with ADHD have alterations in the recognition of specific emotions.

  17. Facial emotion recognition in patients with focal and diffuse axonal injury.

    Science.gov (United States)

    Yassin, Walid; Callahan, Brandy L; Ubukata, Shiho; Sugihara, Genichi; Murai, Toshiya; Ueda, Keita

    2017-01-01

    Facial emotion recognition impairment has been well documented in patients with traumatic brain injury. Studies exploring the neural substrates involved in such deficits have implicated specific grey matter structures (e.g. orbitofrontal regions), as well as diffuse white matter damage. Our study aims to clarify whether different types of injuries (i.e. focal vs. diffuse) will lead to different types of impairments on facial emotion recognition tasks, as no study has directly compared these patients. The present study examined performance and response patterns on a facial emotion recognition task in 14 participants with diffuse axonal injury (DAI), 14 with focal injury (FI) and 22 healthy controls. We found that, overall, participants with FI and DAI performed more poorly than controls on the facial emotion recognition task. Further, we observed comparable emotion recognition performance in participants with FI and DAI, despite differences in the nature and distribution of their lesions. However, the rating response pattern between the patient groups was different. This is the first study to show that pure DAI, without gross focal lesions, can independently lead to facial emotion recognition deficits and that rating patterns differ depending on the type and location of trauma.

  18. The structural neuroanatomy of music emotion recognition: evidence from frontotemporal lobar degeneration.

    Science.gov (United States)

    Omar, Rohani; Henley, Susie M D; Bartlett, Jonathan W; Hailstone, Julia C; Gordon, Elizabeth; Sauter, Disa A; Frost, Chris; Scott, Sophie K; Warren, Jason D

    2011-06-01

    Despite growing clinical and neurobiological interest in the brain mechanisms that process emotion in music, these mechanisms remain incompletely understood. Patients with frontotemporal lobar degeneration (FTLD) frequently exhibit clinical syndromes that illustrate the effects of breakdown in emotional and social functioning. Here we investigated the neuroanatomical substrate for recognition of musical emotion in a cohort of 26 patients with FTLD (16 with behavioural variant frontotemporal dementia, bvFTD, 10 with semantic dementia, SemD) using voxel-based morphometry. On neuropsychological evaluation, patients with FTLD showed deficient recognition of canonical emotions (happiness, sadness, anger and fear) from music as well as faces and voices compared with healthy control subjects. Impaired recognition of emotions from music was specifically associated with grey matter loss in a distributed cerebral network including insula, orbitofrontal cortex, anterior cingulate and medial prefrontal cortex, anterior temporal and more posterior temporal and parietal cortices, amygdala and the subcortical mesolimbic system. This network constitutes an essential brain substrate for recognition of musical emotion that overlaps with brain regions previously implicated in coding emotional value, behavioural context, conceptual knowledge and theory of mind. Musical emotion recognition may probe the interface of these processes, delineating a profile of brain damage that is essential for the abstraction of complex social emotions. Copyright © 2011 Elsevier Inc. All rights reserved.

  19. Cocaine users manifest impaired prosodic and cross-modal emotion processing

    Directory of Open Access Journals (Sweden)

    Lea M Hulka

    2013-09-01

    Full Text Available Background: A small number of previous studies have provided evidence that cocaine users exhibit impairments in complex social cognition tasks, while the more basic facial emotion recognition is widely unaffected. However, prosody and cross-modal emotion processing have not been systematically investigated in cocaine users so far. Therefore, the aim of the present study was to assess complex multisensory emotion processing in cocaine users in comparison to controls and to examine a potential association with drug use patterns. Method: The abbreviated version of the Comprehensive Affect Testing System (CATS-A) was used to measure emotion perception across the three channels of facial affect, prosody, and semantic content in 58 cocaine users and 48 healthy control subjects who were matched for age, sex, verbal intelligence, and years of education. Results: Cocaine users had significantly lower scores than controls on the quotient scales of Emotion Recognition and Prosody Recognition and on the subtests Conflicting Prosody/Meaning – Attend to Prosody and Match Emotional Prosody to Emotional Face, which require attending to prosody or integrating cross-modal information. In contrast, no group difference emerged for the Affect Recognition Quotient. Cumulative cocaine doses and duration of cocaine use correlated negatively with emotion processing. Conclusion: Cocaine users show impaired cross-modal integration of different emotion processing channels, particularly with regard to prosody, whereas more basic aspects of emotion processing such as facial affect perception are comparable to the performance of healthy controls.

  20. Facial emotion recognition in Williams syndrome and Down syndrome: A matching and developmental study.

    Science.gov (United States)

    Martínez-Castilla, Pastora; Burt, Michael; Borgatti, Renato; Gagliardi, Chiara

    2015-01-01

    In this study both the matching and developmental trajectories approaches were used to clarify questions that remain open in the literature on facial emotion recognition in Williams syndrome (WS) and Down syndrome (DS). The matching approach showed that individuals with WS or DS exhibit neither proficiency for the expression of happiness nor specific impairments for negative emotions. Instead, they present the same pattern of emotion recognition as typically developing (TD) individuals. Thus, the better performance on the recognition of positive compared to negative emotions usually reported in WS and DS is not specific of these populations but seems to represent a typical pattern. Prior studies based on the matching approach suggested that the development of facial emotion recognition is delayed in WS and atypical in DS. Nevertheless, and even though performance levels were lower in DS than in WS, the developmental trajectories approach used in this study evidenced that not only individuals with DS but also those with WS present atypical development in facial emotion recognition. Unlike in the TD participants, where developmental changes were observed along with age, in the WS and DS groups, the development of facial emotion recognition was static. Both individuals with WS and those with DS reached an early maximum developmental level due to cognitive constraints.

  1. Preserved Affective Sharing But Impaired Decoding of Contextual Complex Emotions in Alcohol Dependence.

    Science.gov (United States)

    Grynberg, Delphine; Maurage, Pierre; Nandrino, Jean-Louis

    2017-04-01

    Prior research has repeatedly shown that alcohol dependence is associated with a large range of impairments in psychological processes, which could lead to interpersonal deficits. Specifically, it has been suggested that these interpersonal difficulties are underpinned by reduced recognition and sharing of others' emotional states. However, this pattern of deficits remains to be clarified. This study thus aimed to investigate whether alcohol dependence is associated with impaired abilities in decoding contextual complex emotions and with altered sharing of others' emotions. Forty-one alcohol-dependent individuals (ADI) and 37 matched healthy individuals completed the Multifaceted Empathy Test, in which they were instructed to identify complex emotional states expressed by individuals in contextual scenes and to state to what extent they shared them. Compared to healthy individuals, ADI were impaired in identifying negative (Cohen's d = 0.75) and positive (Cohen's d = 0.46) emotional states but, conversely, presented preserved abilities in sharing others' emotional states. This study shows that alcohol dependence is characterized by an impaired ability to decode complex emotional states (both positive and negative), despite the presence of complementary contextual cues, but by preserved emotion-sharing. Therefore, these results extend earlier data describing an impaired ability to decode noncontextualized emotions toward contextualized and ecologically valid emotional states. They also indicate that some essential emotional competences such as emotion-sharing are preserved in alcohol dependence, thereby offering potential therapeutic levers. Copyright © 2017 by the Research Society on Alcoholism.
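    The record above quantifies the decoding deficits with Cohen's d (0.75 and 0.46). As a reminder of how that effect size is computed for two independent groups, here is a minimal sketch using the pooled-standard-deviation formulation; the function name and the toy scores are illustrative only, not data from the study.

    ```python
    import math

    def cohens_d(group1, group2):
        """Cohen's d for two independent groups, pooled standard deviation."""
        n1, n2 = len(group1), len(group2)
        m1 = sum(group1) / n1
        m2 = sum(group2) / n2
        # unbiased sample variances
        v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
        v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
        # pooled SD weights each variance by its degrees of freedom
        pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
        return (m1 - m2) / pooled

    # toy example: control scores vs patient scores on some decoding task
    d = cohens_d([10, 12, 11, 13, 12], [8, 9, 10, 9, 8])  # → 2.8
    ```

    By the usual rule of thumb, d ≈ 0.2 is a small effect, 0.5 medium, and 0.8 large, so the reported 0.75 for negative emotions approaches a large effect.
    
    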

  2. A new selective developmental deficit: Impaired object recognition with normal face recognition.

    Science.gov (United States)

    Germine, Laura; Cashdollar, Nathan; Düzel, Emrah; Duchaine, Bradley

    2011-05-01

    Studies of developmental deficits in face recognition, or developmental prosopagnosia, have shown that individuals who have not suffered brain damage can show face recognition impairments coupled with normal object recognition (Duchaine and Nakayama, 2005; Duchaine et al., 2006; Nunn et al., 2001). However, no developmental cases with the opposite dissociation - normal face recognition with impaired object recognition - have been reported. The existence of a case of non-face developmental visual agnosia would indicate that the development of normal face recognition mechanisms does not rely on the development of normal object recognition mechanisms. To see whether a developmental variant of non-face visual object agnosia exists, we conducted a series of web-based object and face recognition tests to screen for individuals showing object recognition memory impairments but not face recognition impairments. Through this screening process, we identified AW, an otherwise normal 19-year-old female, who was then tested in the lab on face and object recognition tests. AW's performance was impaired in within-class visual recognition memory across six different visual categories (guns, horses, scenes, tools, doors, and cars). In contrast, she scored normally on seven tests of face recognition, tests of memory for two other object categories (houses and glasses), and tests of recall memory for visual shapes. Testing confirmed that her impairment was not related to a general deficit in lower-level perception, object perception, basic-level recognition, or memory. AW's results provide the first neuropsychological evidence that recognition memory for non-face visual object categories can be selectively impaired in individuals without brain damage or other memory impairment. These results indicate that the development of recognition memory for faces does not depend on intact object recognition memory and provide further evidence for category-specific dissociations in visual

  3. Ventromedial prefrontal cortex mediates visual attention during facial emotion recognition.

    Science.gov (United States)

    Wolf, Richard C; Philippi, Carissa L; Motzkin, Julian C; Baskaya, Mustafa K; Koenigs, Michael

    2014-06-01

    The ventromedial prefrontal cortex is known to play a crucial role in regulating human social and emotional behaviour, yet the precise mechanisms by which it subserves this broad function remain unclear. Whereas previous neuropsychological studies have largely focused on the role of the ventromedial prefrontal cortex in higher-order deliberative processes related to valuation and decision-making, here we test whether the ventromedial prefrontal cortex may also be critical for more basic aspects of orienting attention to socially and emotionally meaningful stimuli. Using eye tracking during a test of facial emotion recognition in a sample of lesion patients, we show that bilateral ventromedial prefrontal cortex damage impairs visual attention to the eye regions of faces, particularly for fearful faces. This finding demonstrates a heretofore unrecognized function of the ventromedial prefrontal cortex: the basic attentional process of controlling eye movements to faces expressing emotion. © The Author (2014). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Emotional availability, understanding emotions, and recognition of facial emotions in obese mothers with young children.

    Science.gov (United States)

    Bergmann, Sarah; von Klitzing, Kai; Keitel-Korndörfer, Anja; Wendt, Verena; Grube, Matthias; Herpertz, Sarah; Schütz, Astrid; Klein, Annette M

    2016-01-01

    Recent research has identified mother-child relationships of low quality as possible risk factors for childhood obesity. However, it remains open how mothers' own obesity influences the quality of mother-child interaction, and particularly emotional availability (EA). Also unclear is the influence of maternal emotional competencies, i.e. understanding emotions and recognizing facial emotions. This study aimed to (1) investigate differences between obese and normal-weight mothers regarding mother-child EA, maternal understanding emotions and recognition of facial emotions, and (2) explore how maternal emotional competencies and maternal weight interact with each other in predicting EA. A better understanding of these associations could inform strategies of obesity prevention especially in children at risk. We assessed EA, understanding emotions and recognition of facial emotions in 73 obese versus 73 normal-weight mothers, and their children aged 6 to 47 months (M age = 24.49 months, 80 female). Obese mothers showed lower EA and understanding emotions. Mothers' normal weight and their ability to understand emotions were positively associated with EA. The ability to recognize facial emotions was positively associated with EA in obese but not in normal-weight mothers. Maternal weight status indirectly influenced EA through its effect on understanding emotions. Maternal emotional competencies may play an important role for establishing high EA in interaction with the child. Children of obese mothers experience lower EA, which may contribute to overweight development. We suggest including elements that aim to improve maternal emotional competencies and mother-child EA in prevention or intervention programmes targeting childhood obesity. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Eye-Gaze Analysis of Facial Emotion Recognition and Expression in Adolescents with ASD.

    Science.gov (United States)

    Wieckowski, Andrea Trubanova; White, Susan W

    2017-01-01

    Impaired emotion recognition and expression in individuals with autism spectrum disorder (ASD) may contribute to observed social impairment. The aim of this study was to examine the role of visual attention directed toward nonsocial aspects of a scene as a possible mechanism underlying recognition and expressive ability deficiency in ASD. One recognition and two expression tasks were administered. Recognition was assessed in a forced-choice paradigm, and expression was assessed during scripted and free-choice response (in response to emotional stimuli) tasks in youth with ASD (n = 20) and an age-matched sample of typically developing youth (n = 20). During stimulus presentation prior to response in each task, participants' eye gaze was tracked. Youth with ASD were less accurate at identifying disgust and sadness in the recognition task. They fixated less on the eye region of stimuli showing surprise. A group difference was found during the free-choice response task, such that those with ASD expressed emotion less clearly, but not during the scripted task. Results suggest altered eye gaze to the mouth region, but not the eye region, as a candidate mechanism for decreased ability to recognize or express emotion. Findings inform our understanding of the association between social attention and emotion recognition and expression deficits.

  6. Comparison of emotion recognition from facial expression and music.

    Science.gov (United States)

    Gaspar, Tina; Labor, Marina; Jurić, Iva; Dumancić, Dijana; Ilakovac, Vesna; Heffer, Marija

    2011-01-01

    The recognition of basic emotions in everyday communication involves interpretation of different visual and auditory cues. The ability to recognize emotions is not clearly determined, as their presentation is usually very short (micro-expressions) and the recognition itself does not have to be a conscious process. We assumed that recognition from facial expressions is favoured over recognition of emotions communicated through music. To compare the success rates in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey that included 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized when presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for because of the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share some general cognitive skills such as attention, memory and motivation. Music pieces are probably processed differently in the brain than facial expressions and, consequently, evaluated differently as relevant emotional cues.

  7. Emotion recognition in body dysmorphic disorder: application of the Reading the Mind in the Eyes Task.

    Science.gov (United States)

    Buhlmann, Ulrike; Winter, Anna; Kathmann, Norbert

    2013-03-01

    Body dysmorphic disorder (BDD) is characterized by perceived appearance-related defects, often tied to aspects of the face or head (e.g., acne). Deficits in decoding emotional expressions have been examined in several psychological disorders including BDD. Previous research indicates that BDD is associated with impaired facial emotion recognition, particularly in situations that involve the BDD sufferer him/herself. The purpose of this study was to further evaluate the ability to read other people's emotions among 31 individuals with BDD, and 31 mentally healthy controls. We applied the Reading the Mind in the Eyes task, in which participants are presented with a series of pairs of eyes, one at a time, and are asked to identify the emotion that describes the stimulus best. The groups did not differ with respect to decoding other people's emotions by looking into their eyes. Findings are discussed in light of previous research examining emotion recognition in BDD. Copyright © 2013. Published by Elsevier Ltd.

  8. Recognition of Emotion by Chinese and Australian Children.

    Science.gov (United States)

    Markham, Roslyn; Wang, Lei

    1996-01-01

    Compared the recognition of emotion from facial expression by 72 Chinese and 72 Australian children using photographs of Chinese and Caucasian faces. Results provide some evidence for an ethnic bias effect in emotion recognition and demonstrate an increase in overall accuracy with age. Cultural differences are discussed. (SLD)

  9. Influences on Facial Emotion Recognition in Deaf Children

    Science.gov (United States)

    Sidera, Francesc; Amadó, Anna; Martínez, Laura

    2017-01-01

    This exploratory research is aimed at studying facial emotion recognition abilities in deaf children and how they relate to linguistic skills and the characteristics of deafness. A total of 166 participants (75 deaf) aged 3-8 years were administered the following tasks: facial emotion recognition, naming vocabulary and cognitive ability. The…

  10. Neuroticism and facial emotion recognition in healthy adults

    NARCIS (Netherlands)

    Andric, Sanja; Maric, Nadja P.; Knezevic, Goran; Mihaljevic, Marina; Mirjanic, Tijana; Velthorst, Eva; van Os, Jim

    2016-01-01

    The aim of the present study was to examine whether healthy individuals with higher levels of neuroticism, a robust independent predictor of psychopathology, exhibit altered facial emotion recognition performance. Facial emotion recognition accuracy was investigated in 104 healthy adults using the

  11. Impaired Odor Recognition Memory in Patients with Hippocampal Lesions

    Science.gov (United States)

    Levy, Daniel A.; Squire, Larry R.; Hopkins, Ramona O.

    2004-01-01

    In humans, impaired recognition memory following lesions thought to be limited to the hippocampal region has been demonstrated for a wide variety of tasks. However, the importance of the human hippocampus for olfactory recognition memory has scarcely been explored. We evaluated the ability of memory-impaired patients with damage thought to be…

  12. SPEECH EMOTION RECOGNITION USING MODIFIED QUADRATIC DISCRIMINATION FUNCTION

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The Quadratic Discrimination Function (QDF) is commonly used in speech emotion recognition, and proceeds on the premise that the input data are normally distributed. In this paper, we propose a transformation to normalize the emotional features, then derive a Modified QDF (MQDF) for speech emotion recognition. Features based on prosody and voice quality are extracted, and a Principal Component Analysis Neural Network (PCANN) is used to reduce the dimension of the feature vectors. The results show that voice quality features are an effective supplement for recognition, and that the proposed method improves the recognition rate effectively.
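    The abstract gives no implementation details, so the following is only an illustrative sketch of the baseline pipeline it modifies: a plain Gaussian QDF over dimension-reduced features, with SVD-based PCA standing in for the paper's PCA neural network and with the proposed normalizing transformation and MQDF modification omitted. All names and the toy data are invented for the example.

    ```python
    import numpy as np

    def pca_fit(X, n_components):
        """Fit PCA via SVD on centered data; returns (mean, top components)."""
        mu = X.mean(axis=0)
        _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
        return mu, Vt[:n_components]

    def pca_transform(X, mu, components):
        return (X - mu) @ components.T

    class QDF:
        """Quadratic discriminant: one Gaussian per class, each with its own covariance."""
        def fit(self, X, y):
            self.classes_ = np.unique(y)
            self.params_ = {}
            for c in self.classes_:
                Xc = X[y == c]
                mu = Xc.mean(axis=0)
                # small ridge keeps the covariance invertible
                cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
                self.params_[c] = (mu, np.linalg.inv(cov),
                                   np.linalg.slogdet(cov)[1],
                                   np.log(len(Xc) / len(X)))
            return self

        def predict(self, X):
            scores = []
            for c in self.classes_:
                mu, prec, logdet, logprior = self.params_[c]
                d = X - mu
                # g_c(x) = -0.5 log|S_c| - 0.5 (x-mu)' S_c^-1 (x-mu) + log prior
                g = -0.5 * logdet - 0.5 * np.einsum('ij,jk,ik->i', d, prec, d) + logprior
                scores.append(g)
            return self.classes_[np.argmax(scores, axis=0)]

    # toy demo: two synthetic "emotion" classes in a 10-d feature space
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 10)), rng.normal(2, 1, (100, 10))])
    y = np.array([0] * 100 + [1] * 100)
    mu, comps = pca_fit(X, 3)            # reduce 10-d features to 3-d
    Z = pca_transform(X, mu, comps)
    acc = (QDF().fit(Z, y).predict(Z) == y).mean()
    ```

    The normality premise mentioned in the abstract is what makes the per-class Gaussian scoring above appropriate; the paper's contribution is precisely a transformation that pushes real prosody and voice-quality features closer to that assumption.
    
    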

  13. Automatic Emotion Recognition in Speech: Possibilities and Significance

    Directory of Open Access Journals (Sweden)

    Milana Bojanić

    2009-12-01

    Full Text Available Automatic speech recognition and spoken language understanding are crucial steps towards natural human-machine interaction. The main task of the speech communication process is the recognition of the word sequence, but the recognition of prosody, emotion and stress tags may be of particular importance as well. This paper discusses the possibilities of recognizing emotion from the speech signal in order to improve ASR, and also provides an analysis of acoustic features that can be used for the detection of a speaker's emotion and stress. The paper also provides a short overview of emotion and stress classification techniques. The importance and place of emotional speech recognition are shown in the domain of human-computer interactive systems and the transaction communication model. Directions for future work are given at the end of this work.

  14. Facial emotion perception impairments in schizophrenia patients with comorbid antisocial personality disorder.

    Science.gov (United States)

    Tang, Dorothy Y Y; Liu, Amy C Y; Lui, Simon S Y; Lam, Bess Y H; Siu, Bonnie W M; Lee, Tatia M C; Cheung, Eric F C

    2016-02-28

    Impairment in facial emotion perception is believed to be associated with aggression. Schizophrenia patients with antisocial features are more impaired in facial emotion perception than their counterparts without these features. However, previous studies did not define the comorbidity of antisocial personality disorder (ASPD) using stringent criteria. We recruited 30 participants with dual diagnoses of ASPD and schizophrenia, 30 participants with schizophrenia and 30 controls. We employed the Facial Emotional Recognition paradigm to measure facial emotion perception, and administered a battery of neurocognitive tests. The Life History of Aggression scale was used. ANOVAs and ANCOVAs were conducted to examine group differences in facial emotion perception, and control for the effect of other neurocognitive dysfunctions on facial emotion perception. Correlational analyses were conducted to examine the association between facial emotion perception and aggression. Patients with dual diagnoses performed worst in facial emotion perception among the three groups. The group differences in facial emotion perception remained significant, even after other neurocognitive impairments were controlled for. Severity of aggression was correlated with impairment in perceiving negative-valenced facial emotions in patients with dual diagnoses. Our findings support the presence of facial emotion perception impairment and its association with aggression in schizophrenia patients with comorbid ASPD. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
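    The analytic step described above — testing group differences in emotion-perception scores while controlling for other neurocognitive performance — corresponds to an ANCOVA. A minimal sketch with statsmodels follows; the variable names, group labels, and data are illustrative assumptions, not taken from the study.

    ```python
    # Hedged ANCOVA sketch: group effect on emotion-perception accuracy with a
    # neurocognitive composite as covariate. All names and data are illustrative.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(1)
    n = 30                                   # participants per group, as in the study
    df = pd.DataFrame({
        "group": np.repeat(["dual", "scz", "control"], n),
        "neurocog": rng.normal(size=3 * n),  # hypothetical neurocognitive composite
    })
    # Simulated accuracy: worst in the dual-diagnosis group, plus a covariate effect.
    effect = df["group"].map({"dual": -1.0, "scz": -0.5, "control": 0.0})
    df["accuracy"] = 70 + 5 * effect + 2 * df["neurocog"] + rng.normal(size=3 * n)

    model = smf.ols("accuracy ~ C(group) + neurocog", data=df).fit()
    print(anova_lm(model, typ=2))  # F-test for group, adjusted for the covariate
    ```

    A significant group term in this table, with the covariate included, is the pattern the abstract reports: group differences in emotion perception that survive control for other neurocognitive impairments.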

  15. Facial Emotion Recognition Using Context Based Multimodal Approach

    Directory of Open Access Journals (Sweden)

    Priya Metri

    2011-12-01

    Full Text Available Emotions play a crucial role in person-to-person interaction. In recent years, there has been a growing interest in improving all aspects of interaction between humans and computers. The ability to understand human emotions, especially by observing facial expressions, is desirable for the computer in several applications. This paper explores a way of human-computer interaction that enables the computer to be more aware of the user's emotional expressions. We present an approach to emotion recognition from facial expression, hand and body posture. Our model uses a multimodal emotion recognition system in which two different models, one for facial expression recognition and one for hand and body posture recognition, are combined by a third classifier that gives the resulting emotion. The multimodal system gives more accurate results than a single-modality or bimodal system.

  16. Facial emotion recognition deficits following moderate-severe Traumatic Brain Injury (TBI): re-examining the valence effect and the role of emotion intensity.

    Science.gov (United States)

    Rosenberg, Hannah; McDonald, Skye; Dethier, Marie; Kessels, Roy P C; Westbrook, R Frederick

    2014-11-01

    Many individuals who sustain moderate-severe traumatic brain injuries (TBI) are poor at recognizing emotional expressions, with a greater impairment in recognizing negative (e.g., fear, disgust, sadness, and anger) than positive emotions (e.g., happiness and surprise). It has been questioned whether this "valence effect" might be an artifact of the wide use of static facial emotion stimuli (usually full-blown expressions) which differ in difficulty, rather than a real consequence of brain impairment. This study aimed to investigate the valence effect in TBI, while examining emotion recognition across different intensities (low, medium, and high). Twenty-seven individuals with TBI and 28 matched control participants were tested on the Emotion Recognition Task (ERT). The TBI group was more impaired in overall emotion recognition, and less accurate at recognizing negative emotions. However, examining the performance across the different intensities indicated that this difference was driven by some emotions (e.g., happiness) being much easier to recognize than others (e.g., fear and surprise). Our findings indicate that individuals with TBI have an overall deficit in facial emotion recognition, and that both people with TBI and control participants found some emotions more difficult than others. These results suggest that conventional measures of facial affect recognition that do not examine variance in the difficulty of emotions may produce erroneous conclusions about differential impairment. They also cast doubt on the notion that dissociable neural pathways underlie the recognition of positive and negative emotions, which are differentially affected by TBI and potentially other neurological or psychiatric disorders.

  17. Facial emotional recognition in schizophrenia: preliminary results of the virtual reality program for facial emotional recognition

    Directory of Open Access Journals (Sweden)

    Teresa Souto

    2013-01-01

    Full Text Available BACKGROUND: Significant deficits in emotional recognition and social perception characterize patients with schizophrenia and have a direct negative impact both on interpersonal relationships and on social functioning. Virtual reality, as a methodological resource, may have high potential for assessing and training skills in people suffering from mental illness. OBJECTIVES: To present preliminary results of a facial emotional recognition assessment designed for patients with schizophrenia, using 3D avatars and virtual reality. METHODS: Presentation of 3D avatars which reproduce images developed with the FaceGen® software and integrated into a three-dimensional virtual environment. Each avatar was presented to a group of 12 patients with schizophrenia and a reference group of 12 subjects without psychiatric pathology. RESULTS: The results show that the facial emotions of happiness and anger are better recognized by both groups and that the major difficulties arise in fear and disgust recognition. Frontal alpha electroencephalography variations were found during the presentation of anger and disgust stimuli among patients with schizophrenia. DISCUSSION: The evaluation module of the developed program can be of added value for both patient and therapist, allowing the task to be performed in a non-anxiogenic environment that nevertheless resembles real experience.

  18. Meta-Analysis of Facial Emotion Recognition in Behavioral Variant Frontotemporal Dementia: Comparison With Alzheimer Disease and Healthy Controls.

    Science.gov (United States)

    Bora, Emre; Velakoulis, Dennis; Walterfang, Mark

    2016-07-01

    Behavioral disturbances and lack of empathy are distinctive clinical features of behavioral variant frontotemporal dementia (bvFTD) in comparison to Alzheimer disease (AD). The aim of this meta-analytic review was to compare facial emotion recognition performances of bvFTD with healthy controls and AD. The current meta-analysis included a total of 19 studies and involved comparisons of 288 individuals with bvFTD and 329 healthy controls and 162 bvFTD and 147 patients with AD. Facial emotion recognition was significantly impaired in bvFTD in comparison to the healthy controls (d = 1.81) and AD (d = 1.23). In bvFTD, recognition of negative emotions, especially anger (d = 1.48) and disgust (d = 1.41), were severely impaired. Emotion recognition was significantly impaired in bvFTD in comparison to AD in all emotions other than happiness. Impairment of emotion recognition is a relatively specific feature of bvFTD. Routine assessment of social-cognitive abilities including emotion recognition can be helpful in better differentiating between cortical dementias such as bvFTD and AD. © The Author(s) 2016.
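    The effect sizes quoted in this meta-analysis (e.g., d = 1.81 for bvFTD vs. healthy controls) are Cohen's d values for two independent groups. A minimal pooled-SD computation, with made-up scores purely for illustration, looks like this:

    ```python
    # Minimal Cohen's d (pooled-SD form) for two independent groups, the
    # effect-size metric reported above. The example scores are invented.
    import numpy as np

    def cohens_d(a, b):
        a, b = np.asarray(a, float), np.asarray(b, float)
        na, nb = len(a), len(b)
        # Pooled variance weights each group's sample variance by its df.
        pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
        return (a.mean() - b.mean()) / np.sqrt(pooled_var)

    controls = [28, 30, 27, 29, 31, 30]   # hypothetical recognition scores
    patients = [22, 24, 23, 25, 21, 24]
    print(round(cohens_d(controls, patients), 2))
    ```

    A meta-analysis pools such per-study d values (typically inverse-variance weighted) rather than raw scores; the single-study computation above is just the building block.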

  19. Facial Emotion Recognition in Schizophrenia: The Impact of Gender

    OpenAIRE

    Erol, Almıla; Putgul, Gulperi; Kosger, Ferdi; Ersoy, Bilal

    2013-01-01

    Objective Previous studies reported gender differences for facial emotion recognition in healthy people, with women performing better than men. Few studies that examined gender differences for facial emotion recognition in schizophrenia brought out inconsistent findings. The aim of this study is to investigate gender differences for facial emotion identification and discrimination abilities in patients with schizophrenia. Methods 35 female and 35 male patients with schizophrenia, along with 3...

  20. Theory of Mind and Emotion Recognition Skills in Children with Specific Language Impairment, Autism Spectrum Disorder and Typical Development: Group Differences and Connection to Knowledge of Grammatical Morphology, Word-Finding Abilities and Verbal Working Memory

    Science.gov (United States)

    Loukusa, Soile; Mäkinen, Leena; Kuusikko-Gauffin, Sanna; Ebeling, Hanna; Moilanen, Irma

    2014-01-01

    Background: Social perception skills, such as understanding the mind and emotions of others, affect children's communication abilities in real-life situations. In addition to autism spectrum disorder (ASD), there is increasing knowledge that children with specific language impairment (SLI) also demonstrate difficulties in their social…

  1. [Recognition of facial expression of emotions in Parkinson's disease: a theoretical review].

    Science.gov (United States)

    Alonso-Recio, L; Serrano-Rodriguez, J M; Carvajal-Molina, F; Loeches-Alonso, A; Martin-Plasencia, P

    2012-04-16

    Emotional facial expression is a basic guide during social interaction; therefore, alterations in its expression or recognition are important limitations for communication. Our aim was to examine facial expression recognition abilities and their possible impairment in Parkinson's disease. First, we review the studies on this topic, which have not reported entirely consistent results. Second, we analyze the factors that may explain these discrepancies and, in particular, as a third objective, we consider the relationship between emotion recognition problems and the cognitive impairment associated with the disease. Finally, we propose alternative strategies for the development of studies that could clarify the state of these abilities in Parkinson's disease. Most studies suggest deficits in facial expression recognition, especially for expressions with negative emotional content. However, it is possible that these alterations are related to impairments that also appear in the course of the disease in other perceptual and executive processes. To make progress on this issue, we consider it necessary to design emotion recognition studies that differentially engage executive or visuospatial processes, and/or that contrast cognitive abilities using both facial expressions and non-emotional stimuli. Clarifying the status of these abilities, as well as increasing our knowledge of the functional consequences of the brain damage characteristic of the disease, may indicate whether we should pay special attention to their rehabilitation within intervention programs.

  2. Improving Negative Emotion Recognition in Young Offenders Reduces Subsequent Crime.

    Directory of Open Access Journals (Sweden)

    Kelly Hubble

    Full Text Available Children with antisocial behaviour show deficits in the perception of emotional expressions in others that may contribute to the development and persistence of antisocial and aggressive behaviour. Current treatments for antisocial youngsters are limited in effectiveness. It has been argued that more attention should be devoted to interventions that target neuropsychological correlates of antisocial behaviour. This study examined the effect of emotion recognition training on criminal behaviour. Emotion recognition and crime levels were studied in 50 juvenile offenders. Whilst all young offenders received their statutory interventions as the study was conducted, a subgroup of twenty-four offenders also took part in a facial affect training aimed at improving emotion recognition. Offenders in the training and control groups were matched for age, SES, IQ and lifetime crime level. All offenders were tested twice for emotion recognition performance, and recent crime data were collected after the testing had been completed. Before the training there were no differences between the groups in emotion recognition, with both groups displaying poor fear, sadness and anger recognition. After the training fear, sadness and anger recognition improved significantly in juvenile offenders in the training group. Although crime rates dropped in all offenders in the 6 months following emotion testing, only the group of offenders who had received the emotion training showed a significant reduction in the severity of the crimes they committed. The study indicates that emotion recognition can be relatively easily improved in youths who engage in serious antisocial and criminal behavior. The results suggest that improved emotion recognition has the potential to reduce the severity of reoffending.

  3. Improving Negative Emotion Recognition in Young Offenders Reduces Subsequent Crime.

    Science.gov (United States)

    Hubble, Kelly; Bowen, Katharine L; Moore, Simon C; van Goozen, Stephanie H M

    2015-01-01

    Children with antisocial behaviour show deficits in the perception of emotional expressions in others that may contribute to the development and persistence of antisocial and aggressive behaviour. Current treatments for antisocial youngsters are limited in effectiveness. It has been argued that more attention should be devoted to interventions that target neuropsychological correlates of antisocial behaviour. This study examined the effect of emotion recognition training on criminal behaviour. Emotion recognition and crime levels were studied in 50 juvenile offenders. Whilst all young offenders received their statutory interventions as the study was conducted, a subgroup of twenty-four offenders also took part in a facial affect training aimed at improving emotion recognition. Offenders in the training and control groups were matched for age, SES, IQ and lifetime crime level. All offenders were tested twice for emotion recognition performance, and recent crime data were collected after the testing had been completed. Before the training there were no differences between the groups in emotion recognition, with both groups displaying poor fear, sadness and anger recognition. After the training fear, sadness and anger recognition improved significantly in juvenile offenders in the training group. Although crime rates dropped in all offenders in the 6 months following emotion testing, only the group of offenders who had received the emotion training showed a significant reduction in the severity of the crimes they committed. The study indicates that emotion recognition can be relatively easily improved in youths who engage in serious antisocial and criminal behavior. The results suggest that improved emotion recognition has the potential to reduce the severity of reoffending.

  4. Emotion recognition in early Parkinson's disease patients undergoing deep brain stimulation or dopaminergic therapy: a comparison to healthy participants

    Directory of Open Access Journals (Sweden)

    Lindsey G. McIntosh

    2015-01-01

    Full Text Available Parkinson's disease (PD) is traditionally regarded as a neurodegenerative movement disorder; however, nigrostriatal dopaminergic degeneration is also thought to disrupt non-motor loops connecting the basal ganglia to areas in frontal cortex involved in cognition and emotion processing. PD patients are impaired on tests of emotion recognition, but it is difficult to disentangle this deficit from the more general cognitive dysfunction that frequently accompanies disease progression. Testing for emotion recognition deficits early in the disease course, prior to cognitive decline, better assesses the sensitivity of these non-motor corticobasal ganglia-thalamocortical loops involved in emotion processing to early degenerative change in basal ganglia circuits. In addition, contrasting this with a group of healthy aging individuals demonstrates changes in emotion processing specific to the degeneration of basal ganglia circuitry in PD. Early PD patients (EPD) were recruited from a randomized clinical trial testing the safety and tolerability of deep brain stimulation of the subthalamic nucleus (STN-DBS) in early-staged PD. EPD patients had previously been randomized to receive optimal drug therapy only (ODT) or drug therapy plus STN-DBS (ODT+DBS). Matched healthy elderly controls (HEC) and young controls (HYC) also participated in this study. Participants completed two control tasks and three emotion recognition tests that varied in stimulus domain. EPD patients were impaired on all emotion recognition tasks compared to HEC. Neither therapy type (ODT or ODT+DBS) nor therapy state (ON/OFF) altered emotion recognition performance in this study. Finally, HEC were impaired on vocal emotion recognition relative to HYC, suggesting a decline related to healthy aging. This study supports the existence of impaired emotion recognition early in the PD course, implicating an early disruption of fronto-striatal loops mediating emotional function.

  5. Impaired white matter connections of the limbic system networks associated with impaired emotional memory in Alzheimer's disease

    Directory of Open Access Journals (Sweden)

    Xiaoshu Li

    2016-10-01

    Full Text Available Background: Discrepancies persist regarding the retention of the emotional enhancement of memory (EEM) effect in mild cognitive impairment (MCI) and early Alzheimer's disease (AD) patients. In addition, the neural mechanisms are still poorly understood, and little is known about emotional-memory-related changes in white matter (WM). Objective: To observe whether EEM is absent in amnestic MCI (aMCI) and AD patients, and to investigate whether emotional memory is associated with the WM connections and gray matter (GM) of the limbic system networks. Methods: Twenty-one AD patients, 20 aMCI patients and 25 normal controls participated in emotional picture recognition tests and MRI scanning. Tract-based spatial statistics (TBSS) and voxel-based morphometry (VBM) methods were used to determine white and gray matter changes in patients. Fourteen regions of interest (ROIs) in WM and 20 ROIs in GM were then selected for correlation analyses with behavioral scores. Results: The EEM effect was lost in AD patients. Both the white and gray matter of the limbic system networks were impaired in AD patients. Significant correlations or tendencies were found between the bilateral uncinate fasciculus, the corpus callosum (genu and body), the left cingulum bundle and left parahippocampal WM and the recognition sensitivity for pictures of different emotional valence, and between the splenium of the corpus callosum, the left cingulum bundle, and the left crus of fornix and stria terminalis and the recognition sensitivity of EEM. The volumes of the left amygdala, bilateral insula, medial frontal lobe, and anterior and middle cingulum gyrus were positively correlated with the recognition sensitivity for emotional photos, and the right precuneus was positively correlated with the negative EEM effect. However, the affected brain areas of aMCI patients were more localized, and aMCI patients benefited only from positive stimuli. Conclusion: There are impairments of the limbic system networks in AD patients. Damaged WM

  6. Emotion improves and impairs early vision.

    Science.gov (United States)

    Bocanegra, Bruno R; Zeelenberg, René

    2009-06-01

    Recent studies indicate that emotion enhances early vision, but the generality of this finding remains unknown. Do the benefits of emotion extend to all basic aspects of vision, or are they limited in scope? Our results show that the brief presentation of a fearful face, compared with a neutral face, enhances sensitivity for the orientation of subsequently presented low-spatial-frequency stimuli, but diminishes orientation sensitivity for high-spatial-frequency stimuli. This is the first demonstration that emotion not only improves but also impairs low-level vision. The selective low-spatial-frequency benefits are consistent with the idea that emotion enhances magnocellular processing. Additionally, we suggest that the high-spatial-frequency deficits are due to inhibitory interactions between magnocellular and parvocellular pathways. Our results suggest an emotion-induced trade-off in visual processing, rather than a general improvement. This trade-off may benefit perceptual dimensions that are relevant for survival at the expense of those that are less relevant.

  7. Gender Differences in the Recognition of Vocal Emotions

    Science.gov (United States)

    Lausen, Adi; Schacht, Annekathrin

    2018-01-01

    The conflicting findings from the few studies conducted with regard to gender differences in the recognition of vocal expressions of emotion have left the exact nature of these differences unclear. Several investigators have argued that a comprehensive understanding of gender differences in vocal emotion recognition can only be achieved by replicating these studies while accounting for influential factors such as stimulus type, gender-balanced samples, number of encoders, decoders, and emotional categories. This study aimed to account for these factors by investigating whether emotion recognition from vocal expressions differs as a function of both listeners' and speakers' gender. A total of N = 290 participants were randomly and equally allocated to two groups. One group listened to words and pseudo-words, while the other group listened to sentences and affect bursts. Participants were asked to categorize the stimuli with respect to the expressed emotions in a fixed-choice response format. Overall, females were more accurate than males when decoding vocal emotions; however, when testing for specific emotions these differences were small in magnitude. Speakers' gender had a significant impact on how listeners judged emotions from the voice. The group listening to words and pseudo-words had higher identification rates for emotions spoken by male than by female actors, whereas in the group listening to sentences and affect bursts the identification rates were higher when emotions were uttered by female than male actors. The mixed pattern for emotion-specific effects, however, indicates that, in the vocal channel, the reliability of emotion judgments is not systematically influenced by speakers' gender and the related stereotypes of emotional expressivity. Together, these results extend previous findings by showing effects of listeners' and speakers' gender on the recognition of vocal emotions. They stress the importance of distinguishing these factors to explain

  8. Gender Differences in the Recognition of Vocal Emotions

    Directory of Open Access Journals (Sweden)

    Adi Lausen

    2018-06-01

    Full Text Available The conflicting findings from the few studies conducted with regard to gender differences in the recognition of vocal expressions of emotion have left the exact nature of these differences unclear. Several investigators have argued that a comprehensive understanding of gender differences in vocal emotion recognition can only be achieved by replicating these studies while accounting for influential factors such as stimulus type, gender-balanced samples, number of encoders, decoders, and emotional categories. This study aimed to account for these factors by investigating whether emotion recognition from vocal expressions differs as a function of both listeners' and speakers' gender. A total of N = 290 participants were randomly and equally allocated to two groups. One group listened to words and pseudo-words, while the other group listened to sentences and affect bursts. Participants were asked to categorize the stimuli with respect to the expressed emotions in a fixed-choice response format. Overall, females were more accurate than males when decoding vocal emotions; however, when testing for specific emotions these differences were small in magnitude. Speakers' gender had a significant impact on how listeners judged emotions from the voice. The group listening to words and pseudo-words had higher identification rates for emotions spoken by male than by female actors, whereas in the group listening to sentences and affect bursts the identification rates were higher when emotions were uttered by female than male actors. The mixed pattern for emotion-specific effects, however, indicates that, in the vocal channel, the reliability of emotion judgments is not systematically influenced by speakers' gender and the related stereotypes of emotional expressivity. Together, these results extend previous findings by showing effects of listeners' and speakers' gender on the recognition of vocal emotions. They stress the importance of distinguishing these

  9. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    Science.gov (United States)

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

    Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. Response procedure, memory, and dichotic emotion recognition.

    Science.gov (United States)

    Voyer, Daniel; Dempsey, Danielle; Harding, Jennifer A

    2014-03-01

    Three experiments investigated the role of memory and rehearsal in a dichotic emotion recognition task by manipulating the response procedure as well as the interval between encoding and retrieval while taking into account order of report. For all experiments, right-handed undergraduates were presented with dichotic pairs of the words bower, dower, power, and tower pronounced in a sad, angry, happy, or neutral tone of voice. Participants were asked to report the two emotions presented on each trial by clicking on the corresponding drawings or words on a computer screen, either following no delay or a five second delay. Experiment 1 applied the delay conditions as a between-subjects factor whereas it was a within-subject factor in Experiment 2. In Experiments 1 and 2, more correct responses occurred for the left than the right ear, reflecting a left ear advantage (LEA) that was slightly larger with a nonverbal than a verbal response. The LEA was also found to be larger with no delay than with the 5s delay. In addition, participants typically responded first to the left ear stimulus. In fact, the first response produced a LEA whereas the second response produced a right ear advantage. Experiment 3 involved a concurrent task during the delay to prevent rehearsal. In Experiment 3, the pattern of results supported the claim that rehearsal could account for the findings of the first two experiments. The findings are interpreted in the context of the role of rehearsal and memory in models of dichotic listening. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Facial and prosodic emotion recognition in social anxiety disorder.

    Science.gov (United States)

    Tseng, Huai-Hsuan; Huang, Yu-Lien; Chen, Jian-Ting; Liang, Kuei-Yu; Lin, Chao-Cheng; Chen, Sue-Huei

    2017-07-01

    Patients with social anxiety disorder (SAD) have a cognitive preference to negatively evaluate emotional information. In particular, the preferential biases in prosodic emotion recognition in SAD have been much less explored. The present study aims to investigate whether SAD patients retain negative evaluation biases across visual and auditory modalities when given sufficient response time to recognise emotions. Thirty-one SAD patients and 31 age- and gender-matched healthy participants completed a culturally suitable non-verbal emotion recognition task and received clinical assessments for social anxiety and depressive symptoms. A repeated measures analysis of variance was conducted to examine group differences in emotion recognition. Compared to healthy participants, SAD patients were significantly less accurate at recognising facial and prosodic emotions, and spent more time on emotion recognition. The differences were mainly driven by the lower accuracy and longer reaction times for recognising fearful emotions in SAD patients. Within the SAD patients, lower accuracy of sad face recognition was associated with higher severity of depressive and social anxiety symptoms, particularly with avoidance symptoms. These findings may represent a cross-modality pattern of avoidance in the later stage of identifying negative emotions in SAD. This pattern may be linked to clinical symptom severity.

  12. Age, gender and puberty influence the development of facial emotion recognition

    Directory of Open Access Journals (Sweden)

    Kate Lawrence

    2015-06-01

    Full Text Available Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age-range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children’s ability to recognise simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6-16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modelled these cross-sectional data in terms of competence in accurate recognition of the six emotions studied, when the positive correlation between emotion recognition and IQ was controlled. Significant linear trends were seen in children’s ability to recognise facial expressions of happiness, surprise, fear and disgust; there was improvement with increasing age. In contrast, for sad and angry expressions there is little or no change in accuracy over the age range 6-16 years; near-adult levels of competence are established by middle-childhood. In a sampled subset, pubertal status influenced the ability to recognize facial expressions of disgust and anger; there was an increase in competence from mid to late puberty, which occurred independently of age. A small female advantage was found in the recognition of some facial expressions. The normative data provided in this study will aid clinicians and researchers in assessing the emotion recognition abilities of children and will facilitate the identification of abnormalities in a skill that is often impaired in neurodevelopmental disorders. If emotion recognition abilities are a good model with which to understand adolescent development, then these results could have implications for the education, mental health provision and legal treatment of teenagers.

  13. Age, gender, and puberty influence the development of facial emotion recognition.

    Science.gov (United States)

    Lawrence, Kate; Campbell, Ruth; Skuse, David

    2015-01-01

    Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age-range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children's ability to recognize simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6-16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modeled these cross-sectional data in terms of competence in accurate recognition of the six emotions studied, when the positive correlation between emotion recognition and IQ was controlled. Significant linear trends were seen in children's ability to recognize facial expressions of happiness, surprise, fear, and disgust; there was improvement with increasing age. In contrast, for sad and angry expressions there is little or no change in accuracy over the age range 6-16 years; near-adult levels of competence are established by middle-childhood. In a sampled subset, pubertal status influenced the ability to recognize facial expressions of disgust and anger; there was an increase in competence from mid to late puberty, which occurred independently of age. A small female advantage was found in the recognition of some facial expressions. The normative data provided in this study will aid clinicians and researchers in assessing the emotion recognition abilities of children and will facilitate the identification of abnormalities in a skill that is often impaired in neurodevelopmental disorders. If emotion recognition abilities are a good model with which to understand adolescent development, then these results could have implications for the education, mental health provision and legal treatment of teenagers.


  15. Multimodal Emotion Recognition Is Resilient to Insufficient Sleep: Results From Cross-Sectional and Experimental Studies.

    Science.gov (United States)

    Holding, Benjamin C; Laukka, Petri; Fischer, Håkan; Bänziger, Tanja; Axelsson, John; Sundelin, Tina

    2017-11-01

    Insufficient sleep has been associated with impaired recognition of facial emotions. However, previous studies have found inconsistent results, potentially stemming from the type of static picture task used. We therefore examined whether insufficient sleep was associated with decreased emotion recognition ability in two separate studies using a dynamic multimodal task. Study 1 used a cross-sectional design consisting of 291 participants with questionnaire measures assessing sleep duration and self-reported sleep quality for the previous night. Study 2 used an experimental design involving 181 participants where individuals were quasi-randomized into either a sleep-deprivation (N = 90) or a sleep-control (N = 91) condition. All participants from both studies were tested on the same forced-choice multimodal test of emotion recognition to assess the accuracy of emotion categorization. Sleep duration, self-reported sleep quality (study 1), and sleep deprivation (study 2) did not predict overall emotion recognition accuracy or speed. Similarly, the responses to each of the twelve emotions tested showed no evidence of impaired recognition ability, apart from one positive association suggesting that greater self-reported sleep quality could predict more accurate recognition of disgust (study 1). The studies presented here involve considerably larger samples than previous studies and the results support the null hypotheses. Therefore, we suggest that the ability to accurately categorize the emotions of others is not associated with short-term sleep duration or sleep quality and is resilient to acute periods of insufficient sleep. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.

  16. The Change in Facial Emotion Recognition Ability in Inpatients with Treatment Resistant Schizophrenia After Electroconvulsive Therapy.

    Science.gov (United States)

    Dalkıran, Mihriban; Tasdemir, Akif; Salihoglu, Tamer; Emul, Murat; Duran, Alaattin; Ugur, Mufit; Yavuz, Ruhi

    2017-09-01

    People with schizophrenia have impairments in emotion recognition along with other social cognitive deficits. In the current study, we aimed to investigate the immediate benefits of ECT on facial emotion recognition ability. Thirty-two treatment-resistant patients with schizophrenia who had been indicated for ECT were enrolled in the study. Facial emotion stimuli were a set of 56 photographs depicting seven basic emotions: sadness, anger, happiness, disgust, surprise, fear, and neutral faces. The average age of the participants was 33.4 ± 10.5 years. The rate of recognizing the disgusted facial expression increased significantly after ECT (p < 0.05), whereas recognition of the other facial expressions did not change significantly (p > 0.05). After ECT, response times to fearful and happy facial expressions were significantly shorter (p < 0.05). Facial emotion recognition is an important social cognitive skill for social harmony, appropriate relationships and independent living. At the least, ECT does not seem to affect facial emotion recognition ability negatively, and it appears to improve identification of disgusted facial expressions, which is related to dopamine-rich regions of the brain.

  17. Robust emotion recognition using spectral and prosodic features

    CERN Document Server

    Rao, K Sreenivasa

    2013-01-01

    In this brief, the authors discuss recently explored spectral (sub-segmental and pitch synchronous) and prosodic (global and local features at word and syllable levels in different parts of the utterance) features for discerning emotions in a robust manner. The authors also delve into the complementary evidences obtained from excitation source, vocal tract system and prosodic features for the purpose of enhancing emotion recognition performance. Features based on speaking rate characteristics are explored with the help of multi-stage and hybrid models for further improving emotion recognition performance. Proposed spectral and prosodic features are evaluated on real life emotional speech corpus.
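To make concrete what "spectral and prosodic" features look like in code, below is a minimal pure-Python sketch of three classic frame-level descriptors: short-time energy (a prosodic intensity cue), zero-crossing rate (a coarse spectral cue), and an autocorrelation pitch estimate. This is an illustration only, not the feature set proposed in this brief (which uses sub-segmental, pitch-synchronous spectral analysis and multi-level prosodic features); all names and parameter values here are invented for the example.

```python
import math

SR = 8000  # assumed sample rate (Hz)

def frame_energy(frame):
    """Short-time energy: a simple prosodic intensity feature."""
    return sum(s * s for s in frame) / len(frame)

def zero_crossing_rate(frame):
    """Zero-crossing rate: a coarse correlate of spectral content."""
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0)
    return crossings / (len(frame) - 1)

def pitch_autocorr(frame, sr=SR, fmin=80, fmax=400):
    """Estimate F0 by picking the autocorrelation peak within the
    plausible lag range for speech (fmin..fmax Hz)."""
    lo, hi = int(sr / fmax), int(sr / fmin)
    best_lag, best_r = lo, float("-inf")
    for lag in range(lo, hi + 1):
        r = sum(frame[i] * frame[i + lag] for i in range(len(frame) - lag))
        if r > best_r:
            best_r, best_lag = r, lag
    return sr / best_lag

# Synthetic 50 ms "voiced" frame: a 220 Hz sine.
frame = [math.sin(2 * math.pi * 220 * t / SR) for t in range(400)]
features = (frame_energy(frame), zero_crossing_rate(frame), pitch_autocorr(frame))
```

In a real emotion recognition front end, features like these would be computed over sliding windows and pooled across words or syllables before classification; toolkits such as openSMILE or librosa provide production-grade equivalents.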

  18. Emotion Recognition in Children With Down Syndrome: Influence of Emotion Label and Expression Intensity.

    Science.gov (United States)

    Cebula, Katie R; Wishart, Jennifer G; Willis, Diane S; Pitcairn, Tom K

    2017-03-01

    Some children with Down syndrome may experience difficulties in recognizing facial emotions, particularly fear, but it is not clear why, nor how such skills can best be facilitated. Using a photo-matching task, emotion recognition was tested in children with Down syndrome, children with nonspecific intellectual disability and cognitively matched, typically developing children (all groups N = 21) under four conditions: veridical vs. exaggerated emotions and emotion-labelling vs. generic task instructions. In all groups, exaggerating emotions facilitated recognition accuracy and speed, with emotion labelling facilitating recognition accuracy. Overall accuracy and speed did not differ in the children with Down syndrome, although recognition of fear was poorer than in the typically developing children and unrelated to emotion label use. Implications for interventions are considered.

  19. Parents’ Emotion-Related Beliefs, Behaviors, and Skills Predict Children's Recognition of Emotion

    Science.gov (United States)

    Castro, Vanessa L.; Halberstadt, Amy G.; Lozada, Fantasy T.; Craig, Ashley B.

    2015-01-01

    Children who are able to recognize others’ emotions are successful in a variety of socioemotional domains, yet we know little about how school-aged children's abilities develop, particularly in the family context. We hypothesized that children develop emotion recognition skill as a function of parents’ own emotion-related beliefs, behaviors, and skills. We examined parents’ beliefs about the value of emotion and guidance of children's emotion, parents’ emotion labeling and teaching behaviors, and parents’ skill in recognizing children's emotions in relation to their school-aged children's emotion recognition skills. Sixty-nine parent-child dyads completed questionnaires, participated in dyadic laboratory tasks, and identified their own emotions and emotions felt by the other participant from videotaped segments. Regression analyses indicate that parents’ beliefs, behaviors, and skills together account for 37% of the variance in child emotion recognition ability, even after controlling for parent and child expressive clarity. The findings suggest the importance of the family milieu in the development of children's emotion recognition skill in middle childhood, and add to accumulating evidence suggesting important age-related shifts in the relation between parental emotion socialization and child emotional development. PMID:26005393

  20. Dissociation between facial and bodily expressions in emotion recognition: A case study.

    Science.gov (United States)

    Leiva, Samanta; Margulis, Laura; Micciulli, Andrea; Ferreres, Aldo

    2017-12-21

    Existing single-case studies have reported a deficit in recognizing basic emotions through facial expressions with unaffected performance on body expressions, but not the opposite pattern. The aim of this paper is to present a case study with impaired emotion recognition through body expressions and intact performance with facial expressions. In this single-case study we assessed a 30-year-old patient with autism spectrum disorder, without intellectual disability, and a healthy control group (n = 30) with four tasks of basic and complex emotion recognition through face and body movements, and two non-emotional control tasks. To analyze the dissociation between facial and body expressions, we used Crawford and Garthwaite's operational criteria, and we compared the patient's and the control group's performance with a modified one-tailed t-test designed specifically for single-case studies. There were no statistically significant differences between the patient's and the control group's performances on the non-emotional body movement task or the facial perception task. For both kinds of emotions (basic and complex), when the patient's performance was compared to the control group's, statistically significant differences were observed only for the recognition of body expressions. There were no significant differences between the patient's and the control group's correct answers for emotional facial stimuli. Our results showed a profile of impaired emotion recognition through body expressions and intact performance with facial expressions. This is the first case study that describes the existence of this kind of dissociation pattern between facial and body expressions of basic and complex emotions.
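The "modified one-tailed t-test designed specifically for single-case studies" referred to in this abstract is, in the single-case literature, usually the Crawford-Howell (1998) test, which treats the control sample's mean and SD as estimates rather than population parameters. A minimal sketch under that assumption (the function name and example scores are illustrative, not taken from the study):

```python
import math
import statistics

def crawford_howell_t(case_score, control_scores):
    """Compare one patient's score against a small control sample
    (Crawford & Howell, 1998). Returns the t statistic and its
    degrees of freedom (n - 1); the one-tailed p-value follows from
    the Student t distribution with those degrees of freedom."""
    n = len(control_scores)
    m = statistics.mean(control_scores)
    s = statistics.stdev(control_scores)  # sample SD (n - 1 denominator)
    t = (case_score - m) / (s * math.sqrt((n + 1) / n))
    return t, n - 1

# Hypothetical example: a case scoring 18 against 10 controls.
t, df = crawford_howell_t(18, [25, 27, 26, 28, 24, 26, 27, 25, 26, 26])
```

Referring the resulting t to a t distribution with df degrees of freedom (e.g. via scipy.stats.t.sf) yields the one-tailed p-value used to decide whether the case's score falls reliably below the control range.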

  1. Family environment influences emotion recognition following paediatric traumatic brain injury.

    Science.gov (United States)

    Schmidt, Adam T; Orsten, Kimberley D; Hanten, Gerri R; Li, Xiaoqi; Levin, Harvey S

    2010-01-01

    This study investigated the relationship between family functioning and performance on two tasks of emotion recognition (emotional prosody and face emotion recognition) and a cognitive control procedure (the Flanker task) following paediatric traumatic brain injury (TBI) or orthopaedic injury (OI). A total of 142 children (75 TBI, 67 OI) were assessed on three occasions: baseline, 3 months and 1 year post-injury on the two emotion recognition tasks and the Flanker task. Caregivers also completed the Life Stressors and Resources Scale (LISRES) on each occasion. Growth curve analysis was used to analyse the data. Results indicated that family functioning influenced performance on the emotional prosody and Flanker tasks but not on the face emotion recognition task. Findings on both the emotional prosody and Flanker tasks were generally similar across groups. However, financial resources emerged as significantly related to emotional prosody performance in the TBI group only (p = 0.0123). Findings suggest family functioning variables--especially financial resources--can influence performance on an emotional processing task following TBI in children.

  2. Emotional Intelligence as Assessed by Situational Judgment and Emotion Recognition Tests: Building the Nomological Net

    Directory of Open Access Journals (Sweden)

    Carolyn MacCann

    2011-12-01

    Full Text Available Recent research on emotion recognition ability (ERA) suggests that the capacity to process emotional information may differ for disparate emotions. However, little research has examined whether this finding holds for emotional understanding and emotion management, as well as emotion recognition. Moreover, little research has examined whether the abilities to recognize emotions, understand emotions, and manage emotions form a distinct emotional intelligence (EI) construct that is independent from traditional cognitive ability factors. The current study addressed these issues. Participants (N=118) completed two ERA measures, two situational judgment tests assessing emotional understanding and emotion management, and three cognitive ability tests. Exploratory and confirmatory factor analyses of both the understanding and management item parcels showed that a three-factor model relating to fear, sadness, and anger content was a better fit than a one-factor model, supporting an emotion-specific view of EI. In addition, an EI factor composed of emotion recognition, emotional understanding, and emotion management was distinct from a cognitive ability factor composed of a matrices task, general knowledge test, and reading comprehension task. Results are discussed in terms of their potential implications for theory and practice, as well as the integration of EI research with known models of cognitive ability.

  3. Theory of mind in schizophrenia: correlation with clinical symptomatology, emotional recognition and ward behavior.

    Science.gov (United States)

    Lee, Woo Kyeong; Kim, Yong Kyu

    2013-09-01

    Several studies have suggested the presence of a theory of mind (ToM) deficit in schizophrenic disorders. This study examined the relationship of emotion recognition, theory of mind, and ward behavior in patients with schizophrenia. Fifty-five patients with chronic schizophrenia completed measures of emotion recognition, ToM, intelligence, Positive and Negative Syndrome Scale (PANSS) and Nurse's Observation Scale for Inpatient Evaluation (NOSIE). Theory of mind sum score correlated significantly with IQ, emotion recognition, and ward behavior. Ward behavior was linked to the duration of the illness, and even more so to theory of mind deficits. Theory of mind contributed a significant proportion of the amount of variance to explain social behavior on the ward. Considering our study results, impaired theory of mind contributes significantly to the understanding of social competence in patients with schizophrenia. Copyright © 2012 Wiley Publishing Asia Pty Ltd.

  4. Development of Facial Emotion Recognition in Childhood : Age-related Differences in a Shortened Version of the Facial Expressions of Emotion - Stimuli and Tests

    NARCIS (Netherlands)

    Coenen, Maraike; Aarnoudse, Ceciel; Huitema, Rients; Braams, Olga; Veenstra, Wencke S.

    2013-01-01

    Introduction Facial emotion recognition is essential for social interaction. The development of emotion recognition abilities is not yet entirely understood (Tonks et al. 2007). Facial emotion recognition emerges gradually, with happiness recognized earliest (Herba & Phillips, 2004). The recognition

  5. Odor recognition memory is not independently impaired in Parkinson's disease

    NARCIS (Netherlands)

    Boesveldt, S.; Muinck Keizer, de R.J.O.; Wolters, E.C.H.; Berendse, H.W.

    2009-01-01

    The results of previous studies in small groups of Parkinson's disease (PD) patients are inconclusive with regard to the presence of an odor recognition memory impairment in PD. The aim of the present study was to investigate odor recognition memory in PD in a larger group of patients. Odor

  6. Facial emotion recognition, socio-occupational functioning and expressed emotions in schizophrenia versus bipolar disorder.

    Science.gov (United States)

    Thonse, Umesh; Behere, Rishikesh V; Praharaj, Samir Kumar; Sharma, Podila Sathya Venkata Narasimha

    2018-06-01

    Facial emotion recognition deficits have been consistently demonstrated in patients with severe mental disorders. Expressed emotion is found to be an important predictor of relapse. However, the relationship between facial emotion recognition abilities and expressed emotions and its influence on socio-occupational functioning in schizophrenia versus bipolar disorder has not been studied. In this study we examined 91 patients with schizophrenia and 71 with bipolar disorder for psychopathology, socio-occupational functioning and emotion recognition abilities. Primary caregivers of 62 patients with schizophrenia and 49 with bipolar disorder were assessed on the Family Attitude Questionnaire to assess their expressed emotions. Patients with schizophrenia and bipolar disorder performed similarly on the emotion recognition task. Patients in the schizophrenia group received more critical comments and had poorer socio-occupational functioning as compared to patients with bipolar disorder. Poorer socio-occupational functioning in patients with schizophrenia was significantly associated with greater dissatisfaction in their caregivers. In patients with bipolar disorder, poorer emotion recognition scores significantly correlated with poorer adaptive living skills and greater hostility and dissatisfaction in their caregivers. The findings of our study suggest that emotion recognition abilities in patients with bipolar disorder are associated with negative expressed emotions leading to problems in adaptive living skills. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Oxytocin Promotes Facial Emotion Recognition and Amygdala Reactivity in Adults with Asperger Syndrome

    Science.gov (United States)

    Domes, Gregor; Kumbier, Ekkehardt; Heinrichs, Markus; Herpertz, Sabine C

    2014-01-01

    The neuropeptide oxytocin has recently been shown to enhance eye gaze and emotion recognition in healthy men. Here, we report a randomized double-blind, placebo-controlled trial that examined the neural and behavioral effects of a single dose of intranasal oxytocin on emotion recognition in individuals with Asperger syndrome (AS), a clinical condition characterized by impaired eye gaze and facial emotion recognition. Using functional magnetic resonance imaging, we examined whether oxytocin would enhance emotion recognition from facial sections of the eye vs the mouth region and modulate regional activity in brain areas associated with face perception in both adults with AS, and a neurotypical control group. Intranasal administration of the neuropeptide oxytocin improved performance in a facial emotion recognition task in individuals with AS. This was linked to increased left amygdala reactivity in response to facial stimuli and increased activity in the neural network involved in social cognition. Our data suggest that the amygdala, together with functionally associated cortical areas mediate the positive effect of oxytocin on social cognitive functioning in AS. PMID:24067301

  8. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.

    Science.gov (United States)

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 are more affected by the emotional value of music administered in the emotional go/no-go task, and this bias is also apparent in responses to the non-target emotional face. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.


  10. Facial emotion recognition in Parkinson's disease: A review and new hypotheses

    Science.gov (United States)

    Vérin, Marc; Sauleau, Paul; Grandjean, Didier

    2018-01-01

    Parkinson's disease is a neurodegenerative disorder classically characterized by motor symptoms. Among them, hypomimia affects facial expressiveness and social communication and has a highly negative impact on patients' and relatives' quality of life. Patients also frequently experience nonmotor symptoms, including emotional-processing impairments, leading to difficulty in recognizing emotions from faces. Aside from its theoretical importance, understanding the disruption of facial emotion recognition in PD is crucial for improving quality of life for both patients and caregivers, as this impairment is associated with heightened interpersonal difficulties. However, studies assessing abilities in recognizing facial emotions in PD still report contradictory outcomes. The origins of this inconsistency are unclear, and several questions (regarding the role of dopamine replacement therapy or the possible consequences of hypomimia) remain unanswered. We therefore undertook a fresh review of relevant articles focusing on facial emotion recognition in PD to deepen current understanding of this nonmotor feature, exploring multiple significant potential confounding factors, both clinical and methodological, and discussing probable pathophysiological mechanisms. This led us to examine recent proposals about the role of basal ganglia-based circuits in emotion and to consider the involvement of facial mimicry in this deficit from the perspective of embodied simulation theory. We believe our findings will inform clinical practice and increase fundamental knowledge, particularly in relation to potential embodied emotion impairment in PD. © 2018 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society. PMID:29473661

  11. Facial Emotion Recognition in Child Psychiatry: A Systematic Review

    Science.gov (United States)

    Collin, Lisa; Bindra, Jasmeet; Raju, Monika; Gillberg, Christopher; Minnis, Helen

    2013-01-01

    This review focuses on facial affect (emotion) recognition in children and adolescents with psychiatric disorders other than autism. A systematic search, using PRISMA guidelines, was conducted to identify original articles published prior to October 2011 pertaining to face recognition tasks in case-control studies. Used in the qualitative…

  12. Facial emotion recognition, face scan paths, and face perception in children with neurofibromatosis type 1.

    Science.gov (United States)

    Lewis, Amelia K; Porter, Melanie A; Williams, Tracey A; Bzishvili, Samantha; North, Kathryn N; Payne, Jonathan M

    2017-05-01

    This study aimed to investigate face scan paths and face perception abilities in children with Neurofibromatosis Type 1 (NF1) and how these might relate to emotion recognition abilities in this population. The authors investigated facial emotion recognition, face scan paths, and face perception in 29 children with NF1 compared to 29 chronological age-matched typically developing controls. Correlations between facial emotion recognition, face scan paths, and face perception in children with NF1 were examined. Children with NF1 displayed significantly poorer recognition of fearful expressions compared to controls, as well as a nonsignificant trend toward poorer recognition of anger. Although there was no significant difference between groups in time spent viewing individual core facial features (eyes, nose, mouth, and nonfeature regions), children with NF1 spent significantly less time than controls viewing the face as a whole. Children with NF1 also displayed significantly poorer face perception abilities than typically developing controls. Facial emotion recognition deficits were not significantly associated with aberrant face scan paths or face perception abilities in the NF1 group. These results suggest that impairments in the perception, identification, and interpretation of information from faces are important aspects of the social-cognitive phenotype of NF1. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Encoding conditions affect recognition of vocally expressed emotions across cultures

    Directory of Open Access Journals (Sweden)

    Rebecca Jürgens

    2013-03-01

    Full Text Available Although the expression of emotions in humans is considered to be largely universal, cultural effects contribute to both emotion expression and recognition. To disentangle the interplay between these factors, play-acted and authentic (non-instructed) vocal expressions of emotions were used, on the assumption that cultural effects may contribute differentially to the recognition of staged and spontaneous emotions. Speech tokens depicting four emotions (anger, sadness, joy, fear) were obtained from German radio archives and reenacted by professional actors, and presented to 120 participants from Germany, Romania, and Indonesia. Participants in all three countries were poor at distinguishing between play-acted and spontaneous emotional utterances (58.73% correct on average with only marginal cultural differences). Nevertheless, authenticity influenced emotion recognition: across cultures, anger was recognized more accurately when play-acted (z = 15.06, p < .001) and sadness when authentic (z = 6.63, p < .001), replicating previous findings from German populations. German subjects revealed a slight advantage in recognizing emotions, indicating a moderate in-group advantage. There was no difference between Romanian and Indonesian subjects in the overall emotion recognition. Differential cultural effects became particularly apparent in terms of differential biases in emotion attribution. While all participants labeled play-acted expressions as anger more frequently than expected, German participants exhibited a further bias towards choosing anger for spontaneous stimuli. In contrast to the German sample, Romanian and Indonesian participants were biased towards choosing sadness. These results support the view that emotion recognition rests on a complex interaction of human universals and cultural specificities. Whether and in which way the observed biases are linked to cultural differences in self-construal remains an issue for further investigation.

  14. Encoding conditions affect recognition of vocally expressed emotions across cultures.

    Science.gov (United States)

    Jürgens, Rebecca; Drolet, Matthis; Pirow, Ralph; Scheiner, Elisabeth; Fischer, Julia

    2013-01-01

    Although the expression of emotions in humans is considered to be largely universal, cultural effects contribute to both emotion expression and recognition. To disentangle the interplay between these factors, play-acted and authentic (non-instructed) vocal expressions of emotions were used, on the assumption that cultural effects may contribute differentially to the recognition of staged and spontaneous emotions. Speech tokens depicting four emotions (anger, sadness, joy, fear) were obtained from German radio archives and re-enacted by professional actors, and presented to 120 participants from Germany, Romania, and Indonesia. Participants in all three countries were poor at distinguishing between play-acted and spontaneous emotional utterances (58.73% correct on average, with only marginal cultural differences). Nevertheless, authenticity influenced emotion recognition: across cultures, anger was recognized more accurately when play-acted (z = 15.06, p < .001) and sadness when authentic (z = 6.63, p < .001), replicating previous findings from German populations. German subjects revealed a slight advantage in recognizing emotions, indicating a moderate in-group advantage. There was no difference between Romanian and Indonesian subjects in overall emotion recognition. Differential cultural effects became particularly apparent in terms of differential biases in emotion attribution. While all participants labeled play-acted expressions as anger more frequently than expected, German participants exhibited a further bias toward choosing anger for spontaneous stimuli. In contrast to the German sample, Romanian and Indonesian participants were biased toward choosing sadness. These results support the view that emotion recognition rests on a complex interaction of human universals and cultural specificities. Whether and in which way the observed biases are linked to cultural differences in self-construal remains an issue for further investigation.

  15. Are there differential deficits in facial emotion recognition between paranoid and non-paranoid schizophrenia? A signal detection analysis.

    Science.gov (United States)

    Huang, Charles Lung-Cheng; Hsiao, Sigmund; Hwu, Hai-Gwo; Howng, Shen-Long

    2013-10-30

    This study assessed facial emotion recognition abilities in subjects with paranoid and non-paranoid schizophrenia using signal detection theory. We explored differential deficits in facial emotion recognition in 44 paranoid patients with schizophrenia (PS) and 30 non-paranoid patients with schizophrenia (NPS), compared to 80 healthy controls. We used morphed faces with different intensities of emotion and computed the sensitivity index (d') for each emotion. The results showed that performance differed between the schizophrenia groups and healthy controls in the recognition of both negative and positive affects. The PS group performed worse than the healthy controls but better than the NPS group in overall performance. Performance differed between the NPS group and healthy controls in the recognition of all basic emotions and neutral faces; between the PS group and healthy controls in the recognition of angry faces; and between the PS and NPS groups in the recognition of happiness, anger, sadness, disgust, and neutral affects. The facial emotion recognition impairment in schizophrenia may reflect a generalized deficit rather than a negative-emotion-specific deficit. The PS group performed worse than the control group, but better than the NPS group, in facial expression recognition, with differential deficits between PS and NPS patients. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
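
    The sensitivity index d' reported in such signal detection analyses is the z-transformed hit rate minus the z-transformed false-alarm rate. A minimal sketch, not the authors' code (the log-linear correction for extreme rates is an assumption):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate)."""
    # Log-linear correction keeps z finite when a rate would be 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# A participant detecting an emotion in 40 of 50 target faces while
# false-alarming on 10 of 50 non-targets:
print(d_prime(40, 10, 10, 40))
```

    Higher d' indicates better discrimination of the target emotion independent of response bias; chance performance gives d' = 0.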

  16. A new look at emotion perception: Concepts speed and shape facial emotion recognition.

    Science.gov (United States)

    Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil

    2015-10-01

    Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels, which correspond to discrete emotion concepts, affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia, who have difficulty labeling their own emotions, struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  17. Emotional and cognitive social processes are impaired in Parkinson's disease and are related to behavioral disorders.

    Science.gov (United States)

    Narme, Pauline; Mouras, Harold; Roussel, Martine; Duru, Cécile; Krystkowiak, Pierre; Godefroy, Olivier

    2013-03-01

    Parkinson's disease (PD) is associated with behavioral disorders that can affect social functioning but are poorly understood. Since emotional and cognitive social processes are known to be crucial in social relationships, impairment of these processes may account for the emergence of behavioral disorders. We used a systematic battery of tests to assess emotional processes and social cognition in PD patients and relate our findings to conventional neuropsychological data (especially behavioral disorders). Twenty-three PD patients and 46 controls (matched for age and educational level) were included in the study and underwent neuropsychological testing, including an assessment of the behavioral and cognitive components of executive function. Emotional and cognitive social processes were assessed with the Interpersonal Reactivity Index caregiver-administered questionnaire (as a measure of empathy), a facial emotion recognition task and two theory of mind (ToM) tasks. When compared with controls, PD patients showed low levels of empathy (p = .006), impaired facial emotion recognition (which persisted after correction for perceptual abilities) (p = .001), poor performance in a second-order ToM task (p = .008) that assessed both cognitive (p = .004) and affective (p = .03) inferences and, lastly, frequent dysexecutive behavioral disorders (in over 40% of the patients). Overall, impaired emotional and cognitive social functioning was observed in 17% of patients and was related to certain cognitive dysexecutive disorders. In terms of behavioral dysexecutive disorders, social behavior disorders were related to impaired emotional and cognitive social functioning (p = .04) but were independent of cognitive impairments. Emotional and cognitive social processes were found to be impaired in Parkinson's disease. This impairment may account for the emergence of social behavioral disorders. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  18. On the Time Course of Vocal Emotion Recognition

    Science.gov (United States)

    Pell, Marc D.; Kotz, Sonja A.

    2011-01-01

    How quickly do listeners recognize emotions from a speaker's voice, and does the time course for recognition vary by emotion type? To address these questions, we adapted the auditory gating paradigm to estimate how much vocal information is needed for listeners to categorize five basic emotions (anger, disgust, fear, sadness, happiness) and neutral utterances produced by male and female speakers of English. Semantically-anomalous pseudo-utterances (e.g., The rivix jolled the silling) conveying each emotion were divided into seven gate intervals according to the number of syllables that listeners heard from sentence onset. Participants (n = 48) judged the emotional meaning of stimuli presented at each gate duration interval, in a successive, blocked presentation format. Analyses looked at how recognition of each emotion evolves as an utterance unfolds and estimated the “identification point” for each emotion. Results showed that anger, sadness, fear, and neutral expressions are recognized more accurately at short gate intervals than happiness, and particularly disgust; however, as speech unfolds, recognition of happiness improves significantly towards the end of the utterance (and fear is recognized more accurately than other emotions). When the gate associated with the emotion identification point of each stimulus was calculated, data indicated that fear (M = 517 ms), sadness (M = 576 ms), and neutral (M = 510 ms) expressions were identified from shorter acoustic events than the other emotions. These data reveal differences in the underlying time course for conscious recognition of basic emotions from vocal expressions, which should be accounted for in studies of emotional speech processing. PMID:22087275

  19. State anxiety and emotional face recognition in healthy volunteers

    OpenAIRE

    Attwood, Angela S.; Easey, Kayleigh E.; Dalili, Michael N.; Skinner, Andrew L.; Woods, Andy; Crick, Lana; Ilett, Elizabeth; Penton-Voak, Ian S.; Munafò, Marcus R.

    2017-01-01

    High trait anxiety has been associated with detriments in emotional face processing. By contrast, relatively little is known about the effects of state anxiety on emotional face processing. We investigated the effects of state anxiety on recognition of emotional expressions (anger, sadness, surprise, disgust, fear and happiness) experimentally, using the 7.5% carbon dioxide (CO2) model to induce state anxiety, and in a large observational study. The experimental studies indicated reduced glob...

  20. The Primacy of Perceiving: Emotion Recognition Buffers Negative Effects of Emotional Labor

    Science.gov (United States)

    Bechtoldt, Myriam N.; Rohrmann, Sonja; De Pater, Irene E.; Beersma, Bianca

    2011-01-01

    There is ample empirical evidence for negative effects of emotional labor (surface acting and deep acting) on workers' well-being. This study analyzed to what extent workers' ability to recognize others' emotions may buffer these effects. In a 4-week study with 85 nurses and police officers, emotion recognition moderated the relationship between…

  1. Altered Kinematics of Facial Emotion Expression and Emotion Recognition Deficits Are Unrelated in Parkinson's Disease.

    Science.gov (United States)

    Bologna, Matteo; Berardelli, Isabella; Paparella, Giulia; Marsili, Luca; Ricciardi, Lucia; Fabbrini, Giovanni; Berardelli, Alfredo

    2016-01-01

    Altered emotional processing, including reduced facial emotion expression and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques, and it is not known whether altered facial expression and recognition in PD are related. To investigate possible deficits in facial emotion expression and emotion recognition, and their relationship, if any, eighteen patients with PD and 16 healthy controls were enrolled in this study. Facial expressions of emotion were recorded using a 3D optoelectronic system and analyzed using the facial action coding system. Possible deficits in emotion recognition were assessed using the Ekman test. Participants were assessed in one experimental session. Possible relationships between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients were evaluated using Spearman's test and multiple regression analysis. The facial expression of all six basic emotions had slower velocity and lower amplitude in patients in comparison to healthy controls (all Ps < 0.05). However, facial expression kinematics and emotion recognition deficits were unrelated in patients (all Ps > 0.05). Finally, no relationship emerged between kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients (all Ps > 0.05). The results of this study provide further evidence of altered emotional processing in PD. The lack of any correlation between altered facial emotion expression kinematics and emotion recognition deficits in patients suggests that these abnormalities are mediated by separate pathophysiological mechanisms.

  2. Utterance independent bimodal emotion recognition in spontaneous communication

    Science.gov (United States)

    Tao, Jianhua; Pan, Shifeng; Yang, Minghao; Li, Ya; Mu, Kaihui; Che, Jianfeng

    2011-12-01

    Emotion expressions are sometimes mixed with utterance-related movements in spontaneous face-to-face communication, which makes emotion recognition difficult. This article introduces methods for reducing utterance influences on the visual parameters used in audio-visual emotion recognition. The audio and visual channels are first combined under a Multistream Hidden Markov Model (MHMM). The utterance reduction is then performed by computing the residual between the real visual parameters and the output of an utterance-related visual parameter model. The article introduces a Fused Hidden Markov Model Inversion method, trained on a neutrally expressed audio-visual corpus, to solve this problem. To reduce computational complexity, the inversion model is further simplified to a Gaussian Mixture Model (GMM) mapping. Compared with traditional bimodal emotion recognition methods (e.g., SVM, CART, Boosting), the utterance reduction method gives better emotion recognition results. The experiments also show the effectiveness of the emotion recognition system when used in a live environment.
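
    The residual step can be illustrated with toy numbers. The sketch below is only a schematic of the idea (in the article the utterance-related part is predicted from audio by the fused HMM inversion or its GMM-mapping simplification; all values here are invented):

```python
# Observed facial parameters are modeled as an utterance-driven component
# plus an emotion-related residual. All numbers are invented.
observed_visual  = [1.2, 0.8, -0.3]   # tracked visual parameters for one frame
utterance_part   = [1.0, 0.9, -0.1]   # utterance-related part mapped from audio
emotion_residual = [o - u for o, u in zip(observed_visual, utterance_part)]
print(emotion_residual)
```

    The residual, rather than the raw visual parameters, is then fed to the emotion classifier, so lip and jaw motion caused by speaking is less likely to masquerade as expression.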

  3. Impaired face recognition is associated with social inhibition.

    Science.gov (United States)

    Avery, Suzanne N; VanDerKlok, Ross M; Heckers, Stephan; Blackford, Jennifer U

    2016-02-28

    Face recognition is fundamental to successful social interaction. Individuals with deficits in face recognition are likely to have social functioning impairments that may lead to heightened risk for social anxiety. A critical component of social interaction is how quickly a face is learned during initial exposure to a new individual. Here, we used a novel Repeated Faces task to assess how quickly memory for faces is established. Face recognition was measured over multiple exposures in 52 young adults ranging from low to high in social inhibition, a core dimension of social anxiety. High social inhibition was associated with a smaller slope of change in recognition memory over repeated face exposure, indicating participants with higher social inhibition showed smaller improvements in recognition memory after seeing faces multiple times. We propose that impaired face learning is an important mechanism underlying social inhibition and may contribute to, or maintain, social anxiety. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  4. State-dependent alteration in face emotion recognition in depression.

    Science.gov (United States)

    Anderson, Ian M; Shippen, Clare; Juhasz, Gabriella; Chase, Diana; Thomas, Emma; Downey, Darragh; Toth, Zoltan G; Lloyd-Williams, Kathryn; Elliott, Rebecca; Deakin, J F William

    2011-04-01

    Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data after receiving a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.

  5. Cost-Sensitive Learning for Emotion Robust Speaker Recognition

    Directory of Open Access Journals (Sweden)

    Dongdong Li

    2014-01-01

    Full Text Available In the field of information security, voice is one of the most important biometric traits. In particular, with the development of voice communication over the Internet and telephone systems, vast voice data resources have become accessible. In speaker recognition, the voiceprint can be applied as a unique password with which a user proves his or her identity. However, speech carrying various emotions can cause an unacceptably high error rate and degrade the performance of a speaker recognition system. This paper addresses the problem by introducing a cost-sensitive learning technique that reweights the probability of test affective utterances at the pitch-envelope level, which effectively enhances robustness in emotion-dependent speaker recognition. Based on this technique, a new architecture for the recognition system and its components is proposed. An experiment conducted on the Mandarin Affective Speech Corpus shows an 8% improvement in identification rate over traditional speaker recognition.

  6. Cost-sensitive learning for emotion robust speaker recognition.

    Science.gov (United States)

    Li, Dongdong; Yang, Yingchun; Dai, Weihui

    2014-01-01

    In the field of information security, voice is one of the most important biometric traits. In particular, with the development of voice communication over the Internet and telephone systems, vast voice data resources have become accessible. In speaker recognition, the voiceprint can be applied as a unique password with which a user proves his or her identity. However, speech carrying various emotions can cause an unacceptably high error rate and degrade the performance of a speaker recognition system. This paper addresses the problem by introducing a cost-sensitive learning technique that reweights the probability of test affective utterances at the pitch-envelope level, which effectively enhances robustness in emotion-dependent speaker recognition. Based on this technique, a new architecture for the recognition system and its components is proposed. An experiment conducted on the Mandarin Affective Speech Corpus shows an 8% improvement in identification rate over traditional speaker recognition.

  7. Weighted Feature Gaussian Kernel SVM for Emotion Recognition.

    Science.gov (United States)

    Wei, Wei; Jia, Qingxuan

    2016-01-01

    Emotion recognition with weighted features based on facial expression is a challenging research topic that has attracted great attention in the past few years. This paper presents a novel method that uses subregion recognition rates to weight the kernel function. First, we divide the facial expression image into uniform subregions and calculate the corresponding recognition rate and weight for each. Then, we obtain a weighted-feature Gaussian kernel function and construct a classifier based on a Support Vector Machine (SVM). Finally, the experimental results suggest that the approach based on the weighted-feature Gaussian kernel function achieves good recognition accuracy. Experiments on the extended Cohn-Kanade (CK+) dataset show that our method achieves encouraging recognition results compared to state-of-the-art methods.
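
    The weighting idea can be sketched as a Gaussian kernel whose squared distance is scaled per subregion. This is a simplified illustration rather than the paper's implementation: it uses one scalar feature per subregion, and the example weights and sigma are assumptions.

```python
import math

def weighted_gaussian_kernel(x, y, weights, sigma=1.0):
    """Gaussian kernel with each subregion's contribution scaled by a weight
    derived from that subregion's recognition rate."""
    sq_dist = sum(w * (a - b) ** 2 for w, a, b in zip(weights, x, y))
    return math.exp(-sq_dist / (2 * sigma ** 2))

# Hypothetical per-subregion recognition rates (e.g. eyes, nose, mouth),
# normalized into kernel weights:
rates = [0.85, 0.60, 0.75]
weights = [r / sum(rates) for r in rates]

k_same = weighted_gaussian_kernel([1.0, 2.0, 0.5], [1.0, 2.0, 0.5], weights)
k_diff = weighted_gaussian_kernel([1.0, 2.0, 0.5], [0.0, 1.0, 1.5], weights)
```

    Because the function is symmetric and equals an ordinary RBF kernel on rescaled features, it can be supplied to any SVM implementation that accepts a callable or precomputed kernel.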

  8. Is emotional memory enhancement preserved in amnestic mild cognitive impairment? Evidence from separating recollection and familiarity.

    Science.gov (United States)

    Wang, Pengyun; Li, Juan; Li, Huijie; Li, Bing; Jiang, Yang; Bao, Feng; Zhang, Shouzi

    2013-11-01

    This study investigated whether the observed absence of emotional memory enhancement in recognition tasks in patients with amnestic mild cognitive impairment (aMCI) could be related to their greater proportion of familiarity-based responses for all stimuli, and whether recognition tests with emotional items had better discriminative power for aMCI patients than those with neutral items. In total, 31 aMCI patients and 30 healthy older adults participated in a recognition test followed by remember/know judgments. Positive, neutral, and negative faces were used as stimuli. For overall recognition performance, emotional memory enhancement was found only in healthy controls; they remembered more negative and positive stimuli than neutral ones. For "remember" responses, we found equivalent emotional memory enhancement in both groups, though a greater proportion of "remember" responses was observed in normal controls. For "know" responses, aMCI patients presented a larger proportion than normal controls did, and their "know" responses were not affected by emotion. A negative correlation was found between emotional enhancement effect and the memory performance related to "know" responses. In addition, receiver operating characteristic curve analysis revealed higher diagnostic accuracy for recognition test with emotional stimuli than with neutral stimuli. The present results implied that the absence of the emotional memory enhancement effect in aMCI patients might be related to their tendency to rely more on familiarity-based "know" responses for all stimuli. Furthermore, recognition memory tests using emotional stimuli may be better able than neutral stimuli to differentiate people with aMCI from cognitively normal older adults. PsycINFO Database Record (c) 2013 APA, all rights reserved.
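
    The diagnostic-accuracy comparison rests on the area under the ROC curve, which can be computed directly from scores via its rank interpretation. A minimal sketch with invented scores (not the study's data):

```python
def roc_auc(scores_pos, scores_neg):
    """AUC = probability that a randomly chosen positive case scores higher
    than a randomly chosen negative case, ties counting one half
    (the Mann-Whitney U statistic divided by n_pos * n_neg)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Invented recognition-memory scores: healthy controls vs. aMCI patients.
auc_emotional = roc_auc([0.80, 0.75, 0.90], [0.40, 0.55, 0.50])
auc_neutral   = roc_auc([0.70, 0.60, 0.65], [0.55, 0.62, 0.50])
```

    An AUC of 0.5 is chance discrimination; values nearer 1.0 indicate that a test separates the two groups better, which is the sense in which a recognition test with emotional stimuli can show higher diagnostic accuracy than one with neutral stimuli.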

  9. Gender differences in emotion recognition: Impact of sensory modality and emotional category.

    Science.gov (United States)

    Lambrecht, Lena; Kreifelts, Benjamin; Wildgruber, Dirk

    2014-04-01

    Results from studies on gender differences in emotion recognition vary, depending on the types of emotion and the sensory modalities used for stimulus presentation. This makes comparability between different studies problematic. This study investigated emotion recognition in healthy participants (N = 84; 40 males; ages 20 to 70 years), using dynamic stimuli displayed by two genders in three different sensory modalities (auditory, visual, audio-visual) and five emotional categories. The participants were asked to categorise the stimuli on the basis of their nonverbal emotional content (happy, alluring, neutral, angry, and disgusted). Hit rates and category selection biases were analysed. Women were found to be more accurate in recognition of emotional prosody. This effect was partially mediated by hearing loss at 8,000 Hz. Moreover, there was a gender-specific selection bias for alluring stimuli: men, as compared to women, chose "alluring" more often when a stimulus was presented by a woman than by a man.

  10. An Investigation of Emotion Recognition and Theory of Mind in People with Chronic Heart Failure

    Science.gov (United States)

    Habota, Tina; McLennan, Skye N.; Cameron, Jan; Ski, Chantal F.; Thompson, David R.; Rendell, Peter G.

    2015-01-01

    Objectives Cognitive deficits are common in patients with chronic heart failure (CHF), but no study has investigated whether these deficits extend to social cognition. The present study provided the first empirical assessment of emotion recognition and theory of mind (ToM) in patients with CHF. In addition, it assessed whether each of these social cognitive constructs was associated with more general cognitive impairment. Methods A group comparison design was used, with 31 CHF patients compared to 38 demographically matched controls. The Ekman Faces test was used to assess emotion recognition, and the Mind in the Eyes test to measure ToM. Measures assessing global cognition, executive functions, and verbal memory were also administered. Results There were no differences between groups on emotion recognition or ToM. The CHF group’s performance was poorer on some executive measures, but memory was relatively preserved. In the CHF group, both emotion recognition performance and ToM ability correlated moderately with global cognition (r = .38, p = .034; r = .49, p = .005, respectively), but not with executive function or verbal memory. Conclusion CHF patients with lower cognitive ability were more likely to have difficulty recognizing emotions and inferring the mental states of others. Clinical implications of these findings are discussed. PMID:26529409

  11. Is the emotion recognition deficit associated with frontotemporal dementia caused by selective inattention to diagnostic facial features?

    Science.gov (United States)

    Oliver, Lindsay D; Virani, Karim; Finger, Elizabeth C; Mitchell, Derek G V

    2014-07-01

    Frontotemporal dementia (FTD) is a debilitating neurodegenerative disorder characterized by severely impaired social and emotional behaviour, including emotion recognition deficits. Though fear recognition impairments seen in particular neurological and developmental disorders can be ameliorated by reallocating attention to critical facial features, the possibility that similar benefits can be conferred to patients with FTD has yet to be explored. In the current study, we examined the impact of presenting distinct regions of the face (whole face, eyes-only, and eyes-removed) on the ability to recognize expressions of anger, fear, disgust, and happiness in 24 patients with FTD and 24 healthy controls. A recognition deficit was demonstrated across emotions by patients with FTD relative to controls. Crucially, removal of diagnostic facial features resulted in an appropriate decline in performance for both groups; furthermore, patients with FTD demonstrated a lack of disproportionate improvement in emotion recognition accuracy as a result of isolating critical facial features relative to controls. Thus, unlike some neurological and developmental disorders featuring amygdala dysfunction, the emotion recognition deficit observed in FTD is not likely driven by selective inattention to critical facial features. Patients with FTD also mislabelled negative facial expressions as happy more often than controls, providing further evidence for abnormalities in the representation of positive affect in FTD. This work suggests that the emotional expression recognition deficit associated with FTD is unlikely to be rectified by adjusting selective attention to diagnostic features, as has proven useful in other select disorders. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Speaker emotion recognition: from classical classifiers to deep neural networks

    Science.gov (United States)

    Mezghani, Eya; Charfeddine, Maha; Nicolas, Henri; Ben Amar, Chokri

    2018-04-01

    Speaker emotion recognition has been considered one of the most challenging tasks in recent years. Indeed, automatic systems for security, medicine, or education can be improved by taking the affective state of speech into account. In this paper, a twofold approach to speech emotion classification is proposed: first, a relevant set of features is adopted; second, numerous supervised training techniques, ranging from classic methods to deep learning, are evaluated. Experimental results indicate that deep architectures can improve classification performance on two affective databases, the Berlin Database of Emotional Speech and the Surrey Audio-Visual Expressed Emotion (SAVEE) dataset.

  13. Cognitive penetrability and emotion recognition in human facial expressions

    Directory of Open Access Journals (Sweden)

    Francesco Marchi

    2015-06-01

    Full Text Available Do our background beliefs, desires, and mental images influence our perceptual experience of the emotions of others? In this paper, we will address the possibility of cognitive penetration of perceptual experience in the domain of social cognition. In particular, we focus on emotion recognition based on the visual experience of facial expressions. After introducing the current debate on cognitive penetration, we review examples of perceptual adaptation for facial expressions of emotion. This evidence supports the idea that facial expressions are perceptually processed as wholes. That is, the perceptual system integrates lower-level facial features, such as eyebrow orientation, mouth angle etc., into facial compounds. We then present additional experimental evidence showing that in some cases, emotion recognition on the basis of facial expression is sensitive to and modified by the background knowledge of the subject. We argue that such sensitivity is best explained as a difference in the visual experience of the facial expression, not just as a modification of the judgment based on this experience. The difference in experience is characterized as the result of the interference of background knowledge with the perceptual integration process for faces. Thus, according to the best explanation, we have to accept cognitive penetration in some cases of emotion recognition. Finally, we highlight a recent model of social vision in order to propose a mechanism for cognitive penetration used in the face-based recognition of emotion.

  14. A change in strategy: Static emotion recognition in Malaysian Chinese

    Directory of Open Access Journals (Sweden)

    Chrystalle B.Y. Tan

    2015-12-01

    Full Text Available Studies have shown that while East Asians focus on the center of the face to recognize identities, they adapt their strategy by focusing more on the eyes to identify emotions, suggesting that the eyes may contain salient information pertaining to emotional state in Eastern cultures. Western Caucasians, however, employ the same strategy for both tasks, moving between the eyes and mouth to identify both identities and emotions. Malaysian Chinese have been shown to focus on the eyes and nose more than the mouth during face recognition tasks, which represents an intermediate between Eastern and Western looking strategies. The current study examined whether Malaysian Chinese continue to employ an intermediate strategy or shift towards an Eastern or Western pattern (by fixating more on the eyes or mouth, respectively) during an emotion recognition task. Participants focused more on the eyes, followed by the nose and then the mouth. Directing attention towards the eye region resulted in better recognition of certain own-race than other-race emotions. Although the fixation patterns appear similar for both tasks, further analyses showed that fixations on the eyes were reduced whereas fixations on the nose and mouth were increased during emotion recognition, indicating that participants adapt looking strategies based on their aims.

  15. The level of cognitive function and recognition of emotions in older adults.

    Science.gov (United States)

    Virtanen, Marianna; Singh-Manoux, Archana; Batty, G David; Ebmeier, Klaus P; Jokela, Markus; Harmer, Catherine J; Kivimäki, Mika

    2017-01-01

    The association between cognitive decline and the ability to recognise emotions in interpersonal communication is not well understood. We aimed to investigate the association between cognitive function and the ability to recognise emotions in other people's facial expressions across the full continuum of cognitive capacity. Cross-sectional analysis of 4039 participants (3016 men, 1023 women aged 59 to 82 years) in the Whitehall II study. Cognitive function was assessed using the 30-item Mini-Mental State Examination (MMSE), further classified into 8 groups: 30, 29, 28, 27, 26, 25, 24, and <24 (possible dementia) MMSE points. The Facial Expression Recognition Task (FERT) was used to examine recognition of anger, fear, disgust, sadness, and happiness. The multivariable-adjusted difference in the percentage of accurate recognition between the highest and lowest MMSE groups was 14.9 (95% CI, 11.1-18.7) for anger, 15.5 (11.9-19.2) for fear, 18.5 (15.2-21.8) for disgust, 11.6 (7.3-16.0) for sadness, and 6.3 (3.1-9.4) for happiness. However, recognition of several emotions was already reduced after a 1- to 2-point decrease in MMSE, and with each further point lost, recognition worsened at an accelerating rate. The ability to recognize emotion in facial expressions is affected at an early stage of cognitive impairment and might decline at an accelerated rate with the deterioration of cognitive function. Accurate recognition of happiness seems to be less affected by a severe decline in cognitive performance than recognition of negatively valenced emotions.

  16. The level of cognitive function and recognition of emotions in older adults.

    Directory of Open Access Journals (Sweden)

    Marianna Virtanen

    Full Text Available The association between cognitive decline and the ability to recognise emotions in interpersonal communication is not well understood. We aimed to investigate the association between cognitive function and the ability to recognise emotions in other people's facial expressions across the full continuum of cognitive capacity. Cross-sectional analysis of 4039 participants (3016 men, 1023 women aged 59 to 82 years) in the Whitehall II study. Cognitive function was assessed using the 30-item Mini-Mental State Examination (MMSE), further classified into 8 groups: 30, 29, 28, 27, 26, 25, 24, and <24 (possible dementia) MMSE points. The Facial Expression Recognition Task (FERT) was used to examine recognition of anger, fear, disgust, sadness, and happiness. The multivariable-adjusted difference in the percentage of accurate recognition between the highest and lowest MMSE groups was 14.9 (95% CI, 11.1-18.7) for anger, 15.5 (11.9-19.2) for fear, 18.5 (15.2-21.8) for disgust, 11.6 (7.3-16.0) for sadness, and 6.3 (3.1-9.4) for happiness. However, recognition of several emotions was already reduced after a 1- to 2-point decrease in MMSE, and with each further point lost, recognition worsened at an accelerating rate. The ability to recognize emotion in facial expressions is affected at an early stage of cognitive impairment and might decline at an accelerated rate with the deterioration of cognitive function. Accurate recognition of happiness seems to be less affected by a severe decline in cognitive performance than recognition of negatively valenced emotions.

  17. VIRTUAL AVATAR FOR EMOTION RECOGNITION IN PATIENTS WITH SCHIZOPHRENIA: A PILOT STUDY

    Directory of Open Access Journals (Sweden)

    Samuel Marcos Pablos

    2016-08-01

    Full Text Available Persons who suffer from schizophrenia have difficulties recognizing emotions in others' facial expressions, which affects their capacity for social interaction and hinders their social integration. Photographic images have traditionally been used to explore emotion recognition impairments in schizophrenia patients, but they lack the dynamism that is inherent to face-to-face social interactions. To overcome this limitation, the present work uses an animated, virtual face. The avatar has the appearance of a highly realistic human face and is able to express different emotions dynamically, which gives it advantages over photograph-based approaches. We present the results of a pilot study conducted to assess the validity of the interface as a tool for clinical psychiatrists. Twenty subjects with long-standing schizophrenia and 20 control subjects were invited to recognize a set of facial emotions shown by a virtual avatar and in images. The objective of the study was to explore the possibilities of using a realistic-looking avatar for the assessment of emotion recognition deficits in patients with schizophrenia. Our results suggest that the proposed avatar may be a suitable tool for the diagnosis and treatment of deficits in the facial recognition of emotions.

  18. Recognition of a Baby's Emotional Cry towards Robotics Baby Caregiver

    Directory of Open Access Journals (Sweden)

    Shota Yamamoto

    2013-02-01

    Full Text Available We developed a method for pattern recognition of a baby's emotions (discomfort, hunger, or sleepiness) expressed in the baby's cries. A 32-dimensional fast Fourier transform is performed on sound-form clips, detected by our previously reported method, and used as training data. The power of the sound form judged to be a silent region is subtracted from the power of each frequency element, and the power of each frequency element after this subtraction is treated as one element of the feature vector. We perform principal component analysis (PCA) on the feature vectors of the training data. The baby's emotion is recognized by the nearest-neighbor criterion applied to the feature vector obtained from the test-data sound-form clips, after projecting the feature vector onto the PCA space derived from the training data. The emotion with the highest frequency among the recognition results for a sound-form clip is then judged to be the emotion expressed by the baby's cry. We successfully applied the proposed method to pattern recognition of a baby's emotions. The present investigation concerns the first stage of the development of a robotic baby caregiver that can detect a baby's emotions; in this first stage, we have developed a method for detecting those emotions. We expect that the proposed method could be used in robots that help take care of babies.
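The pipeline in this abstract (FFT power features, silence-power subtraction, PCA projection, nearest-neighbour labelling, and a majority vote over clips) can be sketched as follows. This is a minimal NumPy illustration under assumed clip shapes and helper names, not the authors' implementation.

```python
import numpy as np

def clip_features(clip, silence_power, n_fft=32):
    """32-point FFT power spectrum with the silent-region power subtracted."""
    spectrum = np.abs(np.fft.rfft(clip, n=n_fft)) ** 2
    return np.maximum(spectrum - silence_power, 0.0)

def fit_pca(features, n_components=2):
    """Mean and top principal components of the training feature vectors."""
    mean = features.mean(axis=0)
    _, _, vt = np.linalg.svd(features - mean, full_matrices=False)
    return mean, vt[:n_components]

def classify_cry(clips, train_feats, train_labels, mean, components, silence_power):
    """Nearest-neighbour label per clip in PCA space, then a majority vote."""
    train_proj = (train_feats - mean) @ components.T
    votes = []
    for clip in clips:
        proj = (clip_features(clip, silence_power) - mean) @ components.T
        nearest = np.argmin(np.linalg.norm(train_proj - proj, axis=1))
        votes.append(train_labels[nearest])
    values, counts = np.unique(votes, return_counts=True)
    return values[np.argmax(counts)]
```

With synthetic sine-wave "cries" at distinct frequencies standing in for real recordings, clips resembling the "hungry" prototype outvote a single "sleepy" clip.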

  19. Altered Kinematics of Facial Emotion Expression and Emotion Recognition Deficits Are Unrelated in Parkinson's Disease

    OpenAIRE

    Bologna, Matteo; Berardelli, Isabella; Paparella, Giulia; Marsili, Luca; Ricciardi, Lucia; Fabbrini, Giovanni; Berardelli, Alfredo

    2016-01-01

    Background Altered emotional processing, including reduced emotion facial expression and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques. It is not known whether altered facial expression and recognition in PD are related. Objective To investigate possible deficits in facial emotion expression and emotion recognition and their...

  20. Sleep deprivation impairs object recognition in mice

    NARCIS (Netherlands)

    Palchykova, S; Winsky-Sommerer, R; Meerlo, P; Durr, R; Tobler, Irene

    2006-01-01

    Many studies in animals and humans suggest that sleep facilitates learning, memory consolidation, and retrieval. Moreover, sleep deprivation (SD) incurred after learning impaired memory in humans, mice, rats, and hamsters. We investigated the importance of sleep and its timing in object

  1. Pattern Recognition Methods and Features Selection for Speech Emotion Recognition System.

    Science.gov (United States)

    Partila, Pavol; Voznak, Miroslav; Tovarek, Jaromir

    2015-01-01

    The impact of the classification method and of feature selection on speech emotion recognition accuracy is discussed in this paper. Selecting the correct parameters in combination with the classifier is an important part of reducing the computational complexity of the system, a step that is necessary especially for systems that will be deployed in real-time applications. The motivation for developing and improving speech emotion recognition systems is their wide applicability in today's automatic voice-controlled systems. The Berlin database of emotional recordings was used in this experiment. The classification accuracy of artificial neural networks, k-nearest neighbours, and Gaussian mixture models is measured with respect to the selection of prosodic, spectral, and voice-quality features. The purpose was to find an optimal combination of methods and feature groups for stress detection in human speech. The research contribution lies in the design of a speech emotion recognition system that is both accurate and efficient.
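As a rough illustration of one of the classifiers compared above, the following sketch implements a k-nearest-neighbours classifier whose accuracy can be measured on different feature subsets. The feature vectors and labels are synthetic stand-ins, not data from the Berlin database, and the helper names are invented.

```python
import numpy as np

def knn_predict(train_x, train_y, query, k=3):
    """Majority label among the k nearest training vectors (Euclidean)."""
    dists = np.linalg.norm(train_x - query, axis=1)
    nearest = train_y[np.argsort(dists)[:k]]
    values, counts = np.unique(nearest, return_counts=True)
    return values[np.argmax(counts)]

def subset_accuracy(train_x, train_y, test_x, test_y, feature_idx, k=3):
    """kNN accuracy when only the chosen feature columns are used."""
    hits = sum(
        knn_predict(train_x[:, feature_idx], train_y, q[feature_idx], k) == y
        for q, y in zip(test_x, test_y)
    )
    return hits / len(test_y)
```

Comparing `subset_accuracy` across prosodic, spectral, and voice-quality feature indices is the kind of sweep the abstract describes.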

  2. Pattern Recognition Methods and Features Selection for Speech Emotion Recognition System

    Directory of Open Access Journals (Sweden)

    Pavol Partila

    2015-01-01

    Full Text Available The impact of the classification method and of feature selection on speech emotion recognition accuracy is discussed in this paper. Selecting the correct parameters in combination with the classifier is an important part of reducing the computational complexity of the system, a step that is necessary especially for systems that will be deployed in real-time applications. The motivation for developing and improving speech emotion recognition systems is their wide applicability in today's automatic voice-controlled systems. The Berlin database of emotional recordings was used in this experiment. The classification accuracy of artificial neural networks, k-nearest neighbours, and Gaussian mixture models is measured with respect to the selection of prosodic, spectral, and voice-quality features. The purpose was to find an optimal combination of methods and feature groups for stress detection in human speech. The research contribution lies in the design of a speech emotion recognition system that is both accurate and efficient.

  3. Glucocorticoid effects on object recognition memory require training-associated emotional arousal.

    Science.gov (United States)

    Okuda, Shoki; Roozendaal, Benno; McGaugh, James L

    2004-01-20

    Considerable evidence implicates glucocorticoid hormones in the regulation of memory consolidation and memory retrieval. The present experiments investigated whether the influence of these hormones on memory depends on the level of emotional arousal induced by the training experience. We investigated this issue in male Sprague-Dawley rats by examining the effects of immediate posttraining systemic injections of the glucocorticoid corticosterone on object recognition memory under two conditions that differed in their training-associated emotional arousal. In rats that were not previously habituated to the experimental context, corticosterone (0.3, 1.0, or 3.0 mg/kg, s.c.) administered immediately after a 3-min training trial enhanced 24-hr retention performance in an inverted-U shaped dose-response relationship. In contrast, corticosterone did not affect 24-hr retention of rats that received extensive prior habituation to the experimental context and, thus, had decreased novelty-induced emotional arousal during training. Additionally, immediate posttraining administration of corticosterone to nonhabituated rats, in doses that enhanced 24-hr retention, impaired object recognition performance at a 1-hr retention interval whereas corticosterone administered after training to well-habituated rats did not impair 1-hr retention. Thus, the present findings suggest that training-induced emotional arousal may be essential for glucocorticoid effects on object recognition memory.

  4. A New Fuzzy Cognitive Map Learning Algorithm for Speech Emotion Recognition

    OpenAIRE

    Zhang, Wei; Zhang, Xueying; Sun, Ying

    2017-01-01

    Selecting an appropriate recognition method is crucial in speech emotion recognition applications. However, the current methods do not consider the relationship between emotions. Thus, in this study, a speech emotion recognition system based on the fuzzy cognitive map (FCM) approach is constructed. Moreover, a new FCM learning algorithm for speech emotion recognition is proposed. This algorithm includes the use of the pleasure-arousal-dominance emotion scale to calculate the weights between e...

  5. EXTENDED SPEECH EMOTION RECOGNITION AND PREDICTION

    Directory of Open Access Journals (Sweden)

    Theodoros Anagnostopoulos

    2014-11-01

    Full Text Available Humans are considered to reason and act rationally, and this is believed to be what fundamentally distinguishes them from other living beings. Furthermore, modern approaches in psychology underline that humans, as thinking creatures, are also sentimental and emotional organisms. There are fifteen universal extended emotions plus a neutral state: hot anger, cold anger, panic, fear, anxiety, despair, sadness, elation, happiness, interest, boredom, shame, pride, disgust, contempt, and neutral. The scope of the current research is to understand the emotional state of a human being by capturing the speech utterances used during an ordinary conversation. It is shown that, given sufficient acoustic evidence, the emotional state of a person can be classified by a set of majority-voting classifiers. The proposed set is based on three main classifiers: kNN, C4.5, and SVM with an RBF kernel, and it achieves better performance than each basic classifier taken separately. It is compared with two other sets of classifiers: a one-against-all (OAA) multiclass SVM with hybrid kernels, and a set consisting of two basic classifiers, C5.0 and a neural network; the proposed variant achieves better performance than both. The paper deals with emotion classification by a set of majority-voting classifiers that combines three types of basic classifiers with low computational complexity. The basic classifiers stem from different theoretical backgrounds in order to avoid bias and redundancy, which gives the proposed set of classifiers the ability to generalize in the emotion domain space.
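A hard majority vote over heterogeneous base classifiers, as described above, can be sketched with scikit-learn. The dataset here is synthetic, and C4.5 is approximated by scikit-learn's CART decision tree, since C4.5 itself is not in the library; this is an illustrative stand-in, not the paper's system.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic multi-class data standing in for acoustic feature vectors.
X, y = make_classification(n_samples=300, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Hard voting: each base classifier casts one vote per sample.
ensemble = VotingClassifier(
    estimators=[
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("tree", DecisionTreeClassifier(random_state=0)),  # CART stand-in for C4.5
        ("svm", SVC(kernel="rbf", gamma="scale")),
    ],
    voting="hard",
)
ensemble.fit(X_tr, y_tr)
score = ensemble.score(X_te, y_te)
```

Because the base learners come from different theoretical families, their errors tend to be less correlated, which is the rationale the abstract gives for the ensemble.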

  6. Speech emotion recognition based on statistical pitch model

    Institute of Scientific and Technical Information of China (English)

    WANG Zhiping; ZHAO Li; ZOU Cairong

    2006-01-01

    A modified Parzen-window method, which keeps high resolution at low frequencies and smoothness at high frequencies, is proposed to obtain a statistical model. A gender classification method utilizing this statistical model is then proposed, which achieves 98% accuracy when long sentences are processed. After separating male and female voices, the mean and standard deviation of the speech training samples for each emotion are used to create the corresponding emotion models, and the Bhattacharyya distances between the test sample and the statistical pitch models are utilized for emotion recognition in speech. Normalization of pitch for male and female voices is also considered, in order to map them into a uniform space. Finally, a speech emotion recognition experiment based on K-nearest neighbors shows that a correct rate of 81% is achieved, compared with only 73.85% when the traditional parameters are utilized.
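For two univariate Gaussian pitch models (mean and standard deviation per emotion), the Bhattacharyya distance has a closed form, which is presumably what the matching step uses. The sketch below is a generic implementation of that formula with invented example pitch statistics, not the paper's code.

```python
import math

def bhattacharyya_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form Bhattacharyya distance between two 1-D Gaussians."""
    v1, v2 = sigma1 ** 2, sigma2 ** 2
    return (0.25 * math.log(0.25 * (v1 / v2 + v2 / v1 + 2.0))
            + 0.25 * (mu1 - mu2) ** 2 / (v1 + v2))

def nearest_emotion(test_mu, test_sigma, models):
    """Pick the emotion whose pitch model is closest in Bhattacharyya distance."""
    return min(models, key=lambda e: bhattacharyya_gaussian(
        test_mu, test_sigma, models[e][0], models[e][1]))
```

For example, a test sample whose pitch statistics sit near a hypothetical high-pitch "anger" model is assigned to that model rather than to a lower-pitch "sadness" model.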

  7. The Influence of Emotion on Recognition Memory for Scenes

    OpenAIRE

    Pryde, Beatrice

    2012-01-01

    According to dual-process models, recognition memory is supported by two distinct processes: familiarity, a relatively automatic process that involves the retrieval of a previously encountered item, and recollection, a more effortful process that involves the retrieval of information associated with the context in which an item was encoded (Mickes, Wais & Wixted, 2009). There is a wealth of research suggesting that recognition memory performance is affected by the emotional content of stimul...

  8. Emotion and language: Valence and arousal affect word recognition

    Science.gov (United States)

    Brysbaert, Marc; Warriner, Amy Beth

    2014-01-01

    Emotion influences most aspects of cognition and behavior, but emotional factors are conspicuously absent from current models of word recognition. The influence of emotion on word recognition has mostly been reported in prior studies on the automatic vigilance for negative stimuli, but the precise nature of this relationship is unclear. Various models of automatic vigilance have claimed that the effect of valence on response times is categorical, an inverted-U, or interactive with arousal. The present study used a sample of 12,658 words, and included many lexical and semantic control factors, to determine the precise nature of the effects of arousal and valence on word recognition. Converging empirical patterns observed in word-level and trial-level data from lexical decision and naming indicate that valence and arousal exert independent monotonic effects: Negative words are recognized more slowly than positive words, and arousing words are recognized more slowly than calming words. Valence explained about 2% of the variance in word recognition latencies, whereas the effect of arousal was smaller. Valence and arousal do not interact, but both interact with word frequency, such that valence and arousal exert larger effects among low-frequency words than among high-frequency words. These results necessitate a new model of affective word processing whereby the degree of negativity monotonically and independently predicts the speed of responding. This research also demonstrates that incorporating emotional factors, especially valence, improves the performance of models of word recognition. PMID:24490848

  9. Emotion Recognition in Face and Body Motion in Bulimia Nervosa.

    Science.gov (United States)

    Dapelo, Marcela Marin; Surguladze, Simon; Morris, Robin; Tchanturia, Kate

    2017-11-01

    Social cognition has been studied extensively in anorexia nervosa (AN), but there are few studies in bulimia nervosa (BN). This study investigated the ability of people with BN to recognise emotions in ambiguous facial expressions and in body movement. Participants were 26 women with BN, who were compared with 35 with AN, and 42 healthy controls. Participants completed an emotion recognition task by using faces portraying blended emotions, along with a body emotion recognition task by using videos of point-light walkers. The results indicated that BN participants exhibited difficulties recognising disgust in less-ambiguous facial expressions, and a tendency to interpret non-angry faces as anger, compared with healthy controls. These difficulties were similar to those found in AN. There were no significant differences amongst the groups in body motion emotion recognition. The findings suggest that difficulties with disgust and anger recognition in facial expressions may be shared transdiagnostically in people with eating disorders. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  10. Face and emotion recognition deficits in Turner syndrome: a possible role for X-linked genes in amygdala development.

    Science.gov (United States)

    Lawrence, Kate; Kuntsi, Jonna; Coleman, Michael; Campbell, Ruth; Skuse, David

    2003-01-01

    Face recognition is thought to rely on configural visual processing. Where face recognition impairments have been identified, qualitatively delayed or anomalous configural processing has also been found. A group of women with Turner syndrome (TS) with monosomy for a single maternal X chromosome (45, Xm) showed an impairment in face recognition skills compared with normally developing women. However, normal configural face-processing abilities were apparent. The ability to recognize facial expressions of emotion, particularly fear, was also impaired in this TS subgroup. Face recognition and fear recognition accuracy were significantly correlated in the female control group but not in women with TS. The authors therefore suggest that anomalies in amygdala function may be a neurological feature of TS of this karyotype.

  11. Impaired Perception of Emotional Expression in Amyotrophic Lateral Sclerosis.

    Science.gov (United States)

    Oh, Seong Il; Oh, Ki Wook; Kim, Hee Jin; Park, Jin Seok; Kim, Seung Hyun

    2016-07-01

    The increasing recognition that deficits in social emotions occur in amyotrophic lateral sclerosis (ALS) is helping to explain the spectrum of neuropsychological dysfunctions, thus supporting the view of ALS as a multisystem disorder involving neuropsychological deficits as well as motor deficits. The aim of this study was to characterize the emotion perception abilities of Korean patients with ALS based on the recognition of facial expressions. Twenty-four patients with ALS and 24 age- and sex-matched healthy controls completed neuropsychological tests and facial emotion recognition tasks [ChaeLee Korean Facial Expressions of Emotions (ChaeLee-E)]. The ChaeLee-E test includes facial expressions for seven emotions: happiness, sadness, anger, disgust, fear, surprise, and neutral. The ability to perceive facial emotions was significantly worse among ALS patients than among healthy controls [65.2±18.0% vs. 77.1±6.6% (mean±SD), p=0.009]. Eight of the 24 patients (33%) scored below the 5th percentile of controls for recognizing facial emotions. Emotion perception deficits occur in Korean ALS patients, particularly regarding facial expressions of emotion. These findings expand the spectrum of cognitive and behavioral dysfunction associated with ALS to include emotion processing dysfunction.

  12. Is emotion recognition the only problem in ADHD? effects of pharmacotherapy on face and emotion recognition in children with ADHD.

    Science.gov (United States)

    Demirci, Esra; Erdogan, Ayten

    2016-12-01

    The objectives of this study were to evaluate both face and emotion recognition, to detect differences among attention deficit and hyperactivity disorder (ADHD) subgroups, to identify effects of gender and to assess the effects of methylphenidate and atomoxetine treatment on both face and emotion recognition in patients with ADHD. The study sample consisted of 41 male, 29 female patients, 8-15 years of age, who were diagnosed as having combined type ADHD (N = 26), hyperactive/impulsive type ADHD (N = 21) or inattentive type ADHD (N = 23) but had not previously used any medication for ADHD, and 35 male, 25 female healthy individuals. Long-acting methylphenidate (OROS-MPH) was prescribed to 38 patients, whereas atomoxetine was prescribed to 32 patients. The Reading the Mind in the Eyes Test (RMET) and the Benton Face Recognition Test (BFRT) were administered to all participants before and after treatment. The patients with ADHD had a significantly lower number of correct answers on the child and adolescent RMET and on the BFRT than the healthy controls. Among the ADHD subtypes, the hyperactive/impulsive subtype had a lower number of correct answers on the RMET than the inattentive subtype, and a lower number of correct answers on the short and long forms of the BFRT than the combined and inattentive subtypes. Male and female patients with ADHD did not differ significantly with respect to the number of correct answers on the RMET and BFRT. The patients showed significant improvement on the RMET and BFRT after treatment with OROS-MPH or atomoxetine. Patients with ADHD have difficulties in face recognition as well as emotion recognition. Both OROS-MPH and atomoxetine affect emotion recognition. However, further studies on face and emotion recognition in ADHD are needed.

  13. Development of Perceptual Expertise in Emotion Recognition

    Science.gov (United States)

    Pollak, Seth D.; Messner, Michael; Kistler, Doris J.; Cohn, Jeffrey F.

    2009-01-01

    How do children's early social experiences influence their perception of emotion-specific information communicated by the face? To examine this question, we tested a group of abused children who had been exposed to extremely high levels of parental anger expression and physical threat. Children were presented with arrays of stimuli that depicted…

  14. Emotion Recognition in Children with Down Syndrome: Influence of Emotion Label and Expression Intensity

    Science.gov (United States)

    Cebula, Katie R.; Wishart, Jennifer G.; Willis, Diane S.; Pitcairn, Tom K.

    2017-01-01

    Some children with Down syndrome may experience difficulties in recognizing facial emotions, particularly fear, but it is not clear why, nor how such skills can best be facilitated. Using a photo-matching task, emotion recognition was tested in children with Down syndrome, children with nonspecific intellectual disability and cognitively matched,…

  15. Optical character recognition reading aid for the visually impaired.

    Science.gov (United States)

    Grandin, Juan Carlos; Cremaschi, Fabian; Lombardo, Elva; Vitu, Ed; Dujovny, Manuel

    2008-06-01

    An optical character recognition (OCR) reading machine is a significant aid for visually impaired patients. An OCR reading machine was used in this study. This instrument can provide significant help in improving the quality of life of patients with low vision or blindness.

  16. Emotion Recognition in Adolescents with Down Syndrome: A Nonverbal Approach

    Directory of Open Access Journals (Sweden)

    Régis Pochon

    2017-05-01

    Full Text Available Several studies have reported that persons with Down syndrome (DS) have difficulties recognizing emotions; however, there is insufficient research to prove that a deficit of emotional knowledge exists in DS. The aim of this study was to evaluate the recognition of emotional facial expressions without making use of emotional vocabulary, given the language problems known to be associated with this syndrome. The ability to recognize six emotions was assessed in 24 adolescents with DS. Their performance was compared to that of 24 typically developing children with the same nonverbal-developmental age, as assessed by Raven’s Progressive Matrices. Analysis of the results revealed no global difference; only marginal differences in the recognition of different emotions appeared. Study of the developmental trajectories revealed a developmental difference: the nonverbal reasoning level assessed by Raven’s matrices did not predict success on the experimental tasks in the DS group, contrary to the typically developing group. These results do not corroborate the hypothesis that there is an emotional knowledge deficit in DS and emphasize the importance of using dynamic, strictly nonverbal tasks in populations with language disorders.

  17. Recognition of Emotions in Autism: A Formal Meta-Analysis

    Science.gov (United States)

    Uljarevic, Mirko; Hamilton, Antonia

    2013-01-01

    Determining the integrity of emotion recognition in autistic spectrum disorder is important to our theoretical understanding of autism and to teaching social skills. Previous studies have reported both positive and negative results. Here, we take a formal meta-analytic approach, bringing together data from 48 papers testing over 980 participants…

  18. Development of Emotional Facial Recognition in Late Childhood and Adolescence

    Science.gov (United States)

    Thomas, Laura A.; De Bellis, Michael D.; Graham, Reiko; Labar, Kevin S.

    2007-01-01

    The ability to interpret emotions in facial expressions is crucial for social functioning across the lifespan. Facial expression recognition develops rapidly during infancy and improves with age during the preschool years. However, the developmental trajectory from late childhood to adulthood is less clear. We tested older children, adolescents…

  19. Towards Multimodal Emotion Recognition in E-Learning Environments

    Science.gov (United States)

    Bahreini, Kiavash; Nadolski, Rob; Westera, Wim

    2016-01-01

    This paper presents a framework (FILTWAM (Framework for Improving Learning Through Webcams And Microphones)) for real-time emotion recognition in e-learning by using webcams. FILTWAM offers timely and relevant feedback based upon learner's facial expressions and verbalizations. FILTWAM's facial expression software module has been developed and…

  20. Towards multimodal emotion recognition in E-learning environments

    NARCIS (Netherlands)

    Bahreini, Kiavash; Nadolski, Rob; Westera, Wim

    2014-01-01

    This paper presents a framework (FILTWAM (Framework for Improving Learning Through Webcams And Microphones)) for real-time emotion recognition in e-learning by using webcams. FILTWAM offers timely and relevant feedback based upon learner’s facial expressions and verbalizations. FILTWAM’s facial

  1. Bank note recognition for the vision impaired.

    Science.gov (United States)

    Hinwood, A; Preston, P; Suaning, G J; Lovell, N H

    2006-06-01

    Blind Australians find it very difficult to recognise bank notes. Each note has the same feel, with no Braille markings, irregular edges or other tangible features. In Australia, there is only one device available that can assist blind people in recognising their notes. Internationally, there are devices available; however, they are expensive, complex and have not been developed to cater for Australian currency. This paper discusses a new device, the MoneyTalker, which takes advantage of the largely different colours and patterns on each Australian bank note and recognises the notes electronically, using the reflection and transmission properties of light. Different coloured lights are transmitted through the inserted note and the corresponding sensors detect distinct ranges of values depending on the colour of the note. Various classification algorithms were studied and the final algorithm was chosen based on accuracy and speed of recognition. The MoneyTalker has shown an accuracy of more than 99%. A blind subject has tested the device and believes that it is usable, compact and affordable. Compared with the devices currently available in Australia, the MoneyTalker is an effective alternative in terms of accuracy and usability.
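The recognition principle described (each note produces a distinct range of sensor readings under each coloured light) amounts to a range-lookup classifier. The calibration table below is entirely invented for illustration and is not the MoneyTalker's actual data.

```python
# Hypothetical calibration: (low, high) normalized sensor reading per light
# channel, per note. Real calibration would come from measured transmission.
CALIBRATION = {
    "$5":  {"red": (0.60, 0.80), "green": (0.20, 0.35)},
    "$10": {"red": (0.30, 0.45), "green": (0.55, 0.75)},
}

def recognise_note(readings):
    """Return the note whose calibrated ranges contain every channel reading,
    or None if no note matches."""
    for note, ranges in CALIBRATION.items():
        if all(lo <= readings[ch] <= hi for ch, (lo, hi) in ranges.items()):
            return note
    return None
```

A reading inside every range for one note identifies it; an out-of-range reading yields no match, which a real device would report as an unrecognised note.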

  2. False recognition of facial expressions of emotion: causes and implications.

    Science.gov (United States)

    Fernández-Dols, José-Miguel; Carrera, Pilar; Barchard, Kimberly A; Gacitua, Marta

    2008-08-01

    This article examines the importance of semantic processes in the recognition of emotional expressions through a series of three studies on false recognition. The first study found a high frequency of false recognition of prototypical expressions of emotion when participants viewed slides and video clips of nonprototypical fearful and happy expressions. The second study tested whether semantic processes caused false recognition. The authors found that participants had significantly higher error rates when asked to detect expressions that corresponded to semantic labels than when asked to detect visual stimuli. Finally, given that previous research has reported that false memories are less prevalent in younger children, the third study tested whether false recognition of prototypical expressions increased with age. The authors found that 67% of 8- to 9-year-old children reported nonpresent prototypical expressions of fear in a fearful context, but only 40% of 6- to 7-year-old children did so. Taken together, these three studies demonstrate the importance of semantic processes in the detection and categorization of prototypical emotional expressions.

  3. Facial Recognition of Happiness Is Impaired in Musicians with High Music Performance Anxiety.

    Science.gov (United States)

    Sabino, Alini Daniéli Viana; Camargo, Cristielli M; Chagas, Marcos Hortes N; Osório, Flávia L

    2018-01-01

    Music performance anxiety (MPA) can be defined as a lasting and intense apprehension connected with musical performance in public. Studies suggest that MPA can be regarded as a subtype of social anxiety. Since individuals with social anxiety have deficits in the recognition of facial emotion, we hypothesized that musicians with high levels of MPA would share similar impairments. The aim of this study was to compare parameters of facial emotion recognition (FER) between musicians with high and low MPA. 150 amateur and professional musicians with different musical backgrounds were assessed with respect to their level of MPA and completed a dynamic FER task. The outcomes investigated were accuracy, response time, emotional intensity, and response bias. Musicians with high MPA were less accurate in the recognition of happiness (p = 0.04; d = 0.34), had an increased response bias toward fear (p = 0.03), and showed increased response time to facial emotions as a whole (p = 0.02; d = 0.39). Musicians with high MPA displayed FER deficits that were independent of general anxiety levels and possibly of general cognitive capacity. These deficits may favor the maintenance and exacerbation of experiences of anxiety during public performance, since cues of approval, satisfaction, and encouragement are not adequately recognized.

  4. Expression intensity, gender and facial emotion recognition: Women recognize only subtle facial emotions better than men.

    Science.gov (United States)

    Hoffmann, Holger; Kessler, Henrik; Eppel, Tobias; Rukavina, Stefanie; Traue, Harald C

    2010-11-01

    Two experiments were conducted in order to investigate the effect of expression intensity on gender differences in the recognition of facial emotions. The first experiment compared recognition accuracy between female and male participants when emotional faces were shown with full-blown (100% emotional content) or subtle expressiveness (50%). In a second experiment more finely grained analyses were applied in order to measure recognition accuracy as a function of expression intensity (40%-100%). The results show that although women were more accurate than men in recognizing subtle facial displays of emotion, there was no difference between male and female participants when recognizing highly expressive stimuli. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Clinicians' recognition and management of emotions during difficult healthcare conversations.

    Science.gov (United States)

    Martin, Elliott B; Mazzola, Natalia M; Brandano, Jessica; Luff, Donna; Zurakowski, David; Meyer, Elaine C

    2015-10-01

    To examine the most commonly reported emotions encountered among healthcare practitioners when holding difficult conversations, including their frequency and impact on care delivery. Interprofessional learners from a range of experience levels and specialties completed self-report questionnaires prior to simulation-based communication workshops. Clinicians were asked to describe up to three emotions they experienced when having difficult healthcare conversations; subsequent questions used Likert scales to measure the frequency of each emotion and whether care was affected. 152 participants completed questionnaires, including physicians, nurses, and psychosocial professionals. The most commonly reported emotions were anxiety, sadness, empathy, frustration, and insecurity. There were significant differences in how clinicians perceived these emotions affecting care: empathy and anxiety were perceived to influence care more than sadness, frustration, and insecurity. Most clinicians, regardless of clinical experience and discipline, find that their emotional state influences the quality of their care delivery. Most clinicians rate themselves as somewhat to quite capable of recognizing and managing their emotions, while acknowledging significant room to grow. Further education designed to increase clinicians' recognition of, reflection on, and management of emotion would likely prove helpful in improving their ability to navigate difficult healthcare conversations. Interventions aimed at anxiety management are particularly needed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Evaluating deep learning architectures for Speech Emotion Recognition.

    Science.gov (United States)

    Fayek, Haytham M; Lech, Margaret; Cavedon, Lawrence

    2017-08-01

    Speech Emotion Recognition (SER) can be regarded as a static or dynamic classification problem, which makes SER an excellent test bed for investigating and comparing various deep learning architectures. We describe a frame-based formulation to SER that relies on minimal speech processing and end-to-end deep learning to model intra-utterance dynamics. We use the proposed SER system to empirically explore feed-forward and recurrent neural network architectures and their variants. Experiments conducted illuminate the advantages and limitations of these architectures in paralinguistic speech recognition and emotion recognition in particular. As a result of our exploration, we report state-of-the-art results on the IEMOCAP database for speaker-independent SER and present quantitative and qualitative assessments of the models' performances. Copyright © 2017 Elsevier Ltd. All rights reserved.
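The frame-based formulation described above can be sketched minimally: a frame-level classifier scores each frame, and the per-frame emotion posteriors are averaged to score the whole utterance. Here a random-weight linear layer stands in for the trained network, and the feature dimension and class count are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical frame classifier: 40-dim frame features -> 4 emotion classes.
W = rng.standard_normal((40, 4)) * 0.1
b = np.zeros(4)

def utterance_posterior(frames):
    """Average per-frame posteriors over the utterance (frame-based SER)."""
    p = softmax(frames @ W + b)   # (n_frames, 4) frame-level posteriors
    return p.mean(axis=0)         # (4,) utterance-level posterior

frames = rng.standard_normal((120, 40))   # one synthetic 120-frame utterance
post = utterance_posterior(frames)
print(post)
```

In practice the linear layer would be replaced by the trained feed-forward or recurrent network, but the aggregation from frame scores to an utterance decision follows the same shape.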

  7. Emotion recognition and emergent leadership : Unraveling mediating mechanisms and boundary conditions

    NARCIS (Netherlands)

    Walter, F.; Cole, M.S.; van der Vegt, G.S.; Rubin, R.S.; Bommer, W.H.

    2012-01-01

    This study examines the complex connection between individuals' emotion recognition capability and their emergence as leaders. It is hypothesized that emotion recognition and extraversion interactively relate with an individual's task coordination behavior which, in turn, influences the likelihood…

  8. A gesture-controlled Serious Game for teaching emotion recognition skills to preschoolers with autism

    DEFF Research Database (Denmark)

    Christinaki, Eirini; Triantafyllidis, Georgios; Vidakis, Nikolaos

    The recognition of facial expressions is important for the perception of emotions. Understanding emotions is essential in human communication and social interaction. Children with autism have been reported to exhibit deficits in the recognition of affective expressions. With the appropriate...

  9. Emotion impairs extrinsic source memory--An ERP study.

    Science.gov (United States)

    Mao, Xinrui; You, Yuqi; Li, Wen; Guo, Chunyan

    2015-09-01

    Substantial advancements in understanding emotional modulation of item memory notwithstanding, controversies remain as to how emotion influences source memory. Using an emotional extrinsic source memory paradigm combined with remember/know judgments and two key event-related potentials (ERPs)-the FN400 (a frontal potential at 300-500 ms related to familiarity) and the LPC (a later parietal potential at 500-700 ms related to recollection), our research investigated the impact of emotion on extrinsic source memory and the underlying processes. We varied a semantic prompt (either "people" or "scene") preceding a study item to manipulate the extrinsic source. Behavioral data indicated a significant effect of emotion on "remember" responses to extrinsic source details, suggesting impaired recollection-based source memory in emotional (both positive and negative) relative to neutral conditions. In parallel, differential FN400 and LPC amplitudes (correctly remembered - incorrectly remembered sources) revealed emotion-related interference, suggesting impaired familiarity and recollection memory of extrinsic sources associated with positive or negative items. These findings thus lend support to the notion of emotion-induced memory trade off: while enhancing memory of central items and intrinsic/integral source details, emotion nevertheless disrupts memory of peripheral contextual details, potentially impairing both familiarity and recollection. Importantly, that positive and negative items result in comparable memory impairment suggests that arousal (vs. affective valence) plays a critical role in modulating dynamic interactions among automatic and elaborate processes involved in memory. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Normal mere exposure effect with impaired recognition in Alzheimer's disease.

    Science.gov (United States)

    Willems, Sylvie; Adam, Stéphane; Van der Linden, Martial

    2002-02-01

    We investigated the mere exposure effect and explicit memory in Alzheimer's disease (AD) patients and elderly control subjects, using unfamiliar faces. During the exposure phase, the subjects estimated the age of briefly flashed faces. The mere exposure effect was examined by presenting pairs of faces (old and new) and asking participants to select the face they liked. The participants were then presented with a forced-choice explicit recognition task. Control subjects exhibited above-chance preference and recognition scores for old faces. The AD patients also showed the mere exposure effect but no explicit recognition. These results suggest that the processes involved in the mere exposure effect are preserved in AD patients despite their impaired explicit recognition. The results are discussed in terms of Seamon et al.'s (1995) proposal that processes involved in the mere exposure effect are equivalent to those subserving perceptual priming. These processes would depend on extrastriate areas which are relatively preserved in AD patients.

  11. Infliximab ameliorates AD-associated object recognition memory impairment.

    Science.gov (United States)

    Kim, Dong Hyun; Choi, Seong-Min; Jho, Jihoon; Park, Man-Seok; Kang, Jisu; Park, Se Jin; Ryu, Jong Hoon; Jo, Jihoon; Kim, Hyun Hee; Kim, Byeong C

    2016-09-15

    Dysfunctions in the perirhinal cortex (PRh) are associated with visual recognition memory deficit, which is frequently detected in the early stage of Alzheimer's disease. Muscarinic acetylcholine receptor-dependent long-term depression (mAChR-LTD) of synaptic transmission is known as a key pathway in eliciting this type of memory, and Tg2576 mice expressing enhanced levels of Aβ oligomers are found to have impaired mAChR-LTD in this brain area at as early as 3 months of age. We found that the administration of Aβ oligomers in young normal mice also induced visual recognition memory impairment and perturbed mAChR-LTD in mouse PRh slices. In addition, when mice were treated with infliximab, a monoclonal antibody against TNF-α, visual recognition memory impaired by pre-administered Aβ oligomers dramatically improved and the detrimental Aβ effect on mAChR-LTD was annulled. Taken together, these findings suggest that Aβ-induced inflammation is mediated through TNF-α signaling cascades, disturbing synaptic transmission in the PRh, and leading to visual recognition memory deficits. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Does emotional memory enhancement assist the memory-impaired?

    Directory of Open Access Journals (Sweden)

    Lucas S. Broster

    2012-03-01

    Full Text Available We review recent work on emotional memory enhancement in older adults and patients with mild cognitive impairment or Alzheimer dementia and evaluate the viability of incorporating emotional components into cognitive rehabilitation for these groups. First, we identify converging evidence regarding the effects of emotional valence on working memory in healthy aging. Second, we introduce work that suggests a more complex role for emotional memory enhancement in aging and identify a model capable of unifying disparate research findings. Third, we identify neuroimaging evidence that the amygdala may play a key role in mediating emotional memory enhancement in mild cognitive impairment and early Alzheimer dementia. Finally, we assess the theoretical feasibility of incorporating emotional content into cognitive rehabilitation given all available evidence.

  13. Positive emotion can protect against source memory impairment.

    Science.gov (United States)

    MacKenzie, Graham; Powell, Tim F; Donaldson, David I

    2015-01-01

    Despite widespread belief that memory is enhanced by emotion, evidence also suggests that emotion can impair memory. Here we test predictions inspired by object-based binding theory, which states that memory enhancement or impairment depends on the nature of the information to be retrieved. We investigated emotional memory in the context of source retrieval, using images of scenes that were negative, neutral or positive in valence. At study each scene was paired with a colour and during retrieval participants reported the source colour for recognised scenes. Critically, we isolated effects of valence by equating stimulus arousal across conditions. In Experiment 1 colour borders surrounded scenes at study: memory impairment was found for both negative and positive scenes. Experiment 2 used colours superimposed over scenes at study: valence affected source retrieval, with memory impairment for negative scenes only. These findings challenge current theories of emotional memory by showing that emotion can impair memory for both intrinsic and extrinsic source information, even when arousal is equated between emotional and neutral stimuli, and by dissociating the effects of positive and negative emotion on episodic memory retrieval.

  14. Social Cognition in Borderline Personality Disorder: Evidence for Disturbed Recognition of the Emotions, Thoughts, and Intentions of Others

    Directory of Open Access Journals (Sweden)

    Sandra Preißler

    2010-12-01

    Full Text Available Disturbed relatedness is a core feature of borderline personality disorder (BPD), and impaired social cognition or deficits in mentalization are hypothesized to underlie this feature. To date, only weak empirical evidence argues for impairment in the recognition of emotions, thoughts, or intentions in BPD. Data from facial emotion recognition research indicate that these abilities are altered in BPD only if tasks are complex. The present study aims to assess social cognitive abilities in BPD. Sixty-four women with BPD and 38 healthy controls watched the Movie for the Assessment of Social Cognition (MASC), a newly developed film displaying social interactions and asking for an assessment of the intentions, emotions, and thoughts of the characters. In addition, participants completed an established but less ecologically valid measure of social cognition (Reading the Mind in the Eyes; RME). In the RME task, BPD patients did not display impairment in social cognition compared to healthy controls. By contrast, on the more sensitive MASC, women with BPD showed significantly impaired social cognitive abilities compared to healthy controls in their recognition of emotions, thoughts, and intentions. Comorbid PTSD, intrusions, and sexual trauma negatively predicted social cognitive abilities on the more sensitive MASC. Thus, our results suggest impaired social cognitive abilities in BPD; in particular, comorbid PTSD, intrusive symptoms and a history of sexual trauma predicted poor outcomes on social cognition tasks.

  15. Enhancing emotion recognition in VIPs with haptic feedback

    NARCIS (Netherlands)

    Buimer, Hendrik; Bittner, Marian; Kostelijk, Tjerk; van der Geest, Thea; van Wezel, Richard Jack Anton; Zhao, Yan; Stephanidis, Constantine

    2016-01-01

    The rise of smart technologies has created new opportunities to support blind and visually impaired persons (VIPs). One of the biggest problems we identified in our previous research on problems VIPs face during activities of daily life concerned the recognition of persons and their facial

  16. Automatic recognition of emotions from facial expressions

    Science.gov (United States)

    Xue, Henry; Gertner, Izidor

    2014-06-01

    In the human-computer interaction (HCI) process it is desirable to have an artificial intelligent (AI) system that can identify and categorize human emotions from facial expressions. Such systems can be used in security, in entertainment industries, and also to study visual perception, social interactions and disorders (e.g. schizophrenia and autism). In this work we survey and compare the performance of different feature extraction algorithms and classification schemes. We introduce a faster feature extraction method that resizes and applies a set of filters to the data images without sacrificing the accuracy. In addition, we have enhanced SVM to multiple dimensions while retaining the high accuracy rate of SVM. The algorithms were tested using the Japanese Female Facial Expression (JAFFE) Database and the Database of Faces (AT&T Faces).
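As a rough illustration of the resize-and-filter feature extraction the abstract describes, the sketch below block-averages an image down and applies a tiny gradient-based filter bank. The actual filters, target size, and multi-dimensional SVM extension used by the authors are not specified here, so every parameter is an assumption.

```python
import numpy as np

def extract_features(image, size=32, n_filters=4):
    """Resize by block-averaging, then apply a small filter bank.

    Hypothetical stand-in for the paper's pipeline; assumes the image
    height and width are divisible by `size`.
    """
    h, w = image.shape
    # Block-average "resize" down to size x size.
    small = image.reshape(size, h // size, size, w // size).mean(axis=(1, 3))
    # Toy filter bank: vertical/horizontal gradients plus two combinations.
    gy, gx = np.gradient(small)
    feats = [gx.ravel(), gy.ravel(), (gx + gy).ravel(), (gx - gy).ravel()]
    return np.concatenate(feats[:n_filters])

img = np.random.default_rng(1).random((64, 64))
fv = extract_features(img)
print(fv.shape)   # (4096,): 4 filters x 32 x 32
```

The resulting vectors would then be fed to a classifier such as an SVM; the point of the resize step is that the filter responses are computed on far fewer pixels without discarding the coarse spatial structure the classifier needs.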

  17. Individual differences in components of impulsivity and effortful control moderate the relation between borderline personality disorder traits and emotion recognition in a sample of university students.

    Science.gov (United States)

    Preti, Emanuele; Richetin, Juliette; Suttora, Chiara; Pisani, Alberto

    2016-04-30

    Dysfunctions in social cognition characterize personality disorders. However, mixed results emerged from literature on emotion processing. Borderline Personality Disorder (BPD) traits are either associated with enhanced emotion recognition, impairments, or equal functioning compared to controls. These apparent contradictions might result from the complexity of emotion recognition tasks used and from individual differences in impulsivity and effortful control. We conducted a study in a sample of undergraduate students (n=80), assessing BPD traits, using an emotion recognition task that requires the processing of only visual information or both visual and acoustic information. We also measured individual differences in impulsivity and effortful control. Results demonstrated the moderating role of some components of impulsivity and effortful control on the capability of BPD traits in predicting anger and happiness recognition. We organized the discussion around the interaction between different components of regulatory functioning and task complexity for a better understanding of emotion recognition in BPD samples. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. Comparing the Recognition of Emotional Facial Expressions in Patients with

    Directory of Open Access Journals (Sweden)

    Abdollah Ghasempour

    2014-05-01

    Full Text Available Background: Recognition of emotional facial expressions is one of the psychological factors involved in obsessive-compulsive disorder (OCD) and major depressive disorder (MDD). The aim of the present study was to compare the ability to recognize emotional facial expressions in patients with OCD and patients with MDD. Materials and Methods: The present study is a cross-sectional, ex-post facto investigation (causal-comparative method). Forty participants (20 patients with OCD, 20 patients with MDD) were selected through available sampling from the clients referred to the Tabriz Bozorgmehr clinic. Data were collected through a Structured Clinical Interview and the Recognition of Emotional Facial States test, and analyzed utilizing MANOVA. Results: The obtained results showed no significant difference between the groups in the mean scores for recognizing the emotional states of surprise, sadness, happiness and fear; however, the groups differed significantly in the mean scores for recognizing disgust and anger (p<0.05). Conclusion: Patients suffering from OCD and MDD show equal ability to recognize surprise, sadness, happiness and fear. However, the former are less competent than the latter in recognizing disgust and anger.

  19. Gaze Dynamics in the Recognition of Facial Expressions of Emotion.

    Science.gov (United States)

    Barabanschikov, Vladimir A

    2015-01-01

    We studied which parts and features of the human face are preferentially fixated during the recognition of facial expressions of emotion. Photographs of facial expressions were used; participants were asked to categorize these as basic emotions while their eye movements were registered. It was found that variation in the intensity of an expression is mirrored in the accuracy of emotion recognition, and is also reflected in several indices of oculomotor function: duration of inspection of certain areas of the face (its upper and bottom parts, its right and left sides), and the location, number and duration of fixations, as well as the viewing trajectory. In particular, for low-intensity expressions, the right side of the face was attended predominantly (right-side dominance); this right-side dominance effect was, however, absent for expressions of high intensity. For both low- and high-intensity expressions, the upper part of the face was predominantly fixated, though with greater fixation for high-intensity expressions. The majority of trials (70%), in line with findings of previous studies, revealed a V-shaped inspection trajectory. No relationship was found, however, between accuracy of recognition of emotional expressions and either the location and duration of fixations or the pattern of gaze directedness in the face. © The Author(s) 2015.

  20. Reduced Recognition of Dynamic Facial Emotional Expressions and Emotion-Specific Response Bias in Children with an Autism Spectrum Disorder

    Science.gov (United States)

    Evers, Kris; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2015-01-01

    Emotion labelling was evaluated in two matched samples of 6-14-year old children with and without an autism spectrum disorder (ASD; N = 45 and N = 50, resp.), using six dynamic facial expressions. The Emotion Recognition Task proved to be valuable demonstrating subtle emotion recognition difficulties in ASD, as we showed a general poorer emotion…

  1. Stein and Honneth on Empathy and Emotional Recognition

    DEFF Research Database (Denmark)

    Jardine, James Alexander

    2015-01-01

    My aim in this paper is to make use of Edith Stein’s phenomenological analyses of empathy, emotion, and personhood to clarify and critically assess the recent suggestion by Axel Honneth that a basic form of recognition is affective in nature. I will begin by considering Honneth’s own presentation… of this claim in his discussion of the role of affect in recognitive gestures, as well as in his notion of ‘elementary recognition,’ arguing that while his account contains much of value it also generates problems. On the basis of this analysis, I will try to show that Stein’s account of empathy demarcates… an elementary form of recognition in a less problematic fashion than does Honneth’s own treatment of this issue. I will then spell out the consequences of this move for the emotional recognition thesis, arguing that Stein’s treatment lends it further credence, before ending with some remarks on the connection…

  2. Emotional Intelligence Levels of Students with Sensory Impairment

    Science.gov (United States)

    Al-Tal, Suhair; AL-Jawaldeh, Fuad; AL-Taj, Heyam; Maharmeh, Lina

    2017-01-01

    This study aimed at revealing the emotional intelligence levels of students with sensory disability in Amman in Jordan. The participants of the study were 200 students; 140 hearing impaired students and 60 visual impaired students enrolled in the special education schools and centers for the academic year 2016-2017. The study adopted the…

  3. Neural basis of emotion recognition deficits in first-episode major depression

    NARCIS (Netherlands)

    van Wingen, G. A.; van Eijndhoven, P.; Tendolkar, I.; Buitelaar, J.; Verkes, R. J.; Fernández, G.

    2011-01-01

    Depressed individuals demonstrate a poorer ability to recognize the emotions of others, which could contribute to difficulties in interpersonal behaviour. This emotion recognition deficit appears related to the depressive state and is particularly pronounced when emotions are labelled semantically.

  4. Neural basis of emotion recognition deficits in first-episode major depression

    NARCIS (Netherlands)

    Wingen, G.A. van; Eijndhoven, P.F.P. van; Tendolkar, I.; Buitelaar, J.K.; Verkes, R.J.; Fernandez, G.S.E.

    2011-01-01

    BACKGROUND: Depressed individuals demonstrate a poorer ability to recognize the emotions of others, which could contribute to difficulties in interpersonal behaviour. This emotion recognition deficit appears related to the depressive state and is particularly pronounced when emotions are labelled

  5. The recognition of emotional expression in prosopagnosia: decoding whole and part faces.

    Science.gov (United States)

    Stephan, Blossom Christa Maree; Breen, Nora; Caine, Diana

    2006-11-01

    Prosopagnosia is currently viewed within the constraints of two competing theories of face recognition, one highlighting the analysis of features, the other focusing on configural processing of the whole face. This study investigated the role of feature analysis versus whole-face configural processing in the recognition of facial expression. A prosopagnosic patient, SC, made expression decisions from whole and incomplete (eyes-only and mouth-only) faces in which the remaining features had been obscured. SC was impaired at recognizing some (e.g., anger, sadness, and fear), but not all (e.g., happiness) emotional expressions from the whole face. Analyses of his performance on incomplete faces indicated that his recognition of some expressions actually improved relative to his performance on the whole-face condition. We argue that in SC, interference from damaged configural processes seems to override an intact ability to utilize part-based or local feature cues.

  6. Heterogeneity of long-history migration predicts emotion recognition accuracy.

    Science.gov (United States)

    Wood, Adrienne; Rychlowska, Magdalena; Niedenthal, Paula M

    2016-06-01

    Recent work (Rychlowska et al., 2015) demonstrated the power of a relatively new cultural dimension, historical heterogeneity, in predicting cultural differences in the endorsement of emotion expression norms. Historical heterogeneity describes the number of source countries that have contributed to a country's present-day population over the last 500 years. People in cultures originating from a large number of source countries may have historically benefited from greater and clearer emotional expressivity, because they lacked a common language and well-established social norms. We therefore hypothesized that in addition to endorsing more expressive display rules, individuals from heterogeneous cultures will also produce facial expressions that are easier to recognize by people from other cultures. By reanalyzing cross-cultural emotion recognition data from 92 papers and 82 cultures, we show that emotion expressions of people from heterogeneous cultures are more easily recognized by observers from other cultures than are the expressions produced in homogeneous cultures. Heterogeneity influences expression recognition rates alongside the individualism-collectivism of the perceivers' culture, as more individualistic cultures were more accurate in emotion judgments than collectivistic cultures. This work reveals the present-day behavioral consequences of long-term historical migration patterns and demonstrates the predictive power of historical heterogeneity. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. Does remembering emotional items impair recall of same-emotion items?

    Science.gov (United States)

    Sison, Jo Ann G; Mather, Mara

    2007-04-01

    In the part-set cuing effect, cuing a subset of previously studied items impairs recall of the remaining noncued items. This experiment reveals that cuing participants with previously-studied emotional pictures (e.g., fear-evoking pictures of people) can impair recall of pictures involving the same emotion but different content (e.g., fear-evoking pictures of animals). This indicates that new events can be organized in memory using emotion as a grouping function to create associations. However, whether new information is organized in memory along emotional or nonemotional lines appears to be a flexible process that depends on people's current focus. Mentioning in the instructions that the pictures were either amusement- or fear-related led to memory impairment for pictures with the same emotion as cued pictures, whereas mentioning that the pictures depicted either animals or people led to memory impairment for pictures with the same type of actor.

  8. Impaired Facial Expression Recognition in Children with Temporal Lobe Epilepsy: Impact of Early Seizure Onset on Fear Recognition

    Science.gov (United States)

    Golouboff, Nathalie; Fiori, Nicole; Delalande, Olivier; Fohlen, Martine; Dellatolas, Georges; Jambaque, Isabelle

    2008-01-01

    The amygdala has been implicated in the recognition of facial emotions, especially fearful expressions, in adults with early-onset right temporal lobe epilepsy (TLE). The present study investigates the recognition of facial emotions in children and adolescents, 8-16 years old, with epilepsy. Twenty-nine subjects had TLE (13 right, 16 left) and…

  9. Impact of Social Cognition on Alcohol Dependence Treatment Outcome: Poorer Facial Emotion Recognition Predicts Relapse/Dropout.

    Science.gov (United States)

    Rupp, Claudia I; Derntl, Birgit; Osthaus, Friederike; Kemmler, Georg; Fleischhacker, W Wolfgang

    2017-12-01

    treatment. Impaired facial emotion recognition represents a neurocognitive risk factor that should be taken into account in alcohol dependence treatment. Treatments targeting the improvement of these social cognition deficits in AUD may offer a promising future approach. Copyright © 2017 by the Research Society on Alcoholism.

  10. Impaired Emotional Mirroring in Parkinson’s Disease—A Study on Brain Activation during Processing of Facial Expressions

    Directory of Open Access Journals (Sweden)

    Anna Pohl

    2017-12-01

    Full Text Available Background: Affective dysfunctions are common in patients with Parkinson’s disease, but the underlying neurobiological deviations have rarely been examined. Parkinson’s disease is characterized by a loss of dopamine neurons in the substantia nigra, resulting in impairment of motor and non-motor basal ganglia-cortical loops. Concerning emotional deficits, some studies provide evidence for altered brain processing in limbic and lateral-orbitofrontal gating loops. In a second line of evidence, human premotor and inferior parietal homologs of mirror neuron areas were involved in processing and understanding of emotional facial expressions. We examined deviations in brain activation during processing of facial expressions in patients and related these to emotion recognition accuracy. Methods: 13 patients and 13 healthy controls underwent an emotion recognition task and a functional magnetic resonance imaging (fMRI) measurement. In the Emotion Hexagon test, participants were presented with blends of two emotions and had to indicate which emotion best described the presented picture; blended pictures with three levels of difficulty were included. During fMRI scanning, participants observed video clips depicting emotional, non-emotional, and neutral facial expressions or were asked to produce these facial expressions themselves. Results: Patients performed slightly worse in the emotion recognition task, but only when judging the most ambiguous facial expressions. Both groups activated inferior frontal and anterior inferior parietal homologs of mirror neuron areas during observation and execution of the emotional facial expressions. During observation, responses in the pars opercularis of the right inferior frontal gyrus, in the bilateral inferior parietal lobule and in the bilateral supplementary motor cortex were decreased in patients. Furthermore, in patients, activation of the right anterior inferior parietal lobule was positively related to accuracy in

  11. A Research of Speech Emotion Recognition Based on Deep Belief Network and SVM

    Directory of Open Access Journals (Sweden)

    Chenchen Huang

    2014-01-01

    Full Text Available Feature extraction is a crucial step in speech emotion recognition. Addressing the feature extraction problem, this paper proposes a new method that uses deep belief networks (DBNs) to extract emotional features from the speech signal automatically. A five-layer DBN is trained to extract speech emotion features, and multiple consecutive frames are incorporated to form a high-dimensional feature vector. The features learned by the DBN are then fed into a nonlinear SVM classifier, yielding a multi-classifier speech emotion recognition system. The recognition rate of the system reached 86.5%, which was 7% higher than that of the original method.
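The pipeline in this abstract can be sketched with scikit-learn, substituting a single restricted Boltzmann machine layer for the paper's five-layer DBN; the data, layer sizes, and hyperparameters below are illustrative stand-ins, not the authors' setup:

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

# Hypothetical data: 200 utterance-level vectors standing in for stacked
# consecutive-frame speech features, with 4 emotion classes.
rng = np.random.default_rng(0)
X = rng.random((200, 60))
y = rng.integers(0, 4, size=200)

# One RBM layer stands in for the 5-layer DBN described in the abstract;
# its hidden activations become the input of a nonlinear (RBF) SVM.
model = Pipeline([
    ("scale", MinMaxScaler()),          # RBMs expect inputs in [0, 1]
    ("rbm", BernoulliRBM(n_components=32, n_iter=5, random_state=0)),
    ("svm", SVC(kernel="rbf")),
])
model.fit(X, y)
pred = model.predict(X)
print(pred.shape)  # (200,)
```

A real system would stack several RBM layers (greedy layer-wise pretraining) and feed genuine acoustic frame features rather than random vectors.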

  12. Facial expressions recognition with an emotion expressive robotic head

    Science.gov (United States)

    Doroftei, I.; Adascalitei, F.; Lefeber, D.; Vanderborght, B.; Doroftei, I. A.

    2016-08-01

    The purpose of this study is to present the preliminary steps in facial expressions recognition with a new version of an expressive social robotic head. So, in a first phase, our main goal was to reach a minimum level of emotional expressiveness in order to obtain nonverbal communication between the robot and human by building six basic facial expressions. To evaluate the facial expressions, the robot was used in some preliminary user studies, among children and adults.

  13. Recognition of a Baby's Emotional Cry Towards Robotics Baby Caregiver

    OpenAIRE

    Yamamoto, Shota; Yoshitomi, Yasunari; Tabuse, Masayoshi; Kushida, Kou; Asada, Taro

    2013-01-01

    We developed a method for pattern recognition of the emotions expressed in a baby's cries (discomfort, hunger, or sleepiness). A 32-dimensional fast Fourier transform is performed on sound-form clips, detected by our previously reported method and used as training data. The power of the sound form judged to be a silent region is subtracted from the power of each frequency element, and the power of each frequency element after the subtraction is treated as one element of the feature vector. We per...
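The feature-extraction step described above can be sketched as follows; the clip length, sampling rate, and signal contents are hypothetical, and only the 32-dimensional FFT power with silence-power subtraction follows the abstract:

```python
import numpy as np

def cry_features(clip, silence, n_fft=32):
    """32-dimensional power-spectrum feature with the power of a silent
    region subtracted from each frequency element (floored at zero),
    following the procedure described in the abstract."""
    power = np.abs(np.fft.fft(clip, n=n_fft)) ** 2
    noise = np.abs(np.fft.fft(silence, n=n_fft)) ** 2
    return np.maximum(power - noise, 0.0)

# Toy 32-sample "clip": a tone plus weak noise; the "silence" is noise only.
rng = np.random.default_rng(1)
t = np.arange(32) / 8000.0
clip = np.sin(2 * np.pi * 440 * t) + 0.01 * rng.standard_normal(32)
silence = 0.01 * rng.standard_normal(32)
vec = cry_features(clip, silence)
print(vec.shape)  # (32,)
```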

  14. [Impact of facial emotional recognition alterations in Dementia of the Alzheimer type].

    Science.gov (United States)

    Rubinstein, Wanda; Cossini, Florencia; Politis, Daniel

    2016-07-01

    Facial recognition of basic emotions is independent of other deficits in dementia of the Alzheimer type. Among these deficits, there is disagreement about which emotions are more difficult to recognize. Our aim was to study the presence of alterations in the process of facial recognition of basic emotions, and to investigate whether there were differences in the recognition of each type of emotion in Alzheimer's disease. With three tests of recognition of basic facial emotions we evaluated 29 patients who had been diagnosed with dementia of the Alzheimer type and 18 control subjects. Significant differences were obtained in the tests of recognition of basic facial emotions, and between each type of emotion. Since the amygdala, one of the brain structures responsible for emotional reactions, is affected in the early stages of this disease, our findings are relevant to understanding how this alteration of the process of emotional recognition impacts the difficulties these patients have both in interpersonal relations and in behavioral disorders.

  15. A familiar font drives early emotional effects in word recognition.

    Science.gov (United States)

    Kuchinke, Lars; Krause, Beatrix; Fritsch, Nathalie; Briesemeister, Benny B

    2014-10-01

    The emotional connotation of a word is known to shift the process of word recognition. Using the electroencephalographic event-related potentials (ERPs) approach it has been documented that early attentional processing of high-arousing negative words is shifted at a stage of processing where a presented word cannot have been fully identified. Contextual learning has been discussed to contribute to these effects. The present study shows that a manipulation of the familiarity with a word's shape interferes with these earliest emotional ERP effects. Presenting high-arousing negative and neutral words in a familiar or an unfamiliar font results in very early emotion differences only in case of familiar shapes, whereas later processing stages reveal similar emotional effects in both font conditions. Because these early emotion-related differences predict later behavioral differences, it is suggested that contextual learning of emotional valence comprises more visual features than previously expected to guide early visual-sensory processing. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Children's Recognition of Emotional Facial Expressions Through Photographs and Drawings.

    Science.gov (United States)

    Brechet, Claire

    2017-01-01

    The author's purpose was to examine children's recognition of emotional facial expressions, by comparing two types of stimulus: photographs and drawings. The author aimed to investigate whether drawings could be considered a more evocative material than photographs, as a function of age and emotion. Five- and 7-year-old children were presented with photographs and drawings displaying facial expressions of 4 basic emotions (i.e., happiness, sadness, anger, and fear) and were asked to perform a matching task by pointing to the face corresponding to the target emotion labeled by the experimenter. The photographs we used were selected from the Radboud Faces Database, and the drawings were designed on the basis of both the facial components involved in the expression of these emotions and the graphic cues children tend to use when asked to depict these emotions in their own drawings. Our results show that drawings are better recognized than photographs for sadness, anger, and fear (with no difference for happiness, due to a ceiling effect), and that the difference between the 2 types of stimuli tends to be more pronounced for 5-year-olds than for 7-year-olds. These results are discussed in view of their implications, both for future research and for practical application.

  17. Emotion Recognition of Speech Signals Based on Filter Methods

    Directory of Open Access Journals (Sweden)

    Narjes Yazdanian

    2016-10-01

    Full Text Available Speech is the basic means of communication among human beings. As interaction between humans and machines increases, the need for automatic dialogue that removes the human factor has grown. The aim of this study was to determine a set of affective features of the speech signal that are related to emotions. The proposed system comprises three main sections: feature extraction, feature selection, and classification. After extraction of useful features such as mel frequency cepstral coefficients (MFCC), linear prediction cepstral coefficients (LPC), perceptual linear prediction coefficients (PLP), formant frequency, zero crossing rate, cepstral coefficients, pitch frequency, mean, jitter, shimmer, energy, minimum, maximum, amplitude, and standard deviation, filter methods such as the Pearson correlation coefficient, t-test, relief, and information gain were applied to rank and select the features most effective for emotion recognition. The selected features were then given to the classification stage as a subset of the input. In this classification stage, a multiclass support vector machine was used to classify seven types of emotion. According to the results, the relief method, together with the multiclass support vector machine, achieved the highest classification accuracy, with an emotion recognition rate of 93.94%.
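A minimal sketch of the filter approach, using one of the four filters named above (the absolute Pearson correlation coefficient) to rank features before a multiclass SVM; the data, the number of selected features, and the index of the informative feature are invented for illustration:

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic data: 210 samples, 20 features, seven emotion classes.
rng = np.random.default_rng(2)
n, d = 210, 20
y = rng.integers(0, 7, size=n)
X = rng.standard_normal((n, d))
X[:, 3] += y  # make feature 3 carry class information

# Filter method: rank features by |Pearson correlation| with the label
# and keep the top k; wrapper-free, so ranking is independent of the
# downstream classifier.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(d)])
top_k = np.argsort(corr)[::-1][:5]

# Multiclass SVM (one-vs-one) on the selected subset.
clf = SVC(decision_function_shape="ovo").fit(X[:, top_k], y)
print(top_k[0])  # 3 — the informative feature ranks first
```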

  18. Self-imagining enhances recognition memory in memory-impaired individuals with neurological damage.

    Science.gov (United States)

    Grilli, Matthew D; Glisky, Elizabeth L

    2010-11-01

    The ability to imagine an elaborative event from a personal perspective relies on several cognitive processes that may potentially enhance subsequent memory for the event, including visual imagery, semantic elaboration, emotional processing, and self-referential processing. In an effort to find a novel strategy for enhancing memory in memory-impaired individuals with neurological damage, we investigated the mnemonic benefit of a method we refer to as self-imagining: the imagining of an event from a realistic, personal perspective. Fourteen individuals with neurologically based memory deficits and 14 healthy control participants intentionally encoded neutral and emotional sentences under three instructions: structural-baseline processing, semantic processing, and self-imagining. Findings revealed a robust "self-imagination effect (SIE)," as self-imagination enhanced recognition memory relative to deep semantic elaboration in both memory-impaired individuals, F(1, 13) = 32.11, p < .001, and healthy controls. These benefits were not related to the type of memory disorder, nor were they related to self-reported vividness of visual imagery, semantic processing, or emotional content of the materials. The findings suggest that the SIE may depend on unique mnemonic mechanisms possibly related to self-referential processing and that imagining an event from a personal perspective makes that event particularly memorable even for those individuals with severe memory deficits. Self-imagining may thus provide an effective rehabilitation strategy for individuals with memory impairment.

  19. Associations between autistic traits and emotion recognition ability in non-clinical young adults

    OpenAIRE

    Lindahl, Christina

    2013-01-01

    This study investigated the associations between emotion recognition ability and autistic traits in a sample of non-clinical young adults. Two hundred and forty nine individuals took part in an emotion recognition test, which assessed recognition of 12 emotions portrayed by actors. Emotion portrayals were presented as short video clips, both with and without sound, and as sound only. Autistic traits were assessed using the Autism Spectrum Quotient (ASQ) questionnaire. Results showed that men ...

  20. Impaired socio-emotional processing in a developmental music disorder

    Science.gov (United States)

    Lima, César F.; Brancatisano, Olivia; Fancourt, Amy; Müllensiefen, Daniel; Scott, Sophie K.; Warren, Jason D.; Stewart, Lauren

    2016-01-01

    Some individuals show a congenital deficit for music processing despite normal peripheral auditory processing, cognitive functioning, and music exposure. This condition, termed congenital amusia, has typically been characterized in terms of its profile of musical and pitch difficulties. Here, we examine whether amusia also affects socio-emotional processing, probing auditory and visual domains. Thirteen adults with amusia and 11 controls completed two experiments. In Experiment 1, participants judged emotions in emotional speech prosody, nonverbal vocalizations (e.g., crying), and (silent) facial expressions. Target emotions were: amusement, anger, disgust, fear, pleasure, relief, and sadness. Compared to controls, amusics were impaired for all stimulus types, and the magnitude of their impairment was similar for auditory and visual emotions. In Experiment 2, participants listened to spontaneous and posed laughs, and either inferred the authenticity of the speaker’s state, or judged how much laughs were contagious. Amusics showed decreased sensitivity to laughter authenticity, but normal contagion responses. Across the experiments, mixed-effects models revealed that the acoustic features of vocal signals predicted socio-emotional evaluations in both groups, but the profile of predictive acoustic features was different in amusia. These findings suggest that a developmental music disorder can affect socio-emotional cognition in subtle ways, an impairment not restricted to auditory information. PMID:27725686

  1. Social emotion recognition, social functioning, and attempted suicide in late-life depression.

    Science.gov (United States)

    Szanto, Katalin; Dombrovski, Alexandre Y; Sahakian, Barbara J; Mulsant, Benoit H; Houck, Patricia R; Reynolds, Charles F; Clark, Luke

    2012-03-01

    Lack of feeling connected and poor social problem solving have been described in suicide attempters. However, the cognitive substrates of this apparent social impairment in suicide attempters remain unknown. One possible deficit, the inability to recognize others' complex emotional states, has been observed not only in disorders characterized by prominent social deficits (autism-spectrum disorders and frontotemporal dementia) but also in depression and normal aging. This study assessed the relationship between social emotion recognition, problem solving, social functioning, and attempted suicide in late-life depression. There were 90 participants: 24 older depressed suicide attempters, 38 nonsuicidal depressed elders, and 28 comparison subjects with no psychiatric history. We compared performance on the Reading the Mind in the Eyes test and measures of social networks, social support, social problem solving, and chronic interpersonal difficulties in these three groups. Suicide attempters committed significantly more errors in social emotion recognition and showed poorer global cognitive performance than elders with no psychiatric history. Attempters had restricted social networks: they were less likely to talk to their children, had fewer close friends, and did not engage in volunteer activities, compared to nonsuicidal depressed elders and those with no psychiatric history. They also reported a pattern of struggle against others and hostility in relationships, felt a lack of social support, perceived social problems as impossible to resolve, and displayed a careless/impulsive approach to problems. Suicide attempts in depressed elders were associated with poor social problem solving, constricted social networks, and disruptive interpersonal relationships. Impaired social emotion recognition in the suicide attempter group was related.

  2. Emotion recognition and cognitive empathy deficits in adolescent offenders revealed by context-sensitive tasks

    Directory of Open Access Journals (Sweden)

    Maria Luz eGonzalez-Gadea

    2014-10-01

    Full Text Available Emotion recognition and empathy abilities require the integration of contextual information in real-life scenarios. Previous reports have explored these domains in adolescent offenders (AOs) but have not used tasks that replicate everyday situations. In this study we included ecological measures with different levels of contextual dependence to evaluate emotion recognition and empathy in AOs relative to non-offenders, controlling for the effect of demographic variables. We also explored the influence of fluid intelligence (FI) and executive functions (EFs) in the prediction of relevant deficits in these domains. Our results showed that AOs exhibit deficits in context-sensitive measures of emotion recognition and cognitive empathy. Difficulties in these tasks were neither explained by demographic variables nor predicted by FI or EFs. However, performance on measures that included simpler stimuli or could be solved by explicit knowledge was either only partially affected by demographic variables or preserved in AOs. These findings indicate that AOs show contextual social-cognition impairments which are relatively independent of basic cognitive functioning and demographic variables.

  3. Aging and emotional expressions: is there a positivity bias during dynamic emotion recognition?

    Directory of Open Access Journals (Sweden)

    Alberto eDi Domenico

    2015-08-01

    Full Text Available In this study, we investigated whether age-related differences in emotion regulation priorities influence online dynamic emotional facial discrimination. A group of 40 younger and a group of 40 older adults were invited to recognize a positive or negative expression as soon as the expression slowly emerged and subsequently rate it in terms of intensity. Our findings show that older adults recognized happy expressions faster than angry ones, while the direction of emotional expression does not seem to affect younger adults’ performance. Furthermore, older adults rated both negative and positive emotional faces as more intense compared to younger controls. This study detects age-related differences with a dynamic online paradigm and suggests that different regulation strategies may shape emotional face recognition.

  4. Increased deficits in emotion recognition and regulation in children and adolescents with exogenous obesity.

    Science.gov (United States)

    Percinel, Ipek; Ozbaran, Burcu; Kose, Sezen; Simsek, Damla Goksen; Darcan, Sukran

    2018-03-01

    In this study we aimed to evaluate emotion recognition and emotion regulation skills of children with exogenous obesity between the ages of 11 and 18 years and compare them with healthy controls. The Schedule for Affective Disorders and Schizophrenia for School Aged Children was used for psychiatric evaluations. Emotion recognition skills were evaluated using Faces Test and Reading the Mind in the Eyes Test. The Difficulties in Emotions Regulation Scale was used for evaluating skills of emotion regulation. Children with obesity had lower scores on Faces Test and Reading the Mind in the Eyes Test, and experienced greater difficulty in emotional regulation skills. Improved understanding of emotional recognition and emotion regulation in young people with obesity may improve their social adaptation and help in the treatment of their disorder. To the best of our knowledge, this is the first study to evaluate both emotional recognition and emotion regulation functions in obese children and obese adolescents between 11 and 18 years of age.

  5. Improving emotion recognition systems by embedding cardiorespiratory coupling

    International Nuclear Information System (INIS)

    Valenza, Gaetano; Lanatá, Antonio; Scilingo, Enzo Pasquale

    2013-01-01

    This work aims at showing improved performances of an emotion recognition system embedding information gathered from cardiorespiratory (CR) coupling. Here, we propose a novel methodology able to robustly identify up to 25 regions of a two-dimensional space model, namely the well-known circumplex model of affect (CMA). The novelty of embedding CR coupling information in an autonomic nervous system-based feature space better reveals the sympathetic activations upon emotional stimuli. A CR synchrogram analysis was used to quantify such coupling in terms of the number of heartbeats per respiratory period. Physiological data were gathered from 35 healthy subjects emotionally elicited by means of affective pictures from the international affective picture system database. In this study, we finely detected five levels of arousal and five levels of valence as well as the neutral state, whose combinations were used for identifying 25 different affective states in the CMA plane. We show that the inclusion of the bivariate CR measures in a previously developed system based only on monovariate measures of heart rate variability, respiration dynamics and electrodermal response dramatically increases the recognition accuracy of a quadratic discriminant classifier, obtaining more than 90% correct classification per class. Finally, we propose a comprehensive description of the CR coupling during sympathetic elicitation, adapting an existing theoretical nonlinear model with external driving. The theoretical idea behind this model is that the CR system comprises weakly coupled self-sustained oscillators that, when exposed to an external perturbation (i.e. sympathetic activity), become synchronized and less sensitive to input variations. Given the demonstrated role of the CR coupling, this model can constitute a general tool which is easily embedded in other model-based emotion recognition systems. (paper)
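The synchrogram-derived quantity, the number of heartbeats per respiratory period, can be computed from beat times and breath onsets as in this sketch (the timings are a toy example, not the study's data):

```python
import numpy as np

def beats_per_breath(beat_times, breath_onsets):
    """Count heartbeats falling inside each respiratory period, the
    quantity the synchrogram analysis in the abstract is built on."""
    counts = []
    for start, end in zip(breath_onsets[:-1], breath_onsets[1:]):
        counts.append(np.sum((beat_times >= start) & (beat_times < end)))
    return np.array(counts)

# Toy example: one beat per second, one breath every 4 s,
# i.e. perfect 4:1 cardiorespiratory phase locking.
beats = np.arange(0.0, 20.0, 1.0)
breaths = np.arange(0.0, 21.0, 4.0)
print(beats_per_breath(beats, breaths))  # [4 4 4 4 4]
```

In real data, drift in this count across time would indicate loss of CR synchronization.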

  6. Speech Emotion Recognition Based on Power Normalized Cepstral Coefficients in Noisy Conditions

    Directory of Open Access Journals (Sweden)

    M. Bashirpour

    2016-09-01

    Full Text Available Automatic recognition of speech emotional states in noisy conditions has become an important research topic in the emotional speech recognition area in recent years. This paper considers the recognition of emotional states via speech in real environments. For this task, we employ the power normalized cepstral coefficients (PNCC) in a speech emotion recognition system. We investigate its performance in emotion recognition using clean and noisy speech materials and compare it with the performances of the well-known MFCC, LPCC, RASTA-PLP, and also TEMFCC features. Speech samples are extracted from the Berlin emotional speech database (Emo DB) and the Persian emotional speech database (Persian ESD), which are corrupted with 4 different noise types under various SNR levels. The experiments are conducted in clean train/noisy test scenarios to simulate practical conditions with noise sources. Simulation results show that higher recognition rates are achieved for PNCC as compared with the conventional features under noisy conditions.
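The clean-train/noisy-test setup relies on corrupting speech at a target SNR; a common way to do this (not necessarily the authors' exact procedure) is to scale the noise so the mixture hits the requested signal-to-noise ratio:

```python
import numpy as np

def add_noise_at_snr(clean, noise, snr_db):
    """Scale `noise` so the mixture clean + noise has the requested
    SNR in dB relative to `clean`."""
    p_clean = np.mean(clean ** 2)
    p_noise = np.mean(noise ** 2)
    scale = np.sqrt(p_clean / (p_noise * 10 ** (snr_db / 10.0)))
    return clean + scale * noise

# Toy signal: a 5 Hz tone corrupted with white noise at 10 dB SNR.
rng = np.random.default_rng(3)
clean = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 8000))
noise = rng.standard_normal(8000)
noisy = add_noise_at_snr(clean, noise, snr_db=10)

# Verify the achieved SNR from the residual noise component.
measured = 10 * np.log10(np.mean(clean ** 2) /
                         np.mean((noisy - clean) ** 2))
print(round(measured, 6))  # 10.0
```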

  7. Neural Circuitry of Impaired Emotion Regulation in Substance Use Disorders.

    Science.gov (United States)

    Wilcox, Claire E; Pommy, Jessica M; Adinoff, Bryon

    2016-04-01

    Impaired emotion regulation contributes to the development and severity of substance use disorders (substance disorders). This review summarizes the literature on alterations in emotion regulation neural circuitry in substance disorders, particularly in relation to disorders of negative affect (without substance disorder), and it presents promising areas of future research. Emotion regulation paradigms during functional magnetic resonance imaging are conceptualized into four dimensions: affect intensity and reactivity, affective modulation, cognitive modulation, and behavioral control. The neural circuitry associated with impaired emotion regulation is compared in individuals with and without substance disorders, with a focus on amygdala, insula, and prefrontal cortex activation and their functional and structural connectivity. Hypoactivation of the rostral anterior cingulate cortex/ventromedial prefrontal cortex (rACC/vmPFC) is the most consistent finding across studies, dimensions, and clinical populations (individuals with and without substance disorders). The same pattern is evident for regions in the cognitive control network (anterior cingulate and dorsal and ventrolateral prefrontal cortices) during cognitive modulation and behavioral control. These congruent findings are possibly related to attenuated functional and/or structural connectivity between the amygdala and insula and between the rACC/vmPFC and cognitive control network. Although increased amygdala and insula activation is associated with impaired emotion regulation in individuals without substance disorders, it is not consistently observed in substance disorders. Emotion regulation disturbances in substance disorders may therefore stem from impairments in prefrontal functioning, rather than excessive reactivity to emotional stimuli. Treatments for emotion regulation in individuals without substance disorders that normalize prefrontal functioning may offer greater efficacy for substance disorders.

  8. Emotional processing in patients with mild cognitive impairment: the influence of the valence and intensity of emotional stimuli.

    Science.gov (United States)

    Sarabia-Cobo, Carmen M; García-Rodríguez, Beatriz; Navas, M José; Ellgring, Heiner

    2015-10-15

    We studied the ability of individuals with mild cognitive impairment (MCI) to process emotional facial expressions (EFEs). To date, no systematic study has addressed how variation in intensity affects the recognition of the different types of EFEs in such subjects. Two groups of elderly subjects, 50 healthy individuals and 50 with MCI, completed a task that involved identifying 180 EFEs prepared using virtual models. Two features of the EFEs were considered: their valence (operationalized as six basic emotions) and five levels of intensity. At all levels of intensity, elderly individuals with MCI were significantly worse at identifying each EFE than healthy subjects. Some emotions were easier to identify than others, with happiness proving the easiest to identify and disgust the hardest, and intensity influenced the identification of the EFEs (the stronger the intensity, the greater the number of correct identifications). Overall, elderly individuals with MCI had a poorer capacity to process EFEs, suggesting that cognitive ability modulates the processing of emotions, where features of such stimuli also seem to play a prominent role (e.g., valence and intensity). Thus, the neurological substrates involved in emotional processing appear to be affected by MCI. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Emotion Recognition and Visual-Scan Paths in Fragile X Syndrome

    Science.gov (United States)

    Shaw, Tracey A.; Porter, Melanie A.

    2013-01-01

    This study investigated emotion recognition abilities and visual scanning of emotional faces in 16 Fragile X syndrome (FXS) individuals compared to 16 chronological-age and 16 mental-age matched controls. The relationships between emotion recognition, visual scan-paths and symptoms of social anxiety, schizotypy and autism were also explored.…

  10. Recognition of emotional facial expressions and broad autism phenotype in parents of children diagnosed with autistic spectrum disorder.

    Science.gov (United States)

    Kadak, Muhammed Tayyib; Demirel, Omer Faruk; Yavuz, Mesut; Demir, Türkay

    2014-07-01

    Research findings are divided over the features of the broad autism phenotype (BAP). In this study, we tested whether parents of children with autism have problems recognizing emotional facial expressions, and the contribution of such an impairment to the broad phenotype of autism. Seventy-two parents of children with autistic spectrum disorder and 38 parents of control children participated in the study. Broad autism features were measured with the Autism Quotient (AQ). Recognition of emotional facial expressions was assessed with the Emotion Recognition Test, consisting of a set of photographs from Ekman & Friesen's series. In a two-tailed analysis of variance of AQ, there was a significant difference for social skills (F(1, 106)=6.095; p<.05). Analyses of variance revealed significant differences in the recognition of happy, surprised and neutral expressions (F(1, 106)=4.068, p=.046; F(1, 106)=4.068, p=.046; F(1, 106)=6.064, p=.016). According to our findings, social impairment could be considered a characteristic feature of BAP. ASD parents had difficulty recognizing neutral expressions, suggesting that ASD parents may have impaired recognition of ambiguous expressions, as do autistic children. Copyright © 2014 Elsevier Inc. All rights reserved.
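The group comparisons reported above are analyses of variance; a two-group one-way ANOVA of this kind can be run with SciPy. The scores below are simulated stand-ins for recognition scores, not the study's data (note the study's reported df were F(1, 106); two groups of 72 and 38 give df = (1, 108) here):

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical recognition scores for 72 ASD parents and 38 controls.
rng = np.random.default_rng(4)
asd_parents = rng.normal(6.0, 1.5, size=72)  # e.g. neutral-face hits
controls = rng.normal(7.0, 1.5, size=38)

# One-way ANOVA; with two groups this is equivalent to a t-test (F = t^2).
F, p = f_oneway(asd_parents, controls)
print(F > 0, 0.0 <= p <= 1.0)  # True True
```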

  11. Associations between facial emotion recognition, cognition and alexithymia in patients with schizophrenia: comparison of photographic and virtual reality presentations.

    Science.gov (United States)

    Gutiérrez-Maldonado, J; Rus-Calafell, M; Márquez-Rejón, S; Ribas-Sabaté, J

    2012-01-01

    Emotion recognition is known to be impaired in schizophrenia patients. Although cognitive deficits and symptomatology have been associated with this impairment there are other patient characteristics, such as alexithymia, which have not been widely explored. Emotion recognition is normally assessed by means of photographs, although they do not reproduce the dynamism of human expressions. Our group has designed and validated a virtual reality (VR) task to assess and subsequently train schizophrenia patients. The present study uses this VR task to evaluate the impaired recognition of facial affect in patients with schizophrenia and to examine its association with cognitive deficit and the patients' inability to express feelings. Thirty clinically stabilized outpatients with a well-established diagnosis of schizophrenia or schizoaffective disorder were assessed in neuropsychological, symptomatic and affective domains. They then performed the facial emotion recognition task. Statistical analyses revealed no significant differences between the two presentation conditions (photographs and VR) in terms of overall errors made. However, anger and fear were easier to recognize in VR than in photographs. Moreover, strong correlations were found between psychopathology and the errors made.

  12. Indomethacin counteracts the effects of chronic social defeat stress on emotional but not recognition memory in mice.

    Science.gov (United States)

    Duque, Aránzazu; Vinader-Caerols, Concepción; Monleón, Santiago

    2017-01-01

    We have previously observed the impairing effects of chronic social defeat stress (CSDS) on emotional memory in mice. Given the relation between stress and inflammatory processes, we sought to study the effectiveness of the anti-inflammatory indomethacin in reversing the detrimental effects of CSDS on emotional memory in mice. The effects of CSDS and indomethacin on recognition memory were also evaluated. Male CD1 mice were randomly divided into four groups: non-stressed + saline (NS+SAL); non-stressed + indomethacin (NS+IND); stressed + saline (S+SAL); and stressed + indomethacin (S+IND). Stressed animals were exposed to a daily 10 min agonistic confrontation (CSDS) for 20 days. All subjects were treated daily with saline or indomethacin (10 mg/kg, i.p.). 24 h after the CSDS period, all the mice were evaluated in a social interaction test to distinguish between those that were resilient or susceptible to social stress. All subjects (n = 10-12 per group) were then evaluated in inhibitory avoidance (IA), novel object recognition (NOR), elevated plus maze and hot plate tests. As in control animals (NS+SAL group), IA learning was observed in the resilient groups, as well as in the susceptible mice treated with indomethacin (S+IND group). Recognition memory was observed in the non-stressed and the resilient mice, but not in the susceptible animals. Also, stressed mice exhibited higher anxiety levels. No significant differences were observed in locomotor activity or analgesia. In conclusion, CSDS induces anxiety in post-pubertal mice and impairs emotional and recognition memory in the susceptible subjects. The effects of CSDS on emotional memory, but not on recognition memory and anxiety, are reversed by indomethacin. Moreover, memory impairment is not secondary to the effects of CSDS on locomotor activity, emotionality or pain sensitivity.

  14. Emotion recognition in frontotemporal dementia and Alzheimer's disease: A new film-based assessment.

    Science.gov (United States)

    Goodkind, Madeleine S; Sturm, Virginia E; Ascher, Elizabeth A; Shdo, Suzanne M; Miller, Bruce L; Rankin, Katherine P; Levenson, Robert W

    2015-08-01

    Deficits in recognizing others' emotions are reported in many psychiatric and neurological disorders, including autism, schizophrenia, behavioral variant frontotemporal dementia (bvFTD) and Alzheimer's disease (AD). Most previous emotion recognition studies have required participants to identify emotional expressions in photographs. This type of assessment differs from real-world emotion recognition in important ways: Images are static rather than dynamic, include only 1 modality of emotional information (i.e., visual information), and are presented absent a social context. Additionally, existing emotion recognition batteries typically include multiple negative emotions, but only 1 positive emotion (i.e., happiness) and no self-conscious emotions (e.g., embarrassment). We present initial results using a new task for assessing emotion recognition that was developed to address these limitations. In this task, respondents view a series of short film clips and are asked to identify the main characters' emotions. The task assesses multiple negative, positive, and self-conscious emotions based on information that is multimodal, dynamic, and socially embedded. We evaluate this approach in a sample of patients with bvFTD, AD, and normal controls. Results indicate that patients with bvFTD have emotion recognition deficits in all 3 categories of emotion compared to the other groups. These deficits were especially pronounced for negative and self-conscious emotions. Emotion recognition in this sample of patients with AD was indistinguishable from controls. These findings underscore the utility of this approach to assessing emotion recognition and suggest that previous findings that recognition of positive emotion was preserved in dementia patients may have resulted from the limited sampling of positive emotion in traditional tests. (c) 2015 APA, all rights reserved.

  15. Emotion Recognition in Frontotemporal Dementia and Alzheimer's Disease: A New Film-Based Assessment

    Science.gov (United States)

    Goodkind, Madeleine S.; Sturm, Virginia E.; Ascher, Elizabeth A.; Shdo, Suzanne M.; Miller, Bruce L.; Rankin, Katherine P.; Levenson, Robert W.

    2015-01-01

    Deficits in recognizing others' emotions are reported in many psychiatric and neurological disorders, including autism, schizophrenia, behavioral variant frontotemporal dementia (bvFTD) and Alzheimer's disease (AD). Most previous emotion recognition studies have required participants to identify emotional expressions in photographs. This type of assessment differs from real-world emotion recognition in important ways: Images are static rather than dynamic, include only 1 modality of emotional information (i.e., visual information), and are presented absent a social context. Additionally, existing emotion recognition batteries typically include multiple negative emotions, but only 1 positive emotion (i.e., happiness) and no self-conscious emotions (e.g., embarrassment). We present initial results using a new task for assessing emotion recognition that was developed to address these limitations. In this task, respondents view a series of short film clips and are asked to identify the main characters' emotions. The task assesses multiple negative, positive, and self-conscious emotions based on information that is multimodal, dynamic, and socially embedded. We evaluate this approach in a sample of patients with bvFTD, AD, and normal controls. Results indicate that patients with bvFTD have emotion recognition deficits in all 3 categories of emotion compared to the other groups. These deficits were especially pronounced for negative and self-conscious emotions. Emotion recognition in this sample of patients with AD was indistinguishable from controls. These findings underscore the utility of this approach to assessing emotion recognition and suggest that previous findings that recognition of positive emotion was preserved in dementia patients may have resulted from the limited sampling of positive emotion in traditional tests. PMID:26010574

  16. Functional architecture of visual emotion recognition ability: A latent variable approach.

    Science.gov (United States)

    Lewis, Gary J; Lefevre, Carmen E; Young, Andrew W

    2016-05-01

    Emotion recognition has been a focus of considerable attention for several decades. However, despite this interest, the underlying structure of individual differences in emotion recognition ability has been largely overlooked and thus is poorly understood. For example, limited knowledge exists concerning whether recognition ability for one emotion (e.g., disgust) generalizes to other emotions (e.g., anger, fear). Furthermore, it is unclear whether emotion recognition ability generalizes across modalities, such that those who are good at recognizing emotions from the face, for example, are also good at identifying emotions from nonfacial cues (such as cues conveyed via the body). The primary goal of the current set of studies was to address these questions through establishing the structure of individual differences in visual emotion recognition ability. In three independent samples (Study 1: n = 640; Study 2: n = 389; Study 3: n = 303), we observed that the ability to recognize visually presented emotions is based on different sources of variation: a supramodal emotion-general factor, supramodal emotion-specific factors, and face- and within-modality emotion-specific factors. In addition, we found evidence that general intelligence and alexithymia were associated with supramodal emotion recognition ability. Autism-like traits, empathic concern, and alexithymia were independently associated with face-specific emotion recognition ability. These results (a) provide a platform for further individual differences research on emotion recognition ability, (b) indicate that differentiating levels within the architecture of emotion recognition ability is of high importance, and (c) show that the capacity to understand expressions of emotion in others is linked to broader affective and cognitive processes. (c) 2016 APA, all rights reserved.
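    The latent-structure idea in this record lends itself to a toy illustration: if each emotion's recognition accuracy draws on a shared source of variance, a one-factor model fitted to simulated scores recovers same-signed loadings for every emotion. The sample size, mixing weights, and emotion list below are invented for illustration and are not taken from the study.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_people = 300
emotions = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

# Simulated supramodal emotion-general ability (illustrative weights):
# each per-emotion accuracy mixes the shared factor with unique variance.
general = rng.normal(size=n_people)
scores = np.column_stack(
    [0.6 * general + 0.8 * rng.normal(size=n_people) for _ in emotions]
)

# A single common factor should load on all six emotions in the same direction,
# mirroring the "emotion-general" component of the reported structure.
fa = FactorAnalysis(n_components=1, random_state=0).fit(scores)
loadings = fa.components_[0]
```

    Adding emotion-specific or modality-specific variance components to the simulation would require a bifactor model, which is beyond this sketch.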

  17. Congruent bodily arousal promotes the constructive recognition of emotional words.

    Science.gov (United States)

    Kever, Anne; Grynberg, Delphine; Vermeulen, Nicolas

    2017-08-01

    Considerable research has shown that bodily states shape affect and cognition. Here, we examined whether transient states of bodily arousal influence the categorization speed of high-arousal, low-arousal, and neutral words. Participants completed two blocks of a constructive recognition task, once after a cycling session (increased arousal) and once after a relaxation session (reduced arousal). Results revealed overall faster response times for high-arousal compared to low-arousal words, and for positive compared to negative words. Importantly, low-arousal words were categorized significantly faster after the relaxation session than after the cycling session, suggesting that a decrease in bodily arousal promotes the recognition of stimuli matching one's current arousal state. These findings highlight the importance of the arousal dimension in emotional processing and suggest the presence of arousal-congruency effects. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Emotion recognition and social cognition in temporal lobe epilepsy and the effect of epilepsy surgery.

    Science.gov (United States)

    Amlerova, Jana; Cavanna, Andrea E; Bradac, Ondrej; Javurkova, Alena; Raudenska, Jaroslava; Marusic, Petr

    2014-07-01

    The abilities to identify facial expressions from another person's face and to attribute mental states to others rely on preserved function of the temporal lobes. In the present study, we set out to evaluate emotion recognition and social cognition in presurgical and postsurgical patients with unilateral refractory temporal lobe epilepsy (TLE). The aim of our study was to investigate the effects of TLE surgery and to identify the main risk factors for impairment in these functions. We recruited 30 patients with TLE for longitudinal data analysis (14 with right-sided and 16 with left-sided TLE) and 74 patients for cross-sectional data analysis (37 with right-sided and 37 with left-sided TLE), plus 20 healthy controls. Besides standard neuropsychological assessment, we administered an analog of the Ekman and Friesen test and the Faux Pas Test to assess emotion recognition and social cognition, respectively. Both emotion recognition and social cognition were impaired in the group of patients with TLE, irrespective of the focus side, compared with healthy controls. The performance in both tests was strongly dependent on intelligence level. Beyond intelligence level, earlier age at epilepsy onset, longer disease duration, and a history of early childhood brain injury predicted social cognition problems in patients with TLE. Epilepsy surgery within the temporal lobe seems to have a neutral effect on patients' performances in both domains. However, there are a few individual patients who appear to be at risk of postoperative decline, even when seizure freedom is achieved following epilepsy surgery. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. COGNITIVE STYLE OF A PERSON AS A FACTOR OF EFFECTIVE EMOTION RECOGNITION

    Directory of Open Access Journals (Sweden)

    E V Belovol

    2015-12-01

    Full Text Available Facial expression is one of the most informative sources of non-verbal information. Early studies on the ability to recognize emotions from the face pointed to the universality of emotion expression and recognition. More recent studies have shown a combination of universal mechanisms and culture-specific patterns. The process of emotion recognition is based on face perception, which is why the in-group effect should be taken into consideration. The in-group advantage hypothesis posits that observers are more accurate at recognizing facial expressions displayed by members of their own culture than those displayed by members of other cultures. On the other hand, the process of emotion recognition is determined by cognitive features such as cognitive style. This article describes approaches to emotion expression and recognition and culture-specific features of basic emotion expression. It also describes factors related to the recognition of basic emotions by people from different cultures. It was discovered that field-independent people are more accurate in emotion recognition than field-dependent people because they are able to distinguish markers of emotions. No correlation was found between successful emotion recognition and the observers' gender or race.

  20. Play it again, Sam: brain correlates of emotional music recognition.

    Science.gov (United States)

    Altenmüller, Eckart; Siggel, Susann; Mohammadi, Bahram; Samii, Amir; Münte, Thomas F

    2014-01-01

    Music can elicit strong emotions and can be remembered in connection with these emotions even decades later. Yet, the brain correlates of episodic memory for highly emotional music compared with less emotional music have not been examined. We therefore used fMRI to investigate brain structures activated by emotional processing of short excerpts of film music successfully retrieved from episodic long-term memory. Eighteen non-musician volunteers were exposed to 60 structurally similar pieces of film music of 10 s length with high arousal ratings and either less positive or very positive valence ratings. Two similar sets of 30 pieces were created. Each of these was presented to half of the participants during the encoding session outside of the scanner, while all stimuli were used during the second recognition session inside the MRI-scanner. During fMRI each stimulation period (10 s) was followed by a 20 s resting period during which participants pressed either the "old" or the "new" button to indicate whether they had heard the piece before. Musical stimuli vs. silence activated the bilateral superior temporal gyrus, right insula, right middle frontal gyrus, bilateral medial frontal gyrus and the left anterior cerebellum. Old pieces led to activation in the left medial dorsal thalamus and left midbrain compared to new pieces. For recognized vs. not recognized old pieces a focused activation in the right inferior frontal gyrus and the left cerebellum was found. Positive pieces activated the left medial frontal gyrus, the left precuneus, the right superior frontal gyrus, the left posterior cingulate, the bilateral middle temporal gyrus, and the left thalamus compared to less positive pieces. Specific brain networks related to memory retrieval and emotional processing of symphonic film music were identified. The results imply that the valence of a music piece is important for memory performance and is recognized very fast.

  1. Play it again Sam: Brain Correlates of Emotional Music Recognition

    Directory of Open Access Journals (Sweden)

    Eckart eAltenmüller

    2014-02-01

    Full Text Available Abstract Background: Music can elicit strong emotions and can be remembered in connection with these emotions even decades later. Yet, the brain correlates of episodic memory for highly emotional music compared with less emotional music have not been examined. We therefore used fMRI to investigate brain structures activated by emotional processing of short excerpts of film music successfully retrieved from episodic long-term memory. Methods: Eighteen non-musician volunteers were exposed to 60 structurally similar pieces of film music of 10 s length with high arousal ratings and either less positive or very positive valence ratings. Two similar sets of 30 pieces were created. Each of these was presented to half of the participants during the encoding session outside of the scanner, while all stimuli were used during the second recognition session inside the MRI-scanner. During fMRI each stimulation period (10 sec) was followed by a 20 sec resting period during which participants pressed either the "old" or the "new" button to indicate whether they had heard the piece before. Results: Musical stimuli vs. silence activated the bilateral superior temporal gyrus, right insula, right middle frontal gyrus, bilateral medial frontal gyrus and the left anterior cerebellum. Old pieces led to activation in the left medial dorsal thalamus and left midbrain compared to new pieces. For recognized vs. not recognized old pieces a focused activation in the right inferior frontal gyrus and the left cerebellum was found. Positive pieces activated the left medial frontal gyrus, the left precuneus, the right superior frontal gyrus, the left posterior cingulate, the bilateral middle temporal gyrus, and the left thalamus compared to less positive pieces. Conclusion: Specific brain networks related to memory retrieval and emotional processing of symphonic film music were identified. The results imply that the valence of a music piece is important for memory performance.

  2. Emotion Recognition and Social/Role Dysfunction in Non-Clinical Psychosis

    Science.gov (United States)

    Pelletier, Andrea L.; Dean, Derek J.; Lunsford-Avery, Jessica R.; Smith, Ashley K.; Orr, Joseph M.; Gupta, Tina; Millman, Zachary B.; Mittal, Vijay A.

    2013-01-01

    As researchers continue to understand non-clinical psychosis (NCP; brief psychotic-like experiences occurring in 5–7% of the general population; van Os et al., 2009), it is becoming evident that functioning deficits and facial emotion recognition (FER) impairment characterize this phenomenon. However, the extent to which these domains are related remains unclear. Social/role functioning and FER were assessed in 65 adolescents/young adults exhibiting Low and High-NCP. Results indicate that FER and social/role functioning deficits were present in the High-NCP group, and that the domains were associated in this group alone. Taken together, findings suggest that a core emotive deficit is tied to broader social/role dysfunction in NCP. PMID:23182437

  3. Altered emotional recognition and expression in patients with Parkinson’s disease

    Directory of Open Access Journals (Sweden)

    Jin Y

    2017-11-01

    Full Text Available Yazhou Jin,* Zhiqi Mao,* Zhipei Ling, Xin Xu, Zhiyuan Zhang, Xinguang Yu Department of Neurosurgery, People’s Liberation Army General Hospital, Beijing, People’s Republic of China *These authors contributed equally to this work Background: Parkinson’s disease (PD) patients exhibit deficits in emotional recognition and expression abilities, including emotional faces and voices. The aim of this study was to explore emotional processing in pre-deep brain stimulation (pre-DBS) PD patients using two sensory modalities (visual and auditory). Methods: Fifteen PD patients who needed DBS surgery and 15 healthy, age- and gender-matched controls were recruited as participants. All participants were assessed by the Karolinska Directed Emotional Faces database 50 Faces Recognition test. Vocal recognition was evaluated by the Montreal Affective Voices database 50 Voices Recognition test. For emotional facial expression, the participants were asked to imitate five basic emotions (neutral, happiness, anger, fear, and sadness). The subjects were required to express nonverbal vocalizations of the five basic emotions. Fifteen Chinese native speakers were recruited as decoders. We recorded the accuracy of the responses, reaction time, and confidence level. Results: For emotional recognition and expression, the PD group scored lower on both facial and vocal emotional processing than did the healthy control group. There were significant differences between the two groups in both reaction time and confidence level. A significant relationship was also found between emotional recognition and emotional expression when considering all participants from the two groups together. Conclusion: The PD group exhibited poorer performance on both the recognition and expression tasks. Facial emotion deficits and vocal emotion abnormalities were associated with each other. In addition, our data allow us to speculate that emotional recognition and expression may share a common

  4. The use of the Emotional-Object Recognition as an assay to assess learning and memory associated to an aversive stimulus in rodents.

    Science.gov (United States)

    Brancato, Anna; Lavanco, Gianluca; Cavallaro, Angela; Plescia, Fulvio; Cannizzaro, Carla

    2016-12-01

    Emotionally salient experiences induce the formation of explicit memory traces, besides eliciting automatic or implicit emotional memory in rodents. This study aims at investigating the implementation of a novel task for studying the formation of limbic memory engrams as a result of the acquisition and retrieval of fear-conditioning-biased declarative memory traces, measured by the animals' discrimination of an "emotional object". Moreover, using this new method we investigated the potential interactions between stimulation of cannabinoid transmission and the integration of emotional information and cognitive functioning. The Emotional-Object Recognition task is composed of three consecutive sessions: habituation; cued fear-conditioned learning; and emotional recognition. Rats are exposed to Context "B chamber" for habituation and cued fear-conditioning, and tested in Context "A chamber" for emotional-object recognition. Cued fear-conditioning induces a reduction in emotional-object exploration time during the Emotional-Object Recognition task in controls. The activation of cannabinoid signalling impairs limbic memory formation, with respect to vehicle. The Emotional-Object Recognition test overcomes several limitations of commonly employed methods that explore declarative memory, spatial memory and fear-conditioning in a non-integrated manner. It allows the assessment of unbiased cognitive indicators of emotional learning and memory. The Emotional-Object Recognition task is a valuable tool for investigating whether, and to what extent, specific drugs or pathological conditions that interfere with the individual's affective/emotional homeostasis can modulate the formation of emotionally salient explicit memory traces, thus jeopardizing control and regulation of the animal's behavioural strategy. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Subthalamic nucleus stimulation impairs emotional conflict adaptation in Parkinson's disease.

    Science.gov (United States)

    Irmen, Friederike; Huebl, Julius; Schroll, Henning; Brücke, Christof; Schneider, Gerd-Helge; Hamker, Fred H; Kühn, Andrea A

    2017-10-01

    The subthalamic nucleus (STN) occupies a strategic position in the motor network, slowing down responses in situations with conflicting perceptual input. Recent evidence suggests a role of the STN in emotion processing through strong connections with emotion recognition structures. As deep brain stimulation (DBS) of the STN in patients with Parkinson's disease (PD) inhibits monitoring of perceptual and value-based conflict, STN DBS may also interfere with emotional conflict processing. To assess a possible interference of STN DBS with emotional conflict processing, we used an emotional Stroop paradigm. Subjects categorized face stimuli according to their emotional expression while ignoring emotionally congruent or incongruent superimposed word labels. Eleven PD patients ON and OFF STN DBS and eleven age-matched healthy subjects performed the task. We found conflict-induced response slowing in healthy controls and PD patients OFF DBS, but not ON DBS, suggesting that STN DBS decreases adaptation to within-trial conflict. OFF DBS, patients showed more conflict-induced slowing for negative conflict stimuli, an effect that was diminished by STN DBS. Computational modelling of STN influence on conflict adaptation indicated that DBS interferes via increased baseline activity. © The Author (2017). Published by Oxford University Press.

  6. Biased emotional recognition in depression: perception of emotions in music by depressed patients.

    Science.gov (United States)

    Punkanen, Marko; Eerola, Tuomas; Erkkilä, Jaakko

    2011-04-01

    Depression is a highly prevalent mood disorder that impairs a person's social skills and quality of life. Populations affected by depression also suffer from a higher mortality rate. Depression affects a person's ability to recognize emotions. We designed a novel experiment to test the hypothesis that depressed patients show a judgment bias towards negative emotions. To investigate how depressed patients differ in their perception of emotions conveyed by musical examples, both healthy (n=30) and depressed (n=79) participants were presented with a set of 30 musical excerpts, representing one of five basic target emotions, and asked to rate each excerpt using five Likert scales that represented the amount of each one of those same emotions perceived in the example. Depressed patients showed moderate but consistent negative self-report biases, both in the overall use of the scales and in their application to certain target emotions, when compared to healthy controls. Also, the severity of the clinical state (depression, anxiety and alexithymia) had an effect on the self-report biases for both positive and negative emotion ratings, particularly depression and alexithymia. Only musical stimuli were used, and they were all clear examples of one of the basic emotions of happiness, sadness, fear, anger and tenderness. No neutral or ambiguous excerpts were included. Depressed patients' negative emotional bias was demonstrated using musical stimuli. This suggests that the evaluation of emotional qualities in music could become a means to discriminate between depressed and non-depressed subjects. The practical implications of the present study relate both to diagnostic uses of such perceptual evaluations and to a better understanding of the emotional regulation strategies of the patients. Copyright © 2010 Elsevier B.V. All rights reserved.

  7. Recognition of Emotions in Mexican Spanish Speech: An Approach Based on Acoustic Modelling of Emotion-Specific Vowels

    Directory of Open Access Journals (Sweden)

    Santiago-Omar Caballero-Morales

    2013-01-01

    Full Text Available An approach for the recognition of emotions in speech is presented. The target language is Mexican Spanish, and for this purpose a speech database was created. The approach consists in the phoneme acoustic modelling of emotion-specific vowels. For this, a standard phoneme-based Automatic Speech Recognition (ASR) system was built with Hidden Markov Models (HMMs), where different phoneme HMMs were built for the consonants and emotion-specific vowels associated with four emotional states (anger, happiness, neutral, sadness). Then, estimation of the emotional state from a spoken sentence is performed by counting the number of emotion-specific vowels found in the ASR’s output for the sentence. With this approach, accuracy of 87–100% was achieved for the recognition of emotional state of Mexican Spanish speech.
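    The counting step of this approach is simple enough to sketch. Assuming a decoder whose vowel models carry per-emotion tags (the `a_anger`-style naming and the fallback to neutral are illustrative assumptions, not details from the record), the estimated emotional state is the majority emotion among the decoded vowels:

```python
from collections import Counter

EMOTIONS = {"anger", "happiness", "neutral", "sadness"}

def estimate_emotion(decoded_phonemes):
    """Majority vote over emotion-tagged vowels in the ASR output.

    Consonant models carry no emotion tag and are skipped; an output with
    no emotion-specific vowels falls back to 'neutral' (an illustrative
    tie-breaking choice, not specified in the record)."""
    counts = Counter()
    for phoneme in decoded_phonemes:
        if "_" in phoneme:
            emotion = phoneme.rsplit("_", 1)[1]
            if emotion in EMOTIONS:
                counts[emotion] += 1
    return counts.most_common(1)[0][0] if counts else "neutral"

# hypothetical decoder output for one spoken sentence
print(estimate_emotion(["k", "a_anger", "s", "a_anger", "t", "e_neutral"]))  # prints: anger
```

    In the full system the tags would come from the HMM set itself, i.e. each emotion gets its own copy of every vowel model while consonants are shared.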

  8. Effects of acute psychosocial stress on neural activity to emotional and neutral faces in a face recognition memory paradigm.

    Science.gov (United States)

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2014-12-01

    Previous studies have shown that acute psychosocial stress impairs recognition of declarative memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala, which modulates memory processes in the hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoking male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner, which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants, but did not impact recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus, which was due to a stress-induced increase of neural activity to fearful faces and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with stress-induced privileged processing of emotional stimuli.

  9. Parents' Emotion-Related Beliefs, Behaviours, and Skills Predict Children's Recognition of Emotion

    Science.gov (United States)

    Castro, Vanessa L.; Halberstadt, Amy G.; Lozada, Fantasy T.; Craig, Ashley B.

    2015-01-01

    Children who are able to recognize others' emotions are successful in a variety of socioemotional domains, yet we know little about how school-aged children's abilities develop, particularly in the family context. We hypothesized that children develop emotion recognition skill as a function of parents' own emotion-related beliefs,…

  10. Emotion Recognition from Congruent and Incongruent Emotional Expressions and Situational Cues in Children with Autism Spectrum Disorder

    Science.gov (United States)

    Tell, Dina; Davidson, Denise

    2015-01-01

    In this research, the emotion recognition abilities of children with autism spectrum disorder and typically developing children were compared. When facial expressions and situational cues of emotion were congruent, accuracy in recognizing emotions was good for both children with autism spectrum disorder and typically developing children. When…

  11. Brief report: accuracy and response time for the recognition of facial emotions in a large sample of children with autism spectrum disorders.

    Science.gov (United States)

    Fink, Elian; de Rosnay, Marc; Wierda, Marlies; Koot, Hans M; Begeer, Sander

    2014-09-01

    The empirical literature has presented inconsistent evidence for deficits in the recognition of basic emotion expressions in children with autism spectrum disorders (ASD), which may be due to the focus on research with relatively small sample sizes. Additionally, it is proposed that although children with ASD may correctly identify emotion expressions, they rely on more deliberate, more time-consuming strategies in order to accurately recognize emotion expressions when compared to typically developing children. In the current study, we examine both emotion recognition accuracy and response time in a large sample of children, and explore the moderating influence of verbal ability on these findings. The sample consisted of 86 children with ASD (M age = 10.65) and 114 typically developing children (M age = 10.32) between 7 and 13 years of age. All children completed a pre-test (emotion word-word matching) and a test phase consisting of basic emotion recognition, whereby they were required to match a target emotion expression to the correct emotion word; accuracy and response time were recorded. Verbal IQ was controlled for in the analyses. We found no evidence of a systematic deficit in emotion recognition accuracy or response time for children with ASD, controlling for verbal ability. However, when controlling for children's accuracy in word-word matching, children with ASD had significantly lower emotion recognition accuracy when compared to typically developing children. The findings suggest that the social impairments observed in children with ASD are not the result of marked deficits in basic emotion recognition accuracy or longer response times. However, children with ASD may be relying on other perceptual skills (such as advanced word-word matching) to complete emotion recognition tasks at a similar level to typically developing children.

  12. Speech-based recognition of self-reported and observed emotion in a dimensional space

    NARCIS (Netherlands)

    Truong, Khiet Phuong; van Leeuwen, David A.; de Jong, Franciska M.G.

    2012-01-01

    The differences between self-reported and observed emotion have only marginally been investigated in the context of speech-based automatic emotion recognition. We address this issue by comparing self-reported emotion ratings to observed emotion ratings and look at how differences between these two

  13. Does Facial Expression Recognition Provide a Toehold for the Development of Emotion Understanding?

    Science.gov (United States)

    Strand, Paul S.; Downs, Andrew; Barbosa-Leiker, Celestina

    2016-01-01

    The authors explored predictions from basic emotion theory (BET) that facial emotion expression recognition skills are insular with respect to their own development, and yet foundational to the development of emotional perspective-taking skills. Participants included 417 preschool children for whom estimates of these 2 emotion understanding…

  14. Wanting it Too Much: An Inverse Relation Between Social Motivation and Facial Emotion Recognition in Autism Spectrum Disorder

    OpenAIRE

    Garman, Heather D.; Spaulding, Christine J.; Webb, Sara Jane; Mikami, Amori Yee; Morris, James P.; Lerner, Matthew D.

    2016-01-01

    This study examined social motivation and early-stage face perception as frameworks for understanding impairments in facial emotion recognition (FER) in a well-characterized sample of youth with autism spectrum disorders (ASD). Early-stage face perception (N170 event-related potential latency) was recorded while participants completed a standardized FER task, while social motivation was obtained via parent report. Participants with greater social motivation exhibited poorer FER, while those w...

  15. Emotional Abilities in Children with Oppositional Defiant Disorder (ODD): Impairments in Perspective-Taking and Understanding Mixed Emotions are Associated with High Callous-Unemotional Traits.

    Science.gov (United States)

    O'Kearney, Richard; Salmon, Karen; Liwag, Maria; Fortune, Clare-Ann; Dawel, Amy

    2017-04-01

Most studies of emotion abilities in disruptive children focus on emotion expression recognition. This study compared 74 children aged 4-8 years with ODD to 45 comparison children (33 healthy; 12 with an anxiety disorder) on behaviourally assessed measures of emotion perception, emotion perspective-taking, knowledge of emotion causes and understanding of ambivalent emotions, and on parent-reported cognitive and affective empathy. Adjusting for child's sex, age and expressive language, ODD children showed a paucity in attributing causes to emotions but no other deficits relative to the comparison groups. ODD boys with high levels of callous-unemotional (CU) traits (n = 22) showed deficits relative to low-CU ODD boys (n = 25) in emotion perspective-taking and in understanding ambivalent emotions. Low-CU ODD boys did not differ from the healthy typically developing boys (n = 12). Impairments in emotion perspective-taking and understanding mixed emotions in ODD boys are associated with the presence of a high level of CU traits.

  16. Impaired Integration of Emotional Faces and Affective Body Context in a Rare Case of Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Bentin, Shlomo

    2011-01-01

In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG's agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally-congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally-incongruent bodies. The results are interpreted in the context of a general integration deficit in developmental visual agnosia, suggesting that impaired integration may extend from the level of the face to the level of the full person. PMID:21482423

  17. Sex differences in facial emotion recognition across varying expression intensity levels from videos.

    Science.gov (United States)

    Wingenbach, Tanja S H; Ashwin, Chris; Brosnan, Mark

    2018-01-01

    There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or 'extreme' examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1sec) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations.
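The unbiased hit rate (Hu) reported above is Wagner's measure that corrects raw accuracy for response bias: an emotion's squared hit count divided by the product of how often that emotion was presented and how often its response label was used. A minimal sketch with an invented toy confusion matrix (not data from the study):

```python
def unbiased_hit_rates(confusion):
    """Wagner's unbiased hit rate (Hu) per stimulus category.

    confusion[i][j] = number of trials on which stimulus i received label j.
    Hu_i = hits_i**2 / (row_total_i * column_total_i)
    """
    n = len(confusion)
    row_totals = [sum(row) for row in confusion]
    col_totals = [sum(confusion[i][j] for i in range(n)) for j in range(n)]
    return [
        (confusion[i][i] ** 2) / (row_totals[i] * col_totals[i])
        for i in range(n)
    ]

# Toy 3-emotion confusion matrix (rows: presented, columns: responded)
conf = [
    [8, 1, 1],   # anger
    [2, 7, 1],   # fear
    [0, 2, 8],   # sadness
]
print(unbiased_hit_rates(conf))  # [0.64, 0.49, 0.64]
```

Unlike a plain hit rate, Hu penalizes a participant who over-uses one response label, which is why it is preferred when comparing groups that may differ in response style.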

  18. Sex differences in facial emotion recognition across varying expression intensity levels from videos

    Science.gov (United States)

    2018-01-01

    There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or ‘extreme’ examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1sec) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations. PMID:29293674

  19. Sex differences in facial emotion recognition across varying expression intensity levels from videos.

    Directory of Open Access Journals (Sweden)

    Tanja S H Wingenbach

Full Text Available There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or 'extreme' examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1sec) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations.

  20. Taste and odor recognition memory: the emotional flavor of life.

    Science.gov (United States)

    Miranda, Maria Isabel

    2012-01-01

In recent years, our knowledge of the neurobiology of taste and smell has greatly increased; by using several learning models, we now have a better understanding of the behavioral and neurochemical basis of memory recognition. Studies have provided new evidence of some processes that depend on prior experience with the specific combination of sensory stimuli. This review contains recent research related to taste and odor recognition memory, and the goal is to highlight the role of two prominent brain structures, the insular cortex and the amygdala. These structures have an important function during learning and memory and have been associated with the differences in learning induced by the diverse degrees of emotion during taste/odor memory formation, either aversive or appetitive or when taste and odor are combined and/or potentiated. Therefore, this review includes information about certain neurochemical transmitters and their interactions during appetitive or aversive taste memory formation, taste-potentiated odor aversion memory, and conditioned odor aversion, which might be able to maintain the complex processes necessary for flavor recognition memory.

  1. Relationship between individual differences in functional connectivity and facial-emotion recognition abilities in adults with traumatic brain injury.

    Science.gov (United States)

    Rigon, A; Voss, M W; Turkstra, L S; Mutlu, B; Duff, M C

    2017-01-01

    Although several studies have demonstrated that facial-affect recognition impairment is common following moderate-severe traumatic brain injury (TBI), and that there are diffuse alterations in large-scale functional brain networks in TBI populations, little is known about the relationship between the two. Here, in a sample of 26 participants with TBI and 20 healthy comparison participants (HC) we measured facial-affect recognition abilities and resting-state functional connectivity (rs-FC) using fMRI. We then used network-based statistics to examine (A) the presence of rs-FC differences between individuals with TBI and HC within the facial-affect processing network, and (B) the association between inter-individual differences in emotion recognition skills and rs-FC within the facial-affect processing network. We found that participants with TBI showed significantly lower rs-FC in a component comprising homotopic and within-hemisphere, anterior-posterior connections within the facial-affect processing network. In addition, within the TBI group, participants with higher emotion-labeling skills showed stronger rs-FC within a network comprised of intra- and inter-hemispheric bilateral connections. Findings indicate that the ability to successfully recognize facial-affect after TBI is related to rs-FC within components of facial-affective networks, and provide new evidence that further our understanding of the mechanisms underlying emotion recognition impairment in TBI.
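Resting-state functional connectivity of the kind analysed above is typically operationalised as the Pearson correlation between the average BOLD time series of pairs of brain regions; network-based statistics then test connected components of the resulting matrix. A minimal, hypothetical sketch of the connectivity step (toy signals, not the study's pipeline):

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def connectivity_matrix(series):
    """series: list of per-region time series -> symmetric rs-FC matrix."""
    n = len(series)
    return [[pearson(series[i], series[j]) for j in range(n)]
            for i in range(n)]

# Toy signals for three "regions": two correlated, one anti-correlated
regions = [
    [0.0, 1.0, 0.0, -1.0, 0.0, 1.0],
    [0.1, 0.9, 0.1, -0.9, 0.0, 1.1],
    [0.0, -1.0, 0.0, 1.0, 0.0, -1.0],
]
fc = connectivity_matrix(regions)
print(round(fc[0][1], 2), round(fc[0][2], 2))
```

In practice the matrix is computed from fMRI region-of-interest signals and then thresholded; network-based statistics control family-wise error at the level of connected subnetworks rather than single edges.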

  2. Behavioral Biometrics in Assisted Living: A Methodology for Emotion Recognition

    Directory of Open Access Journals (Sweden)

    S. Xefteris

    2016-08-01

Full Text Available Behavioral biometrics aim at providing algorithms for the automatic recognition of individual behavioral traits, stemming from a person’s actions, attitude, expressions and conduct. In the field of ambient assisted living, behavioral biometrics find an important niche. Individuals suffering from the early stages of neurodegenerative diseases (MCI, Alzheimer’s, dementia) need supervision in their daily activities. In this context, an unobtrusive system to monitor subjects and alert formal and informal carers, providing information on both physical and emotional status, is of great importance and positively affects multiple stakeholders. The primary aim of this paper is to describe a methodology for recognizing the emotional status of a subject using facial expressions and to identify its uses, in conjunction with pre-existing risk-assessment methodologies, for its integration into the context of a smart monitoring system for subjects suffering from neurodegenerative diseases. Paul Ekman’s research provided the background on the universality of facial expressions as indicators of underlying emotions. The methodology then makes use of computational geometry, image processing and graph theory algorithms for the detection of regions of interest, and then a neural network is used for the final classification. Findings are coupled with previously published work for risk assessment and alert generation in the context of an ambient assisted living environment based on service-oriented architecture principles, aimed at remote web-based estimation of the cognitive and physical status of MCI and dementia patients.
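As a hypothetical illustration of the computational-geometry step described above (the landmark names, coordinates and feature are invented here, not the authors' algorithm), simple ratio features can be derived from 2D facial landmarks and then passed to a classifier such as a small neural network:

```python
import math

def dist(p, q):
    """Euclidean distance between two 2D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def mouth_aspect_ratio(landmarks):
    """Ratio of mouth-opening height to mouth width.

    landmarks: dict of 2D points keyed by hypothetical names.
    Geometric features like this one are scale-invariant inputs
    for a downstream emotion classifier.
    """
    width = dist(landmarks["mouth_left"], landmarks["mouth_right"])
    height = dist(landmarks["lip_top"], landmarks["lip_bottom"])
    return height / width

# Invented landmark coordinates for two expressions
neutral = {"mouth_left": (0, 0), "mouth_right": (60, 0),
           "lip_top": (30, -2), "lip_bottom": (30, 2)}
surprised = {"mouth_left": (5, 0), "mouth_right": (55, 0),
             "lip_top": (30, -15), "lip_bottom": (30, 15)}
print(mouth_aspect_ratio(neutral), mouth_aspect_ratio(surprised))
```

Because the feature is a ratio of distances, it is robust to face size and camera distance, which is one reason landmark-geometry pipelines favour such measures over raw pixel values.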

  3. Recognition of facial expressions of emotion by adults with intellectual disability: Is there evidence for the emotion specificity hypothesis?

    Science.gov (United States)

    Scotland, Jennifer L; McKenzie, Karen; Cossar, Jill; Murray, Aja; Michie, Amanda

    2016-01-01

    This study aimed to evaluate the emotion recognition abilities of adults (n=23) with an intellectual disability (ID) compared with a control group of children (n=23) without ID matched for estimated cognitive ability. The study examined the impact of: task paradigm, stimulus type and preferred processing style (global/local) on accuracy. We found that, after controlling for estimated cognitive ability, the control group performed significantly better than the individuals with ID. This provides some support for the emotion specificity hypothesis. Having a more local processing style did not significantly mediate the relation between having ID and emotion recognition, but did significantly predict emotion recognition ability after controlling for group. This suggests that processing style is related to emotion recognition independently of having ID. The availability of contextual information improved emotion recognition for people with ID when compared with line drawing stimuli, and identifying a target emotion from a choice of two was relatively easier for individuals with ID, compared with the other task paradigms. The results of the study are considered in the context of current theories of emotion recognition deficits in individuals with ID. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Impaired autonomic responses to emotional stimuli in autoimmune limbic encephalitis

    Directory of Open Access Journals (Sweden)

    Olga eSchröder

    2015-11-01

Full Text Available Limbic encephalitis (LE) is an autoimmune-mediated disorder that affects structures of the limbic system, in particular the amygdala. The amygdala constitutes a brain area substantial for processing of emotional, especially fear-related signals. The amygdala is also involved in neuroendocrine and autonomic functions, including skin conductance responses (SCRs) to emotionally arousing stimuli. This study investigates behavioral and autonomic responses to discrete emotion-evoking and neutral film clips in a patient suffering from LE associated with contactin-associated protein-2 (CASPR2) antibodies as compared to a healthy control group. Results show a lack of SCRs in the patient while watching the film clips, with significant differences compared to healthy controls in the case of fear-inducing videos. There was no comparable impairment in behavioral data (emotion report, valence and arousal ratings). The results point to a defective modulation of sympathetic responses during emotional stimulation in patients with LE, probably due to impaired functioning of the amygdala.

  5. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    Directory of Open Access Journals (Sweden)

    Martin Wegrzyn

Full Text Available Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.

  6. Mapping the emotional face. How individual face parts contribute to successful emotion recognition

    Science.gov (United States)

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921
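A tile's diagnostic value in a paradigm like the one above can be approximated by the proportion of correct responses among trials on which that tile had been uncovered. A minimal sketch with an invented trial format (not the authors' exact metric):

```python
def tile_importance(trials, n_tiles):
    """Per-tile recognition score.

    trials: list of (visible_tiles_set, correct_bool) pairs, where
    visible_tiles_set holds the indices of tiles uncovered on that trial.
    Returns, for each tile, the fraction of correct responses among
    the trials on which it was visible (0.0 if never shown).
    """
    scores = []
    for t in range(n_tiles):
        shown = [correct for visible, correct in trials if t in visible]
        scores.append(sum(shown) / len(shown) if shown else 0.0)
    return scores

# Invented data: 4 tiles, 4 trials
trials = [
    ({0, 1}, True),   # eye-region tiles visible -> recognized
    ({0, 2}, True),
    ({2, 3}, False),  # only lower-face tiles -> missed
    ({1, 3}, True),
]
print(tile_importance(trials, 4))  # [1.0, 1.0, 0.5, 0.5]
```

Aggregated over many trials and participants, such per-tile scores can be rendered as a heat map over the face, which is how importance maps of the eye and mouth regions are typically visualized.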

  7. Below and beyond the recognition of emotional facial expressions in alcohol dependence: from basic perception to social cognition.

    Science.gov (United States)

    D'Hondt, Fabien; Campanella, Salvatore; Kornreich, Charles; Philippot, Pierre; Maurage, Pierre

    2014-01-01

Studies that have carried out experimental evaluation of emotional skills in alcohol-dependence have, up to now, been mainly focused on the exploration of emotional facial expressions (EFE) decoding. In the present paper, we provide some complements to the recent systematic literature review published by Donadon and de Lima Osório on this crucial topic. We also suggest research avenues that must be, in our opinion, considered in the coming years. More precisely, we propose, first, that a battery integrating a set of emotional tasks relating to different processes should be developed to better systemize EFE decoding measures in alcohol-dependence. Second, we propose to go below EFE recognition deficits and to seek the roots of those alterations, particularly by investigating the putative role played by early visual processing and vision-emotion interactions in the emotional impairment observed in alcohol-dependence. Third, we insist on the need to go beyond EFE recognition deficits by suggesting that they only constitute a part of wider emotional deficits in alcohol-dependence. Importantly, since the efficient decoding of emotions is a crucial ability for the development and maintenance of satisfactory interpersonal relationships, we suggest that disruption of this ability in alcohol-dependent individuals may have adverse consequences for their social integration. One way to achieve this research agenda would be to develop the field of affective and social neuroscience of alcohol-dependence, which could ultimately lead to major advances at both theoretical and therapeutic levels.

  8. Age-related differences in emotion recognition ability: a cross-sectional study.

    Science.gov (United States)

    Mill, Aire; Allik, Jüri; Realo, Anu; Valk, Raivo

    2009-10-01

Experimental studies indicate that recognition of emotions, particularly negative emotions, decreases with age. However, there is no consensus at which age the decrease in emotion recognition begins, how selective this is to negative emotions, and whether this applies to both facial and vocal expression. In the current cross-sectional study, 607 participants ranging in age from 18 to 84 years (mean age = 32.6 +/- 14.9 years) were asked to recognize emotions expressed either facially or vocally. In general, older participants were found to be less accurate at recognizing emotions, with the most distinctive age difference pertaining to a certain group of negative emotions. Both modalities revealed an age-related decline in the recognition of sadness and, to a lesser degree, anger, starting at about 30 years of age. Although age-related differences in the recognition of expression of emotion were not mediated by personality traits, 2 of the Big 5 traits, openness and conscientiousness, made an independent contribution to emotion-recognition performance. Implications of age-related differences in facial and vocal emotion expression and early onset of the selective decrease in emotion recognition are discussed in terms of previous findings and relevant theoretical models.

  9. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits.

    Science.gov (United States)

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving body patch-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  10. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    Directory of Open Access Journals (Sweden)

    Rossana eActis-Grosso

    2015-10-01

Full Text Available We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or conveyed by moving bodies (patch-light displays, PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  11. Rapid Presentation of Emotional Expressions Reveals New Emotional Impairments in Tourette’s Syndrome

    Directory of Open Access Journals (Sweden)

    Martial eMermillod

    2013-04-01

Full Text Available Objective: Based on a variety of empirical evidence obtained within the theoretical framework of embodiment theory, we considered it likely that motor disorders in Tourette’s syndrome (TS) would have emotional consequences for TS patients. However, previous research using emotional facial categorization tasks suggests that these consequences are limited to TS patients with obsessive-compulsive behaviors (OCB). Method: These studies used long stimulus presentations which allowed the participants to categorize the different emotional facial expressions (EFEs) on the basis of a perceptual analysis that might potentially hide a lack of emotional feeling for certain emotions. In order to reduce this perceptual bias, we used a rapid visual presentation procedure. Results: Using this new experimental method, we revealed different and surprising impairments on several EFEs in TS patients compared to matched healthy control participants. Moreover, a spatial frequency analysis of the visual signal processed by the patients suggests that these impairments may be located at a cortical level. Conclusions: The current study indicates that the rapid visual presentation paradigm makes it possible to identify various potential emotional disorders that were not revealed by the standard visual presentation procedures previously reported in the literature. Moreover, the spatial frequency analysis performed in our study suggests that the emotional deficit in TS might lie at the level of temporal cortical areas dedicated to the processing of high spatial frequency (HSF) visual information.

  12. Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality

    Directory of Open Access Journals (Sweden)

    Dhwani Mehta

    2018-02-01

Full Text Available Extensive possibilities of applications have made emotion recognition ineluctable and challenging in the field of computer science. The use of non-verbal cues such as gestures, body movement, and facial expressions convey the feeling and the feedback to the user. This discipline of Human–Computer Interaction places reliance on the algorithmic robustness and the sensitivity of the sensor to ameliorate the recognition. Sensors play a significant role in accurate detection by providing a very high-quality input, hence increasing the efficiency and the reliability of the system. Automatic recognition of human emotions would help in teaching social intelligence in the machines. This paper presents a brief study of the various approaches and the techniques of emotion recognition. The survey covers a succinct review of the databases that are considered as data sets for algorithms detecting the emotions by facial expressions. Later, mixed reality device Microsoft HoloLens (MHL) is introduced for observing emotion recognition in Augmented Reality (AR). A brief introduction of its sensors, their application in emotion recognition and some preliminary results of emotion recognition using MHL are presented. The paper then concludes by comparing results of emotion recognition by the MHL and a regular webcam.

  13. Towards Real-Time Speech Emotion Recognition for Affective E-Learning

    Science.gov (United States)

    Bahreini, Kiavash; Nadolski, Rob; Westera, Wim

    2016-01-01

    This paper presents the voice emotion recognition part of the FILTWAM framework for real-time emotion recognition in affective e-learning settings. FILTWAM (Framework for Improving Learning Through Webcams And Microphones) intends to offer timely and appropriate online feedback based upon learner's vocal intonations and facial expressions in order…

  14. The Moving Window Technique: A Window into Developmental Changes in Attention during Facial Emotion Recognition

    Science.gov (United States)

    Birmingham, Elina; Meixner, Tamara; Iarocci, Grace; Kanan, Christopher; Smilek, Daniel; Tanaka, James W.

    2013-01-01

    The strategies children employ to selectively attend to different parts of the face may reflect important developmental changes in facial emotion recognition. Using the Moving Window Technique (MWT), children aged 5-12 years and adults ("N" = 129) explored faces with a mouse-controlled window in an emotion recognition task. An…

  15. Facial Emotion Recognition: A Survey and Real-World User Experiences in Mixed Reality.

    Science.gov (United States)

    Mehta, Dhwani; Siddiqui, Mohammad Faridul Haque; Javaid, Ahmad Y

    2018-02-01

    Extensive possibilities of applications have made emotion recognition ineluctable and challenging in the field of computer science. The use of non-verbal cues such as gestures, body movement, and facial expressions convey the feeling and the feedback to the user. This discipline of Human-Computer Interaction places reliance on the algorithmic robustness and the sensitivity of the sensor to ameliorate the recognition. Sensors play a significant role in accurate detection by providing a very high-quality input, hence increasing the efficiency and the reliability of the system. Automatic recognition of human emotions would help in teaching social intelligence in the machines. This paper presents a brief study of the various approaches and the techniques of emotion recognition. The survey covers a succinct review of the databases that are considered as data sets for algorithms detecting the emotions by facial expressions. Later, mixed reality device Microsoft HoloLens (MHL) is introduced for observing emotion recognition in Augmented Reality (AR). A brief introduction of its sensors, their application in emotion recognition and some preliminary results of emotion recognition using MHL are presented. The paper then concludes by comparing results of emotion recognition by the MHL and a regular webcam.

  16. Influence of Emotional Facial Expressions on 3-5-Year-Olds' Face Recognition

    Science.gov (United States)

    Freitag, Claudia; Schwarzer, Gudrun

    2011-01-01

    Three experiments examined 3- and 5-year-olds' recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression…

  17. Social power and recognition of emotional prosody: High power is associated with lower recognition accuracy than low power.

    Science.gov (United States)

    Uskul, Ayse K; Paulmann, Silke; Weick, Mario

    2016-02-01

    Listeners have to pay close attention to a speaker's tone of voice (prosody) during daily conversations. This is particularly important when trying to infer the emotional state of the speaker. Although a growing body of research has explored how emotions are processed from speech in general, little is known about how psychosocial factors such as social power can shape the perception of vocal emotional attributes. Thus, the present studies explored how social power affects emotional prosody recognition. In a correlational study (Study 1) and an experimental study (Study 2), we show that high power is associated with lower accuracy in emotional prosody recognition than low power. These results, for the first time, suggest that individuals experiencing high or low power perceive emotional tone of voice differently. (c) 2016 APA, all rights reserved.

  18. Selective attention to emotional cues and emotion recognition in healthy subjects: the role of mineralocorticoid receptor stimulation.

    Science.gov (United States)

    Schultebraucks, Katharina; Deuter, Christian E; Duesenberg, Moritz; Schulze, Lars; Hellmann-Regen, Julian; Domke, Antonia; Lockenvitz, Lisa; Kuehl, Linn K; Otte, Christian; Wingenfeld, Katja

    2016-09-01

    Selective attention toward emotional cues and emotion recognition of facial expressions are important aspects of social cognition. Stress modulates social cognition through cortisol, which acts on glucocorticoid (GR) and mineralocorticoid receptors (MR) in the brain. We examined the role of MR activation on attentional bias toward emotional cues and on emotion recognition. We included 40 healthy young women and 40 healthy young men (mean age 23.9 ± 3.3), who either received 0.4 mg of the MR agonist fludrocortisone or placebo. A dot-probe paradigm was used to test for attentional biases toward emotional cues (happy and sad faces). Moreover, we used a facial emotion recognition task to investigate the ability to recognize emotional valence (anger and sadness) from facial expression in four graded categories of emotional intensity (20, 30, 40, and 80 %). In the emotional dot-probe task, we found a main effect of treatment and a treatment × valence interaction. Post hoc analyses revealed an attentional bias away from sad faces after placebo intake and a shift in selective attention toward sad faces after fludrocortisone intake. We found no attentional bias toward happy faces after fludrocortisone or placebo intake. In the facial emotion recognition task, there was no main effect of treatment. MR stimulation seems to be important in modulating quick, automatic emotional processing, i.e., a shift in selective attention toward negative emotional cues. Our results confirm and extend previous findings of MR function. However, we did not find an effect of MR stimulation on emotion recognition.
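    The attentional-bias score derived from a dot-probe paradigm such as the one described above can be illustrated with a small, hypothetical computation. The function and all reaction times below are invented for illustration; the score is simply the mean reaction time when the probe replaces the neutral face minus the mean when it replaces the emotional face, so a positive value indicates attention shifted toward the emotional cue:

    ```python
    # Minimal sketch (hypothetical data) of a dot-probe attentional-bias index.
    from statistics import mean

    def attentional_bias(congruent_rts_ms, incongruent_rts_ms):
        """Bias score = mean RT(incongruent) - mean RT(congruent), in ms."""
        return mean(incongruent_rts_ms) - mean(congruent_rts_ms)

    # Hypothetical reaction times (ms) for probes after sad-neutral face pairs.
    congruent = [512, 498, 530, 505]    # probe appeared at the sad face's location
    incongruent = [540, 525, 551, 534]  # probe appeared at the neutral face's location
    print(attentional_bias(congruent, incongruent))  # positive -> bias toward sad faces
    ```

    A negative score on the same index would correspond to the "attentional bias away from sad faces" reported under placebo.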

  19. Right Limbic FDG-PET Hypometabolism Correlates with Emotion Recognition and Attribution in Probable Behavioral Variant of Frontotemporal Dementia Patients.

    Directory of Open Access Journals (Sweden)

    Chiara Cerami

    Full Text Available The behavioural variant of frontotemporal dementia (bvFTD) is a rare disease mainly affecting the social brain. FDG-PET fronto-temporal hypometabolism is a supportive feature for the diagnosis. It may also provide specific functional metabolic signatures for altered socio-emotional processing. In this study, we evaluated the emotion recognition and attribution deficits and FDG-PET cerebral metabolic patterns at the group and individual levels in a sample of sporadic bvFTD patients, exploring the cognitive-functional correlations. Seventeen probable mild bvFTD patients (10 male and 7 female; age 67.8±9.9) were administered standardized and validated versions of social cognition tasks assessing the recognition of basic emotions and the attribution of emotions and intentions (i.e., Ekman 60-Faces test-Ek60F and Story-based Empathy task-SET). FDG-PET was analysed using an optimized voxel-based SPM method at the single-subject and group levels. Severe deficits of emotion recognition and processing characterized the bvFTD condition. At the group level, metabolic dysfunction in the right amygdala, temporal pole, and middle cingulate cortex was highly correlated to the emotional recognition and attribution performances. At the single-subject level, however, heterogeneous impairments of social cognition tasks emerged, and different metabolic patterns, involving limbic structures and prefrontal cortices, were also observed. The derangement of a right limbic network is associated with altered socio-emotional processing in bvFTD patients, but different hypometabolic FDG-PET patterns and heterogeneous performances on social tasks at an individual level exist.

  20. An investigation of the effect of race-based social categorization on adults’ recognition of emotion

    Science.gov (United States)

    Reyes, B. Nicole; Segal, Shira C.

    2018-01-01

    Emotion recognition is important for social interaction and communication, yet previous research has identified a cross-cultural emotion recognition deficit: Recognition is less accurate for emotions expressed by individuals from a cultural group different than one’s own. The current study examined whether social categorization based on race, in the absence of cultural differences, influences emotion recognition in a diverse context. South Asian and White Canadians in the Greater Toronto Area completed an emotion recognition task that required them to identify the seven basic emotional expressions when posed by members of the same two groups, allowing us to tease apart the contributions of culture and social group membership. Contrary to our hypothesis, there was no mutual in-group advantage in emotion recognition: Participants were not more accurate at recognizing emotions posed by their respective racial in-groups. Both groups were more accurate at recognizing expressions when posed by South Asian faces, and White participants were more accurate overall compared to South Asian participants. These results suggest that in a diverse environment, categorization based on race alone does not lead to the creation of social out-groups in a way that negatively impacts emotion recognition. PMID:29474367

  1. An investigation of the effect of race-based social categorization on adults' recognition of emotion.

    Directory of Open Access Journals (Sweden)

    B Nicole Reyes

    Full Text Available Emotion recognition is important for social interaction and communication, yet previous research has identified a cross-cultural emotion recognition deficit: Recognition is less accurate for emotions expressed by individuals from a cultural group different than one's own. The current study examined whether social categorization based on race, in the absence of cultural differences, influences emotion recognition in a diverse context. South Asian and White Canadians in the Greater Toronto Area completed an emotion recognition task that required them to identify the seven basic emotional expressions when posed by members of the same two groups, allowing us to tease apart the contributions of culture and social group membership. Contrary to our hypothesis, there was no mutual in-group advantage in emotion recognition: Participants were not more accurate at recognizing emotions posed by their respective racial in-groups. Both groups were more accurate at recognizing expressions when posed by South Asian faces, and White participants were more accurate overall compared to South Asian participants. These results suggest that in a diverse environment, categorization based on race alone does not lead to the creation of social out-groups in a way that negatively impacts emotion recognition.

  2. An investigation of the effect of race-based social categorization on adults' recognition of emotion.

    Science.gov (United States)

    Reyes, B Nicole; Segal, Shira C; Moulson, Margaret C

    2018-01-01

    Emotion recognition is important for social interaction and communication, yet previous research has identified a cross-cultural emotion recognition deficit: Recognition is less accurate for emotions expressed by individuals from a cultural group different than one's own. The current study examined whether social categorization based on race, in the absence of cultural differences, influences emotion recognition in a diverse context. South Asian and White Canadians in the Greater Toronto Area completed an emotion recognition task that required them to identify the seven basic emotional expressions when posed by members of the same two groups, allowing us to tease apart the contributions of culture and social group membership. Contrary to our hypothesis, there was no mutual in-group advantage in emotion recognition: Participants were not more accurate at recognizing emotions posed by their respective racial in-groups. Both groups were more accurate at recognizing expressions when posed by South Asian faces, and White participants were more accurate overall compared to South Asian participants. These results suggest that in a diverse environment, categorization based on race alone does not lead to the creation of social out-groups in a way that negatively impacts emotion recognition.

  3. Reading Emotions from Body Movement: A Generalized Impairment in Schizophrenia.

    Science.gov (United States)

    Vaskinn, Anja; Sundet, Kjetil; Østefjells, Tiril; Nymo, Katharina; Melle, Ingrid; Ueland, Torill

    2015-01-01

    Body language reading is a social cognitive process with importance for successful maneuvering of social situations. In this study, we investigated body language reading as assessed with human point-light displays in participants with a diagnosis of schizophrenia (n = 84) compared to healthy control participants (n = 84), aiming to answer three questions: (1) whether persons with a diagnosis of schizophrenia have poorer body language reading abilities than healthy persons; (2) whether some emotions are easier to read from body language than others, and if this is the same for individuals with schizophrenia and healthy individuals, and (3) whether there are sex differences in body language reading in participants with schizophrenia and healthy participants. A fourth research aim concerned associations of body language reading with symptoms and functioning in participants with schizophrenia. Scores on the body language reading measure were first standardized using a separate sample of healthy control participants (n = 101). Further results showed that persons with schizophrenia had impaired body language reading ability compared to healthy persons. A significant effect of emotion indicated that some emotions (happiness, neutral) were easier to recognize and this was so for both individuals with schizophrenia and healthy individuals. There were no sex differences for either diagnostic group. Body language reading ability was not associated with symptoms or functioning. In conclusion, schizophrenia was characterized by a global impairment in body language reading that was present for all emotions and across sex.

  4. Effects of mild cognitive impairment on emotional scene memory.

    Science.gov (United States)

    Waring, J D; Dimsdale-Zucker, H R; Flannery, S; Budson, A E; Kensinger, E A

    2017-02-01

    Young and older adults experience benefits in attention and memory for emotional compared to neutral information, but this memory benefit is greatly diminished in Alzheimer's disease (AD). Little is known about whether this impairment arises early or late in the time course between healthy aging and AD. This study compared memory for positive, negative, and neutral items with neutral backgrounds between patients with mild cognitive impairment (MCI) and healthy older adults. We also used a divided attention condition in older adults as a possible model for the deficits observed in MCI patients. Results showed a similar pattern of selective memory for emotional items while forgetting their backgrounds in older adults and MCI patients, but MCI patients had poorer memory overall. Dividing attention during encoding disproportionately reduced memory for backgrounds (versus items) relative to a full attention condition. Participants performing in the lower half on the divided attention task qualitatively and quantitatively mirrored the results in MCI patients. Exploratory analyses comparing lower- and higher-performing MCI patients showed that only higher-performing MCI patients had the characteristic scene memory pattern observed in healthy older adults. Together, these results suggest that the effects of emotion on memory are relatively well preserved for patients with MCI, although emotional memory patterns may start to be altered once memory deficits become more pronounced. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Reading emotions from body movement: a generalized impairment in schizophrenia

    Directory of Open Access Journals (Sweden)

    Anja eVaskinn

    2016-01-01

    Full Text Available Body language reading is a social cognitive process with importance for successful maneuvering of social situations. In this study, we investigated body language reading as assessed with human point-light displays in participants with a diagnosis of schizophrenia (n = 84) compared to healthy control participants (n = 84), aiming to answer three questions: (1) whether persons with a diagnosis of schizophrenia have poorer body language reading abilities than healthy persons; (2) whether some emotions are easier to read from body language than others, and if this is the same for individuals with schizophrenia and healthy individuals, and (3) whether there are sex differences in body language reading in participants with schizophrenia and healthy participants. A fourth research aim concerned associations of body language reading with symptoms and functioning in participants with schizophrenia. Scores on the body language reading measure were first standardized using a separate sample of healthy control participants (n = 101). Further results showed that persons with schizophrenia had impaired body language reading ability compared to healthy persons. A significant effect of emotion indicated that some emotions (happiness, neutral) were easier to recognize and this was so for both individuals with schizophrenia and healthy individuals. There were no sex differences for either diagnostic group. Body language reading ability was not associated with symptoms or functioning. In conclusion, schizophrenia was characterized by a global impairment in body language reading that was present for all emotions and across sex.

  6. Feature Fusion Algorithm for Multimodal Emotion Recognition from Speech and Facial Expression Signal

    Directory of Open Access Journals (Sweden)

    Han Zhiyan

    2016-01-01

    Full Text Available In order to overcome the limitations of single-mode emotion recognition, this paper describes a novel multimodal emotion recognition algorithm that takes the speech signal and the facial expression signal as its research subjects. First, fuse the speech signal feature and the facial expression signal feature, obtain sample sets by sampling with replacement, and then train classifiers with a BP neural network (BPNN). Second, measure the difference between two classifiers by a double error difference selection strategy. Finally, obtain the final recognition result by the majority voting rule. Experiments show the method improves the accuracy of emotion recognition by giving full play to the advantages of decision-level fusion and feature-level fusion, bringing the whole fusion process closer to human emotion recognition, with a recognition rate of 90.4%.
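    The pipeline sketched in this abstract (feature-level fusion by concatenation, bootstrap resampling with replacement, an ensemble of classifiers, and decision-level fusion by majority vote) can be illustrated with a minimal toy implementation. This is an assumption-laden sketch, not the paper's method: the BP neural networks and the double error difference selection strategy are replaced by simple nearest-centroid classifiers, and all feature data are synthetic:

    ```python
    # Hypothetical sketch of feature-level fusion + bootstrap ensemble + majority vote.
    # Nearest-centroid classifiers stand in for the paper's BP neural networks.
    import numpy as np

    rng = np.random.default_rng(0)

    def fuse(speech_feats, face_feats):
        # Feature-level fusion: concatenate the two modality feature vectors.
        return np.hstack([speech_feats, face_feats])

    class NearestCentroid:
        def fit(self, X, y):
            self.classes_ = np.unique(y)
            self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
            return self
        def predict(self, X):
            # Distance from each sample to each class centroid; pick the nearest.
            d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
            return self.classes_[d.argmin(axis=1)]

    def train_ensemble(X, y, n_classifiers=5):
        models = []
        for _ in range(n_classifiers):
            idx = rng.integers(0, len(X), size=len(X))  # bootstrap: sample with replacement
            models.append(NearestCentroid().fit(X[idx], y[idx]))
        return models

    def majority_vote(models, X):
        votes = np.stack([m.predict(X) for m in models])  # (n_models, n_samples)
        # Decision-level fusion: most frequent vote per sample.
        return np.array([np.bincount(col).argmax() for col in votes.T])

    # Toy data: two emotion classes with tiny synthetic speech/face feature vectors.
    speech = rng.normal(size=(40, 3)); face = rng.normal(size=(40, 4))
    y = np.repeat([0, 1], 20)
    X = fuse(speech, face); X[y == 1] += 3.0  # shift class 1 to make it separable
    models = train_ensemble(X, y)
    print((majority_vote(models, X) == y).mean())  # ensemble accuracy on the toy data
    ```

    The design point mirrored here is that the ensemble's diversity comes from the bootstrap resamples, while robustness comes from fusing the individual decisions by majority vote.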

  7. Exploring Cultural Differences in the Recognition of the Self-Conscious Emotions.

    Science.gov (United States)

    Chung, Joanne M; Robins, Richard W

    2015-01-01

    Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little research has been conducted in Asia. Consequently the cross-cultural generalizability of self-conscious emotions has not been firmly established. Additionally, there is no research that examines cultural variability in the recognition of the self-conscious emotions. Cultural values and exposure to Western culture have been identified as contributors to variability in recognition rates for the basic emotions; we sought to examine this for the self-conscious emotions using the University of California, Davis Set of Emotion Expressions (UCDSEE). The present research examined recognition of the self-conscious emotion expressions in South Korean college students and found that recognition rates were very high for pride, low but above chance for shame, and near zero for embarrassment. To examine what might be underlying the recognition rates we found in South Korea, recognition of self-conscious emotions and several cultural values were examined in a U.S. college student sample of European Americans, Asian Americans, and Asian-born individuals. Emotion recognition rates were generally similar between the European Americans and Asian Americans, and higher than emotion recognition rates for Asian-born individuals. These differences were not explained by cultural values in an interpretable manner, suggesting that exposure to Western culture is a more important mediator than values.

  8. Exploring Cultural Differences in the Recognition of the Self-Conscious Emotions.

    Directory of Open Access Journals (Sweden)

    Joanne M Chung

    Full Text Available Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little research has been conducted in Asia. Consequently the cross-cultural generalizability of self-conscious emotions has not been firmly established. Additionally, there is no research that examines cultural variability in the recognition of the self-conscious emotions. Cultural values and exposure to Western culture have been identified as contributors to variability in recognition rates for the basic emotions; we sought to examine this for the self-conscious emotions using the University of California, Davis Set of Emotion Expressions (UCDSEE). The present research examined recognition of the self-conscious emotion expressions in South Korean college students and found that recognition rates were very high for pride, low but above chance for shame, and near zero for embarrassment. To examine what might be underlying the recognition rates we found in South Korea, recognition of self-conscious emotions and several cultural values were examined in a U.S. college student sample of European Americans, Asian Americans, and Asian-born individuals. Emotion recognition rates were generally similar between the European Americans and Asian Americans, and higher than emotion recognition rates for Asian-born individuals. These differences were not explained by cultural values in an interpretable manner, suggesting that exposure to Western culture is a more important mediator than values.

  9. Exploring Cultural Differences in the Recognition of the Self-Conscious Emotions

    Science.gov (United States)

    Chung, Joanne M.; Robins, Richard W.

    2015-01-01

    Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little research has been conducted in Asia. Consequently the cross-cultural generalizability of self-conscious emotions has not been firmly established. Additionally, there is no research that examines cultural variability in the recognition of the self-conscious emotions. Cultural values and exposure to Western culture have been identified as contributors to variability in recognition rates for the basic emotions; we sought to examine this for the self-conscious emotions using the University of California, Davis Set of Emotion Expressions (UCDSEE). The present research examined recognition of the self-conscious emotion expressions in South Korean college students and found that recognition rates were very high for pride, low but above chance for shame, and near zero for embarrassment. To examine what might be underlying the recognition rates we found in South Korea, recognition of self-conscious emotions and several cultural values were examined in a U.S. college student sample of European Americans, Asian Americans, and Asian-born individuals. Emotion recognition rates were generally similar between the European Americans and Asian Americans, and higher than emotion recognition rates for Asian-born individuals. These differences were not explained by cultural values in an interpretable manner, suggesting that exposure to Western culture is a more important mediator than values. PMID:26309215

  10. Face puzzle—two new video-based tasks for measuring explicit and implicit aspects of facial emotion recognition

    Science.gov (United States)

    Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel

    2013-01-01

    Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures were further in favor of the tasks' external validity. Between group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive…

  11. Memristive Computational Architecture of an Echo State Network for Real-Time Speech Emotion Recognition

    Science.gov (United States)

    2015-05-28

    …recognition is simpler and requires less computational resources compared to other inputs such as facial expressions. The Berlin Database of Emotional Speech…
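    The echo state network named in this record's title can be sketched in a few lines. This is a generic, hypothetical illustration (all sizes, data, and the toy task are invented, and the memristive hardware mapping is not modeled): a fixed random reservoir is driven by the input sequence, and only a linear readout is trained, via ridge regression on the collected reservoir states:

    ```python
    # Minimal echo state network sketch: fixed random reservoir + trained linear readout.
    import numpy as np

    rng = np.random.default_rng(1)
    n_in, n_res = 4, 50

    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
    W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # scale spectral radius below 1

    def run_reservoir(inputs):
        """Collect reservoir states for one input sequence: (T, n_in) -> (T, n_res)."""
        x = np.zeros(n_res)
        states = []
        for u in inputs:
            x = np.tanh(W_in @ u + W @ x)  # reservoir state update
            states.append(x)
        return np.array(states)

    def train_readout(states, targets, ridge=1e-2):
        # Ridge regression readout: solve (S^T S + ridge*I) W_out^T = S^T Y.
        S, Y = states, targets
        return np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ Y).T

    # Toy task (stand-in for emotion labels): is the current input's mean positive?
    U = rng.normal(size=(200, n_in))
    Y = (U.mean(axis=1) > 0).astype(float)[:, None]
    S = run_reservoir(U)
    W_out = train_readout(S, Y)
    pred = (S @ W_out.T > 0.5).ravel()
    print((pred == (Y.ravel() > 0.5)).mean())  # readout accuracy on the toy task
    ```

    Because only the readout is trained, such networks suit real-time, resource-constrained settings like the speech emotion recognition application described here.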

  12. Anxiety disorders in adolescence are associated with impaired facial expression recognition to negative valence.

    Science.gov (United States)

    Jarros, Rafaela Behs; Salum, Giovanni Abrahão; Belem da Silva, Cristiano Tschiedel; Toazza, Rudineia; de Abreu Costa, Marianna; Fumagalli de Salles, Jerusa; Manfro, Gisele Gus

    2012-02-01

    The aim of the present study was to test the ability of adolescents with a current anxiety diagnosis to recognize facial affective expressions, compared to those without an anxiety disorder. Forty cases and 27 controls were selected from a larger cross-sectional community sample of adolescents, aged from 10 to 17 years old. Adolescents' facial recognition of six human emotions (sadness, anger, disgust, happiness, surprise and fear) and neutral faces was assessed through a facial labeling test using Ekman's Pictures of Facial Affect (POFA). Adolescents with anxiety disorders had a higher mean number of errors in angry faces as compared to controls: 3.1 (SD=1.13) vs. 2.5 (SD=2.5), OR=1.72 (CI95% 1.02 to 2.89; p=0.040). However, they named neutral faces more accurately than adolescents without an anxiety diagnosis: 15% of cases vs. 37.1% of controls presented at least one error in neutral faces, OR=3.46 (CI95% 1.02 to 11.7; p=0.047). No differences were found considering other human emotions or on the distribution of errors in each emotional face between the groups. Our findings support an anxiety-mediated influence on the recognition of facial expressions in adolescence. This difficulty in recognizing angry faces and greater accuracy in naming neutral faces may lead to misinterpretation of social clues and can explain some aspects of the impairment in social interactions in adolescents with anxiety disorders. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Impairment in emotional modulation of attention and memory in schizophrenia.

    Science.gov (United States)

    Walsh-Messinger, Julie; Ramirez, Paul Michael; Wong, Philip; Antonius, Daniel; Aujero, Nicole; McMahon, Kevin; Opler, Lewis A; Malaspina, Dolores

    2014-08-01

    Emotion plays a critical role in cognition and goal-directed behavior via complex interconnections between the emotional and motivational systems. It has been hypothesized that the impairment in goal-directed behavior widely noted in schizophrenia may result from defects in the interaction between the neural (ventral) emotional system and (rostral) cortical processes. The present study examined the impact of emotion on attention and memory in schizophrenia. Twenty-five individuals with schizophrenia-related psychosis and 25 healthy control subjects were administered a computerized task in which they were asked to search for target images during a Rapid Serial Visual Presentation of pictures. Target stimuli were positive, negative, or neutral images presented at either a 200 ms or a 700 ms lag. Additionally, a visual hedonic task was used to assess differences between the schizophrenia group and controls on ratings of valence and arousal from the picture stimuli. Compared to controls, individuals with schizophrenia detected fewer emotional images under both the 200 ms and 700 ms lag conditions. Multivariate analyses showed that the schizophrenia group also detected fewer positive images under the 700 ms lag condition and fewer negative images under the 200 ms lag condition. Individuals with schizophrenia reported higher pleasantness and unpleasantness ratings than controls in response to neutral stimuli, while controls reported higher arousal ratings for neutral and positive stimuli compared to the schizophrenia group. These results highlight dysfunction in the neural modulation of emotion, attention, and cortical processing in schizophrenia, adding to the growing but mixed body of literature on emotion processing in the disorder. Published by Elsevier B.V.

  14. Emotion Recognition in Animated Compared to Human Stimuli in Adolescents with Autism Spectrum Disorder

    Science.gov (United States)

    Brosnan, Mark; Johnson, Hilary; Grawmeyer, Beate; Chapman, Emma; Benton, Laura

    2015-01-01

    There is equivocal evidence as to whether there is a deficit in recognising emotional expressions in Autism spectrum disorder (ASD). This study compared emotion recognition in ASD in three types of emotion expression media (still image, dynamic image, auditory) across human stimuli (e.g. photo of a human face) and animated stimuli (e.g. cartoon…

  15. Recognition of Facial Expressions and Prosodic Cues with Graded Emotional Intensities in Adults with Asperger Syndrome

    Science.gov (United States)

    Doi, Hirokazu; Fujisawa, Takashi X.; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki

    2013-01-01

    This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group…

  16. Neural correlates of emotional recognition memory in schizophrenia: effects of valence and arousal.

    Science.gov (United States)

    Lakis, Nadia; Jiménez, José A; Mancini-Marïe, Adham; Stip, Emmanuel; Lavoie, Marc E; Mendrek, Adrianna

    2011-12-30

    Schizophrenia patients are often impaired in their memory for emotional events compared with healthy subjects. Investigations of the neural correlates of emotional memory in schizophrenia patients are scarce in the literature. The present study aimed to compare cerebral activations in schizophrenia patients and healthy controls during memory retrieval of emotional images that varied in both valence and arousal. In a study with functional magnetic resonance imaging, 37 schizophrenia patients were compared with 37 healthy participants while performing a yes/no recognition paradigm with positive, negative (differing in arousal intensity) and neutral images. Schizophrenia patients performed worse than healthy controls in all experimental conditions. They showed less cerebral activation in limbic and prefrontal regions than controls during retrieval of negatively valenced stimuli, but had a similar pattern of brain activation compared with controls during retrieval of positively valenced stimuli (particularly in the high arousal condition) in the cerebellum, temporal lobe and prefrontal cortex. Both groups demonstrated increased brain activations in the high relative to low arousing conditions. Our results suggest atypical brain function during retrieval of negative pictures, but intact functional circuitry of positive affect during episodic memory retrieval in schizophrenia patients. The arousal data revealed that schizophrenia patients closely resemble the control group at both the behavioral and neurofunctional level. 2011 Elsevier Ireland Ltd. All rights reserved.

  17. [Emotion recognition rehabilitation combined with cognitive stimulation for people with Alzheimer's disease. Efficacy for cognition and functional aspects].

    Science.gov (United States)

    Garcia-Casal, J A; Goni-Imizcoz, M; Perea-Bartolome, M V; Garcia-Moja, C; Calvo-Simal, S; Cardelle-Garcia, F; Franco-Martin, M

    2017-08-01

    The ability to recognize facial emotional expression is essential for social interaction and adapting to the environment. Emotion recognition is impaired in people with Alzheimer's disease (AD), so rehabilitation of these skills has the potential to elicit significant benefits. This study aimed to assess the efficacy of a combined treatment of rehabilitation of emotion recognition (RER) and cognitive stimulation (CS) for people with AD, given its potential implications for more effective psychosocial interventions. 36 patients were assigned to one of three experimental conditions: an experimental group (EG) that received 20 sessions of RER and 20 sessions of CS; a control group (CG) that received 40 sessions of CS, and a treatment as usual group (TAU). 32 patients completed the treatment (77.53 ± 5.43 years). Significant differences were found in MMSE30 (F = 5.10; p = 0.013), MMSE35 (F = 4.16; p = 0.026), affect recognition (Z = -2.81; p = 0.005) and basic activities of daily living (Z = -2.27; p = 0.018) favouring the efficacy of the combined treatment. The TAU group showed a decline in depression (Z = -1.99; p = 0.048), apathy (Z = -2.30; p = 0.022) and anosognosia (Z = -2.19; p = 0.028). The combined treatment of RER + CS was more effective than TAU and CS alone for the treatment of patients with AD. This is the first study about the rehabilitation of affect recognition in AD.

  18. Four-Channel Biosignal Analysis and Feature Extraction for Automatic Emotion Recognition

    Science.gov (United States)

    Kim, Jonghwa; André, Elisabeth

    This paper investigates the potential of physiological signals as a reliable channel for automatic recognition of a user's emotional state. Compared to audio-visual emotion channels such as facial expression or speech, little attention has so far been paid to physiological signals for emotion recognition. All essential stages of an automatic recognition system using biosignals are discussed, from recording a physiological dataset up to feature-based multiclass classification. Four-channel biosensors are used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra and multiscale entropy, is proposed in order to search for the best emotion-relevant features and to correlate them with emotional states. The best extracted features are specified in detail and their effectiveness is demonstrated by emotion recognition results.
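
    To make the abstract concrete, the sketch below computes a few time-domain statistics of the kind typically extracted per biosignal channel. This is an illustrative example, not the authors' actual feature set; the function name and feature choices are hypothetical.

    ```python
    from statistics import mean, stdev

    def time_domain_features(signal):
        """Basic time-domain statistics often used as biosignal features."""
        # Mean absolute first difference is a simple proxy for signal variability.
        diffs = [abs(b - a) for a, b in zip(signal, signal[1:])]
        return {
            "mean": mean(signal),
            "std": stdev(signal),
            "mean_abs_diff": mean(diffs),
            "range": max(signal) - min(signal),
        }

    # e.g. a short skin-conductivity trace (made-up values)
    feats = time_domain_features([0.1, 0.4, 0.35, 0.8, 0.6, 0.2])
    ```

    In a full system, vectors like this from several channels would be concatenated and fed to a multiclass classifier.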

  19. Examining speed of processing of facial emotion recognition in individuals at ultra-high risk for psychosis

    DEFF Research Database (Denmark)

    Glenthøj, Louise Birkedal; Fagerlund, Birgitte; Bak, Nikolaj

    2018-01-01

    Emotion recognition is an aspect of social cognition that may be a key predictor of functioning and transition to psychosis in individuals at ultra-high risk (UHR) for psychosis (Allott et al., 2014). UHR individuals exhibit deficits in accurately identifying facial emotions (van Donkersgoed et al., 2015), but other potential anomalies in facial emotion recognition are largely unexplored. This study aimed to extend current knowledge on emotion recognition deficits in UHR individuals by examining: 1) whether UHR individuals display significantly slower facial emotion recognition than healthy controls, 2) whether an association between emotion recognition accuracy and emotion recognition latency is present in UHR, and 3) the relationships between emotion recognition accuracy, neurocognition and psychopathology in UHR.

  20. [Recognition of facial emotions and theory of mind in schizophrenia: could the theory of mind deficit be due to the non-recognition of facial emotions?].

    Science.gov (United States)

    Besche-Richard, C; Bourrin-Tisseron, A; Olivier, M; Cuervo-Lombard, C-V; Limosin, F

    2012-06-01

    Deficits in the recognition of facial emotions and the attribution of mental states are now well documented in schizophrenic patients. However, the link between these two complex cognitive functions remains unclear, especially in schizophrenia. In this study, we attempted to test the link between the recognition of facial emotions and mentalization capacities, notably the attribution of beliefs, in healthy and schizophrenic participants. We hypothesized that performance in the recognition of facial emotions, rather than working memory or executive functioning, was the best predictor of the capacity to attribute a belief. Twenty schizophrenic participants according to DSM-IV-TR (mean age: 35.9 years, S.D. 9.07; mean education level: 11.15 years, S.D. 2.58), clinically stabilized and receiving neuroleptic or antipsychotic medication, participated in the study. They were matched on age (mean age: 36.3 years, S.D. 10.9) and educational level (mean educational level: 12.10, S.D. 2.25) with 30 healthy participants. All the participants were evaluated with a pool of tasks testing the recognition of facial emotions (the faces of Baron-Cohen), the attribution of beliefs (two first-order and two second-order stories), working memory (the digit span of the WAIS-III and the Corsi test) and executive functioning (Trail Making Test A and B, Wisconsin Card Sorting Test brief version). Comparing schizophrenic and healthy participants, our results confirmed a difference between performance in the recognition of facial emotions and performance in the attribution of beliefs. The result of the simple linear regression showed that the recognition of facial emotions, rather than working memory and executive functioning performance, was the best predictor of performance on the theory of mind stories. Our results confirmed, in a sample of schizophrenic patients, the deficits in the recognition of facial emotions and in the

  1. Mapping the impairment in decoding static facial expressions of emotion in prosopagnosia.

    Science.gov (United States)

    Fiset, Daniel; Blais, Caroline; Royer, Jessica; Richoz, Anne-Raphaëlle; Dugas, Gabrielle; Caldara, Roberto

    2017-08-01

    Acquired prosopagnosia is characterized by a deficit in face recognition due to diverse brain lesions, but interestingly most prosopagnosic patients suffering from posterior lesions use the mouth instead of the eyes for face identification. Whether this bias is present for the recognition of facial expressions of emotion has not yet been addressed. We tested PS, a pure case of acquired prosopagnosia with bilateral occipitotemporal lesions anatomically sparing the regions dedicated for facial expression recognition. PS used mostly the mouth to recognize facial expressions even when the eye area was the most diagnostic. Moreover, PS directed most of her fixations towards the mouth. Her impairment was still largely present when she was instructed to look at the eyes, or when she was forced to look at them. Control participants showed a performance comparable to PS when only the lower part of the face was available. These observations suggest that the deficits observed in PS with static images are not solely attentional, but are rooted at the level of facial information use. This study corroborates neuroimaging findings suggesting that the Occipital Face Area might play a critical role in extracting facial features that are integrated for both face identification and facial expression recognition in static images. © The Author (2017). Published by Oxford University Press.

  2. A Novel DBN Feature Fusion Model for Cross-Corpus Speech Emotion Recognition

    Directory of Open Access Journals (Sweden)

    Zou Cairong

    2016-01-01

    Full Text Available The fusion of features from separate sources is a current technical difficulty in cross-corpus speech emotion recognition. The purpose of this paper is to use the emotional information hidden in the speech spectrum diagram (spectrogram) as image features, based on Deep Belief Nets (DBN) in Deep Learning, and then to fuse them with traditional emotion features. First, based on spectrogram analysis with the STB/Itti model, new spectrogram features are extracted from the color, the brightness and the orientation, respectively; then two alternative DBN models fuse the traditional and the spectrogram features, which increases the scale of the feature subset and its ability to characterize emotion. In experiments on the ABC database and Chinese corpora, the new feature subset improved cross-corpus recognition results by 8.8% compared with traditional speech emotion features. The proposed method provides a new idea for feature fusion in emotion recognition.
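
    For readers unfamiliar with spectrogram features, the sketch below shows the underlying object: a short-time magnitude spectrum computed frame by frame. It uses a naive pure-Python DFT for illustration only; real systems use an FFT with windowing, and the STB/Itti color/brightness/orientation analysis described in the abstract is a separate step on top of this.

    ```python
    import cmath

    def stft_magnitudes(signal, frame_len=8, hop=4):
        """Naive short-time DFT: one magnitude spectrum per frame (spectrogram columns)."""
        frames = [signal[i:i + frame_len]
                  for i in range(0, len(signal) - frame_len + 1, hop)]
        spec = []
        for frame in frames:
            mags = []
            for k in range(frame_len // 2 + 1):  # keep non-negative frequencies only
                s = sum(x * cmath.exp(-2j * cmath.pi * k * n / frame_len)
                        for n, x in enumerate(frame))
                mags.append(abs(s))
            spec.append(mags)
        return spec

    # A 4-sample-period square-ish wave concentrates energy at bin k = 2 (fs/4).
    spec = stft_magnitudes([0, 1, 0, -1] * 8)
    ```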

  3. A Pilot Study Examining a Computer-Based Intervention to Improve Recognition and Understanding of Emotions in Young Children with Communication and Social Deficits.

    Science.gov (United States)

    Romero, Neri L

    2017-06-01

    A common social impairment in individuals with ASD is difficulty interpreting and/or predicting the emotions of others. To date, several interventions targeting emotion recognition and understanding have been utilized by both researchers and practitioners. The results suggest that teaching emotion recognition is possible, but that the gains do not generalize to non-instructional contexts. This study sought to replicate earlier findings of a positive impact of teaching emotion recognition using a computer-based intervention, and to extend them by testing for generalization to live models in the classroom setting. Two boys and one girl, four to eight years in age, educated in self-contained classrooms for students with communication and social skills deficits, participated in this study. A multiple probe across participants design was utilized. Measures of emotion recognition and understanding were assessed at baseline, during intervention, and one month post-intervention to determine maintenance effects. Social validity was assessed through parent and teacher questionnaires. All participants showed improvements in measures assessing their recognition of emotions in faces, generalized knowledge to live models, and maintained gains one month post-intervention. These preliminary results are encouraging and should be utilized to inform a group design, in order to test efficacy with a larger population. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Single prolonged stress impairs social and object novelty recognition in rats.

    Science.gov (United States)

    Eagle, Andrew L; Fitzpatrick, Chris J; Perrine, Shane A

    2013-11-01

    Posttraumatic stress disorder (PTSD) results from exposure to a traumatic event and manifests as re-experiencing, arousal, avoidance, and negative cognition/mood symptoms. Avoidant symptoms, as well as the newly defined negative cognitions/mood, are a serious complication, leading to diminished interest in once important or positive activities, such as social interaction; however, the basis of these symptoms remains poorly understood. PTSD patients also exhibit impaired object and social recognition, which may underlie the avoidance and symptoms of negative cognition, such as social estrangement or diminished interest in activities. Previous studies have demonstrated that single prolonged stress (SPS) models PTSD phenotypes, including impairments in learning and memory. Therefore, it was hypothesized that SPS would impair social and object recognition memory. Male Sprague Dawley rats were exposed to SPS and then tested in the social choice test (SCT) or novel object recognition test (NOR). These tests measure recognition of novelty over familiarity, a natural preference of rodents. Results show that SPS impaired preference for both social and object novelty. In addition, the SPS impairment in social recognition may be caused by impaired behavioral flexibility, or an inability to shift behavior during the SCT. These results demonstrate that traumatic stress can impair social and object recognition memory, which may underlie certain avoidant symptoms or negative cognition in PTSD and be related to impaired behavioral flexibility. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Subject-independent emotion recognition based on physiological signals: a three-stage decision method.

    Science.gov (United States)

    Chen, Jing; Hu, Bin; Wang, Yue; Moore, Philip; Dai, Yongqiang; Feng, Lei; Ding, Zhijie

    2017-12-20

    Collaboration between humans and computers has become pervasive and ubiquitous; however, current computer systems are limited in that they fail to address the emotional component. An accurate understanding of human emotions is necessary for these computers to trigger proper feedback. Among multiple emotional channels, physiological signals are synchronous with emotional responses; therefore, analyzing physiological changes is a recognized way to estimate human emotions. In this paper, a three-stage decision method is proposed to recognize four emotions based on physiological signals in the multi-subject context. Emotion detection is achieved by using a stage-divided strategy in which each stage deals with a fine-grained goal. The decision method consists of three stages. During the training process, the initial stage transforms mixed training subjects into separate groups, thus eliminating the effect of individual differences. The second stage categorizes the four emotions into two emotion pools in order to reduce recognition complexity. The third stage trains a classifier based on the emotions in each emotion pool. During the testing process, a test trial is initially classified into a group, followed by classification into an emotion pool in the second stage. An emotion is assigned to the test trial in the final stage. In this paper we consider two different ways of allocating the four emotions into two emotion pools. A comparative analysis is also carried out between the proposal and other methods. An average recognition accuracy of 77.57% was achieved on the recognition of four emotions, with the best accuracy of 86.67% in recognizing the positive and excited emotion. Using differing ways of allocating the four emotions into two emotion pools, we found there is a difference in the effectiveness of a classifier on learning each emotion. When compared to other methods, the proposed method demonstrates a significant improvement in recognizing four emotions in the
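
    The staged strategy described above can be sketched as follows. The stage rules here are stand-ins: the thresholds and feature names (`baseline_hr`, `scl`, `hr_delta`) are hypothetical, whereas the actual method trains a classifier per stage; the point is only the coarse-to-fine flow of group, then pool, then emotion.

    ```python
    # Hypothetical three-stage decision sketch; each function stands in for a
    # trained classifier in the corresponding stage.
    def assign_group(trial):
        """Stage 1: match the trial to a subject group (reduces individual differences)."""
        return "group_A" if trial["baseline_hr"] < 70 else "group_B"

    def assign_pool(trial, group):
        """Stage 2: coarse split of the four emotions into two pools."""
        return "high_arousal" if trial["scl"] > 0.5 else "low_arousal"

    def classify_emotion(trial, pool):
        """Stage 3: pool-specific classifier picks the final emotion label."""
        if pool == "high_arousal":
            return "joy" if trial["hr_delta"] > 0 else "fear"
        return "sadness" if trial["hr_delta"] < 0 else "contentment"

    def three_stage_decision(trial):
        group = assign_group(trial)
        pool = assign_pool(trial, group)
        return group, pool, classify_emotion(trial, pool)

    result = three_stage_decision({"baseline_hr": 65, "scl": 0.8, "hr_delta": 2.0})
    ```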

  6. Below and beyond the recognition of emotional facial expressions in alcohol dependence: from basic perception to social cognition

    Directory of Open Access Journals (Sweden)

    D’Hondt F

    2014-11-01

    Full Text Available Fabien D’Hondt,1 Salvatore Campanella,2 Charles Kornreich,2 Pierre Philippot,1 Pierre Maurage1 1Laboratory for Experimental Psychopathology, Psychological Sciences Research Institute, Université Catholique de Louvain, Louvain-la-Neuve, Belgium; 2Laboratory of Medical Psychology and Addictology, ULB Neuroscience Institute (UNI), Université Libre de Bruxelles, Brussels, Belgium Abstract: Studies that have carried out experimental evaluation of emotional skills in alcohol-dependence have, up to now, mainly focused on the exploration of emotional facial expression (EFE) decoding. In the present paper, we provide some complements to the recent systematic literature review published by Donadon and de Lima Osório on this crucial topic. We also suggest research avenues that should, in our opinion, be considered in the coming years. More precisely, we propose, first, that a battery integrating a set of emotional tasks relating to different processes should be developed to better systematize EFE decoding measures in alcohol-dependence. Second, we propose to go below EFE recognition deficits and to seek the roots of those alterations, particularly by investigating the putative role played by early visual processing and vision–emotion interactions in the emotional impairment observed in alcohol-dependence. Third, we insist on the need to go beyond EFE recognition deficits by suggesting that they constitute only a part of wider emotional deficits in alcohol-dependence. Importantly, since the efficient decoding of emotions is a crucial ability for the development and maintenance of satisfactory interpersonal relationships, we suggest that disruption of this ability in alcohol-dependent individuals may have adverse consequences for their social integration. One way to achieve this research agenda would be to develop the field of affective and social neuroscience of alcohol-dependence, which could ultimately lead to major advances at both theoretical

  7. More Pronounced Deficits in Facial Emotion Recognition for Schizophrenia than Bipolar Disorder

    Science.gov (United States)

    Goghari, Vina M; Sponheim, Scott R

    2012-01-01

    Schizophrenia and bipolar disorder are typically separated in diagnostic systems. Behavioural, cognitive, and brain abnormalities associated with each disorder nonetheless overlap. We evaluated the diagnostic specificity of facial emotion recognition deficits in schizophrenia and bipolar disorder to determine whether select aspects of emotion recognition differed for the two disorders. The investigation used an experimental task that included the same facial images in an emotion recognition condition and an age recognition condition (to control for processes associated with general face recognition) in 27 schizophrenia patients, 16 bipolar I patients, and 30 controls. Schizophrenia and bipolar patients exhibited both shared and distinct aspects of facial emotion recognition deficits. Schizophrenia patients had deficits in recognizing angry facial expressions compared to healthy controls and bipolar patients. Compared to control participants, both schizophrenia and bipolar patients were more likely to mislabel facial expressions of anger as fear. Given that schizophrenia patients exhibited a deficit in emotion recognition for angry faces, which did not appear due to generalized perceptual and cognitive dysfunction, improving recognition of threat-related expression may be an important intervention target to improve social functioning in schizophrenia. PMID:23218816

  8. Effects of facial emotion recognition remediation on visual scanning of novel face stimuli.

    Science.gov (United States)

    Marsh, Pamela J; Luckett, Gemma; Russell, Tamara; Coltheart, Max; Green, Melissa J

    2012-11-01

    Previous research shows that emotion recognition in schizophrenia can be improved with targeted remediation that draws attention to important facial features (eyes, nose, mouth). Moreover, the effects of training have been shown to last for up to one month after training. The aim of this study was to investigate whether improved emotion recognition of novel faces is associated with concomitant changes in visual scanning of these same novel facial expressions. Thirty-nine participants with schizophrenia received emotion recognition training using Ekman's Micro-Expression Training Tool (METT), with emotion recognition and visual scanpath (VSP) recordings to face stimuli collected simultaneously. Baseline ratings of interpersonal and cognitive functioning were also collected from all participants. Post-METT training, participants showed changes in foveal attention to the features of facial expressions of emotion not used in METT training, which were generally consistent with the information about important features from the METT. In particular, there were changes in how participants looked at the features of facial expressions of emotion surprise, disgust, fear, happiness, and neutral, demonstrating that improved emotion recognition is paralleled by changes in the way participants with schizophrenia viewed novel facial expressions of emotion. However, there were overall decreases in foveal attention to sad and neutral faces that indicate more intensive instruction might be needed for these faces during training. Most importantly, the evidence shows that participant gender may affect training outcomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Recognition of facial expressions and prosodic cues with graded emotional intensities in adults with Asperger syndrome.

    Science.gov (United States)

    Doi, Hirokazu; Fujisawa, Takashi X; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki

    2013-09-01

    This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group difference in facial expression recognition was prominent for stimuli with low or intermediate emotional intensities. In contrast to this, the individuals with Asperger syndrome exhibited lower recognition accuracy than typically-developed controls mainly for emotional prosody with high emotional intensity. In facial expression recognition, Asperger and control groups showed an inversion effect for all categories. The magnitude of this effect was less in the Asperger group for angry and sad expressions, presumably attributable to reduced recruitment of the configural mode of face processing. The individuals with Asperger syndrome outperformed the control participants in recognizing inverted sad expressions, indicating enhanced processing of local facial information representing sad emotion. These results suggest that the adults with Asperger syndrome rely on modality-specific strategies in emotion recognition from facial expression and prosodic information.

  10. Memory bias for negative emotional words in recognition memory is driven by effects of category membership.

    Science.gov (United States)

    White, Corey N; Kapucu, Aycan; Bruno, Davide; Rotello, Caren M; Ratcliff, Roger

    2014-01-01

    Recognition memory studies often find that emotional items are more likely than neutral items to be labelled as studied. Previous work suggests this bias is driven by increased memory strength/familiarity for emotional items. We explored strength and bias interpretations of this effect with the conjecture that emotional stimuli might seem more familiar because they share features with studied items from the same category. Categorical effects were manipulated in a recognition task by presenting lists with a small, medium or large proportion of emotional words. The liberal memory bias for emotional words was only observed when a medium or large proportion of categorised words were presented in the lists. Similar, though weaker, effects were observed with categorised words that were not emotional (animal names). These results suggest that liberal memory bias for emotional items may be largely driven by effects of category membership.
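
    The "liberal bias" at issue here is usually quantified with signal detection theory. As a brief aside (not taken from this study's analysis), sensitivity d' and criterion c can be computed from hit and false-alarm rates, with c < 0 indicating a liberal tendency to call items "studied". The rates below are made-up illustrative values.

    ```python
    from statistics import NormalDist

    def sdt_measures(hit_rate, fa_rate):
        """d-prime (sensitivity) and criterion c; c < 0 indicates a liberal bias."""
        z = NormalDist().inv_cdf  # inverse of the standard normal CDF
        d_prime = z(hit_rate) - z(fa_rate)
        criterion = -0.5 * (z(hit_rate) + z(fa_rate))
        return d_prime, criterion

    # Emotional items: a higher false-alarm rate shifts the criterion liberal.
    d_emo, c_emo = sdt_measures(0.80, 0.40)
    d_neu, c_neu = sdt_measures(0.75, 0.25)
    ```

    A pure strength account would instead predict a higher d' for emotional items with an unchanged criterion, which is why separating the two measures matters for the argument above.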

  11. Recognition memory for low- and high-frequency-filtered emotional faces: Low spatial frequencies drive emotional memory enhancement, whereas high spatial frequencies drive the emotion-induced recognition bias.

    Science.gov (United States)

    Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk

    2017-07-01

    This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement (better long-term memory for emotional than for neutral stimuli) and the emotion-induced recognition bias (a more liberal response criterion for emotional than for neutral stimuli). Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role in the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus features account. The double dissociation in the results favors the latter account, that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.
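
    A rough intuition for the LSF/HSF manipulation (the study itself used proper spatial-frequency filters whose cutoffs are not reproduced here): a local mean acts as a crude low-pass filter, and the residual carries the high-frequency detail. The toy "image" below is a hypothetical grid of pixel intensities.

    ```python
    def split_spatial_frequencies(img, radius=1):
        """Crude LSF/HSF split: local mean = low-pass, residual = high-pass."""
        h, w = len(img), len(img[0])
        lsf = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                # Average over a (2*radius+1)-sized neighborhood, clipped at edges.
                vals = [img[yy][xx]
                        for yy in range(max(0, y - radius), min(h, y + radius + 1))
                        for xx in range(max(0, x - radius), min(w, x + radius + 1))]
                lsf[y][x] = sum(vals) / len(vals)
        # The high-frequency component is whatever the blur removed.
        hsf = [[img[y][x] - lsf[y][x] for x in range(w)] for y in range(h)]
        return lsf, hsf

    img = [[0, 0, 9, 0], [0, 0, 9, 0], [0, 0, 9, 0]]  # a sharp vertical edge
    lsf, hsf = split_spatial_frequencies(img)
    ```

    Note that LSF plus HSF reconstructs the original image exactly, mirroring how the filtered face stimuli jointly contain all of the original information.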

  12. Investigating Patterns for Self-Induced Emotion Recognition from EEG Signals.

    Science.gov (United States)

    Zhuang, Ning; Zeng, Ying; Yang, Kai; Zhang, Chi; Tong, Li; Yan, Bin

    2018-03-12

    Most current approaches to emotion recognition are based on neural signals elicited by affective materials such as images, sounds and videos. However, the application of neural patterns in the recognition of self-induced emotions remains uninvestigated. In this study we inferred the patterns and neural signatures of self-induced emotions from electroencephalogram (EEG) signals. The EEG signals of 30 participants were recorded while they watched 18 Chinese movie clips which were intended to elicit six discrete emotions, including joy, neutrality, sadness, disgust, anger and fear. After watching each movie clip the participants were asked to self-induce emotions by recalling a specific scene from each movie. We analyzed the important features, electrode distribution and average neural patterns of different self-induced emotions. Results demonstrated that features related to high-frequency rhythm of EEG signals from electrodes distributed in the bilateral temporal, prefrontal and occipital lobes have outstanding performance in the discrimination of emotions. Moreover, the six discrete categories of self-induced emotion exhibit specific neural patterns and brain topography distributions. We achieved an average accuracy of 87.36% in the discrimination of positive from negative self-induced emotions and 54.52% in the classification of emotions into six discrete categories. Our research will help promote the development of comprehensive endogenous emotion recognition methods.
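
    The "high-frequency rhythm" features mentioned above are typically band powers. As an illustrative sketch (a naive DFT, not the study's pipeline; the sampling rate, band limits and test signal are arbitrary choices here), power in a frequency band can be computed like this:

    ```python
    import math

    def band_power(signal, fs, f_lo, f_hi):
        """Power in the [f_lo, f_hi] Hz band via a naive DFT over integer bins."""
        n = len(signal)
        power = 0.0
        for k in range(1, n // 2 + 1):
            freq = k * fs / n  # frequency of DFT bin k
            if f_lo <= freq <= f_hi:
                re = sum(x * math.cos(2 * math.pi * k * t / n) for t, x in enumerate(signal))
                im = sum(x * math.sin(2 * math.pi * k * t / n) for t, x in enumerate(signal))
                power += (re * re + im * im) / n
        return power

    fs = 128
    sig = [math.sin(2 * math.pi * 40 * t / fs) for t in range(fs)]  # 40 Hz "gamma" tone
    gamma = band_power(sig, fs, 30, 45)  # captures the tone
    alpha = band_power(sig, fs, 8, 12)   # essentially zero for this signal
    ```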

  13. Investigating Patterns for Self-Induced Emotion Recognition from EEG Signals

    Science.gov (United States)

    Zeng, Ying; Yang, Kai; Tong, Li; Yan, Bin

    2018-01-01

    Most current approaches to emotion recognition are based on neural signals elicited by affective materials such as images, sounds and videos. However, the application of neural patterns in the recognition of self-induced emotions remains uninvestigated. In this study we inferred the patterns and neural signatures of self-induced emotions from electroencephalogram (EEG) signals. The EEG signals of 30 participants were recorded while they watched 18 Chinese movie clips which were intended to elicit six discrete emotions, including joy, neutrality, sadness, disgust, anger and fear. After watching each movie clip the participants were asked to self-induce emotions by recalling a specific scene from each movie. We analyzed the important features, electrode distribution and average neural patterns of different self-induced emotions. Results demonstrated that features related to high-frequency rhythm of EEG signals from electrodes distributed in the bilateral temporal, prefrontal and occipital lobes have outstanding performance in the discrimination of emotions. Moreover, the six discrete categories of self-induced emotion exhibit specific neural patterns and brain topography distributions. We achieved an average accuracy of 87.36% in the discrimination of positive from negative self-induced emotions and 54.52% in the classification of emotions into six discrete categories. Our research will help promote the development of comprehensive endogenous emotion recognition methods. PMID:29534515

  14. Identifying at-risk states beyond positive symptoms: a brief task assessing how neurocognitive impairments impact on misrepresentation of the social world through blunted emotional appraisal

    OpenAIRE

    Galdos, Mariana; Simons, Claudia J.P.; Wichers, Marieke; Fernandez-Rivas, Aranzazu; Martinez-Azumendi, Oscar; Lataster, Tineke; Amer, Guillermo; Myin-Germeys, Inez; Gonzalez-Torres, Miguel Angel; Os, Jim van

    2011-01-01

    OBJECTIVE: Neurocognitive impairments observed in psychotic disorder may impact on emotion recognition and theory of mind, resulting in altered understanding of the social world. Early intervention efforts would be served by further elucidation of this mechanism. METHOD: Patients with a psychotic disorder (n=30) and a reference control group (n=310) were asked to offer emotional appraisals of images of social situations (EASS task). The degree to which case-control differences in appraisals w...

  15. Emotion recognition based on multiple order features using fractional Fourier transform

    Science.gov (United States)

    Ren, Bo; Liu, Deyin; Qi, Lin

    2017-07-01

    In order to address the insufficiency of recent algorithms based on the Two-Dimensional Fractional Fourier Transform (2D-FrFT), this paper proposes a multiple-order-feature-based method for emotion recognition. Most existing methods utilize features of a single order or a couple of orders of the 2D-FrFT. However, different orders of the 2D-FrFT contribute differently to feature extraction for emotion recognition, and combining these features can enhance the performance of an emotion recognition system. The proposed approach obtains numerous features extracted at different orders of the 2D-FrFT in the directions of the x-axis and y-axis, and uses their statistical magnitudes as the final feature vectors for recognition. A Support Vector Machine (SVM) is utilized for classification, and the RML Emotion database and the Cohn-Kanade (CK) database are used for the experiments. The experimental results demonstrate the effectiveness of the proposed method.

  16. Facial Emotions Recognition using Gabor Transform and Facial Animation Parameters with Neural Networks

    Science.gov (United States)

    Harit, Aditya; Joshi, J. C., Col; Gupta, K. K.

    2018-03-01

The paper proposes an automatic facial emotion recognition algorithm that comprises two main components: feature extraction and expression recognition. The algorithm uses a Gabor filter bank on fiducial points to find the facial expression features. The resulting magnitudes of the Gabor transforms, along with 14 chosen FAPs (Facial Animation Parameters), compose the feature space. There are two stages: a training phase and a recognition phase. First, for the 6 emotions considered, the system classifies all training expressions into 6 classes (one per emotion) during the training stage. In the recognition phase, it locates the fiducial points in a face image, applies the Gabor bank at those points, and feeds the resulting features to the trained neural architecture.
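
    A minimal sketch of the feature-extraction half of such a system (hypothetical parameters; the paper's filter-bank sizes, fiducial-point detector, and the 14 FAPs are not reproduced here):

    ```python
    import numpy as np

    def gabor_kernel(ksize, theta, lam, sigma, gamma=0.5):
        # Complex Gabor kernel: a plane wave of wavelength `lam` at
        # orientation `theta`, under an elliptical Gaussian envelope.
        half = ksize // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        yr = -x * np.sin(theta) + y * np.cos(theta)
        envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
        carrier = np.exp(1j * 2 * np.pi * xr / lam)
        return envelope * carrier

    def gabor_features(img, points, thetas, lams, sigma=3.0, ksize=15):
        # Magnitude of the Gabor response at each fiducial point,
        # for every orientation/wavelength in the bank.
        half = ksize // 2
        padded = np.pad(img, half, mode='edge')
        feats = []
        for (r, c) in points:
            patch = padded[r:r + ksize, c:c + ksize]  # centred at (r, c)
            for th in thetas:
                for lam in lams:
                    k = gabor_kernel(ksize, th, lam, sigma)
                    feats.append(abs(np.sum(patch * k)))
        return np.array(feats)

    rng = np.random.default_rng(1)
    face = rng.random((64, 64))                  # stand-in for a face image
    fiducial = [(20, 22), (20, 42), (40, 32)]    # e.g. eyes, mouth (assumed)
    fv = gabor_features(face, fiducial,
                        thetas=np.linspace(0, np.pi, 4, endpoint=False),
                        lams=(4.0, 8.0))
    # fv (3 points x 4 orientations x 2 wavelengths = 24 values) would be
    # concatenated with the 14 FAP values and fed to the neural network.
    ```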

  17. Comparison of Emotion Recognition and Mind Reading Abilities in Opium Abusers and Healthy Matched Individuals

    Directory of Open Access Journals (Sweden)

    Vahid Nejati

    2012-05-01

Full Text Available Introduction: The purpose of this study was to compare emotion recognition and mind reading in opium abusers and healthy individuals. Method: In this causal-comparative study, using a non-probability sampling method, 30 opium abusers were compared with 30 healthy individuals matched for sex and education. Neurocognitive tests of reading the mind in the eyes and of emotion recognition from faces were used for evaluation, and the independent t-test was used for analysis. Findings: The results showed that opium abusers had significantly lower mind reading abilities than healthy matched individuals. Opium abusers also showed significantly lower performance in recognizing the emotional expressions of happy, sad and angry faces. Conclusion: Given the weak mind reading and emotion recognition performance of addicts, it is advised that social cognition evaluation be considered in the assessment of drug abusers. Future interventional studies could propose social cognition rehabilitation programs for addicts.

  18. Transcutaneous vagus nerve stimulation (tVNS) enhances recognition of emotions in faces but not bodies.

    Science.gov (United States)

    Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S

    2018-02-01

    The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Multi-Stage Recognition of Speech Emotion Using Sequential Forward Feature Selection

    Directory of Open Access Journals (Sweden)

    Liogienė Tatjana

    2016-07-01

Full Text Available Intensive research on speech emotion recognition has introduced a huge collection of speech emotion features. Large feature sets complicate the speech emotion recognition task. Alongside various feature selection and transformation techniques for one-stage classification, multiple classifier systems have been proposed. The main idea of multiple classifiers is to arrange the emotion classification process in stages. Besides parallel and serial arrangements, the hierarchical arrangement of multi-stage classification is the most widely used for speech emotion recognition. In this paper, we present a sequential-forward-feature-selection-based multi-stage classification scheme. The Sequential Forward Selection (SFS) and Sequential Floating Forward Selection (SFFS) techniques were employed at every stage of the multi-stage classification scheme. Experimental testing of the proposed scheme was performed using the German and Lithuanian emotional speech datasets. Sequential-feature-selection-based multi-stage classification outperformed the single-stage scheme by 12–42 % for different emotion sets. The multi-stage scheme also showed higher robustness to growth of the emotion set: its decrease in recognition rate as the emotion set grew was 10–20 % smaller than in the single-stage case. Differences between SFS and SFFS for feature selection were negligible.
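
    The greedy step underlying SFS can be sketched as follows. This is a single-stage illustration with a nearest-centroid scorer standing in for the paper's classifiers; SFFS, which adds conditional backward-removal steps, is omitted for brevity:

    ```python
    import numpy as np

    def nearest_centroid_accuracy(X, y):
        # Score a feature subset by the training accuracy of a
        # nearest-centroid classifier (a simple stand-in scorer).
        classes = np.unique(y)
        centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        pred = classes[d.argmin(axis=1)]
        return (pred == y).mean()

    def sequential_forward_selection(X, y, k):
        # Greedy SFS: start empty, repeatedly add the single feature
        # that most improves the score of the current subset.
        selected, remaining = [], list(range(X.shape[1]))
        while len(selected) < k:
            scores = [(nearest_centroid_accuracy(X[:, selected + [f]], y), f)
                      for f in remaining]
            best_score, best_f = max(scores)
            selected.append(best_f)
            remaining.remove(best_f)
        return selected

    # Toy data: feature 0 separates the two "emotions", features 1-4 are noise.
    rng = np.random.default_rng(0)
    y = np.repeat([0, 1], 50)
    X = rng.normal(size=(100, 5))
    X[:, 0] += 5 * y
    chosen = sequential_forward_selection(X, y, k=2)
    # chosen[0] == 0: the informative feature is picked first.
    ```

    In the multi-stage scheme described above, a selection like this would be rerun at each node of the classification hierarchy, on the emotion subset that node discriminates.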

  20. A gesture-controlled Serious Game for teaching emotion recognition skills to preschoolers with autism

    OpenAIRE

    Christinaki, Eirini; Triantafyllidis, Georgios; Vidakis, Nikolaos

    2013-01-01

    The recognition of facial expressions is important for the perception of emotions. Understanding emotions is essential in human communication and social interaction. Children with autism have been reported to exhibit deficits in the recognition of affective expressions. With the appropriate intervention, elimination of those deficits can be achieved. Interventions are proposed to start as early as possible. Computer-based programs have been widely used with success to teach people with autism...

  1. Associations between facial emotion recognition and young adolescents' behaviors in bullying.

    Directory of Open Access Journals (Sweden)

    Tiziana Pozzoli

Full Text Available This study investigated whether the different behaviors young adolescents can enact during bullying episodes were associated with their ability to recognize morphed facial expressions of the six basic emotions, expressed at high and low intensity. The sample included 117 middle-school students (45.3% girls; mean age = 12.4 years) who filled in a peer nomination questionnaire and individually performed a computerized emotion recognition task. Bayesian generalized mixed-effects models showed a complex picture, in which type and intensity of emotions, students' behavior and gender interacted in explaining recognition accuracy. Results are discussed with a particular focus on negative emotions, suggesting a "neutral" nature of emotion recognition ability, which does not necessarily lead to moral behavior but can also be used for pursuing immoral goals.

  2. Emotion Recognition from EEG Signals Using Multidimensional Information in EMD Domain.

    Science.gov (United States)

    Zhuang, Ning; Zeng, Ying; Tong, Li; Zhang, Chi; Zhang, Hanming; Yan, Bin

    2017-01-01

This paper introduces a method for feature extraction and emotion recognition based on empirical mode decomposition (EMD). Using EMD, EEG signals are decomposed into Intrinsic Mode Functions (IMFs) automatically. Multidimensional information from the IMFs is utilized as features: the first difference of the time series, the first difference of phase, and the normalized energy. The performance of the proposed method is verified on a publicly available emotional database. The results show that the three features are effective for emotion recognition. The role of each IMF is examined, and we find that the high-frequency component IMF1 has a significant effect on the detection of different emotional states. The informative electrodes based on the EMD strategy are analyzed. In addition, the classification accuracy of the proposed method is compared with several classical techniques, including fractal dimension (FD), sample entropy, differential entropy, and discrete wavelet transform (DWT). Experimental results on the DEAP dataset demonstrate that our method can improve emotion recognition performance.
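
    Given IMFs from an EMD implementation (e.g. the PyEMD package; the sifting itself is not shown), the three per-IMF features named in the abstract can be computed roughly as below. The choice of mean absolute difference as the summary statistic is an assumption, not taken from the paper:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def imf_features(imf):
        # Three per-IMF features: first difference of the time series,
        # first difference of the instantaneous phase (via the Hilbert
        # transform), and normalized energy.
        dt = np.mean(np.abs(np.diff(imf)))
        phase = np.unwrap(np.angle(hilbert(imf)))
        dphi = np.mean(np.abs(np.diff(phase)))
        energy = np.sum(imf ** 2) / len(imf)
        return np.array([dt, dphi, energy])

    # A clean 10 Hz oscillation stands in here for the first IMF of an
    # EEG channel (in practice the IMFs come from the EMD sifting).
    t = np.linspace(0, 1, 512, endpoint=False)
    imf1 = np.sin(2 * np.pi * 10 * t)
    feats = imf_features(imf1)
    ```

    Stacking these triples over the IMFs of each electrode yields the feature vector that is then passed to a classifier.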

  3. Associations between facial emotion recognition and young adolescents’ behaviors in bullying

    Science.gov (United States)

    Gini, Gianluca; Altoè, Gianmarco

    2017-01-01

This study investigated whether the different behaviors young adolescents can enact during bullying episodes were associated with their ability to recognize morphed facial expressions of the six basic emotions, expressed at high and low intensity. The sample included 117 middle-school students (45.3% girls; mean age = 12.4 years) who filled in a peer nomination questionnaire and individually performed a computerized emotion recognition task. Bayesian generalized mixed-effects models showed a complex picture, in which type and intensity of emotions, students’ behavior and gender interacted in explaining recognition accuracy. Results are discussed with a particular focus on negative emotions, suggesting a “neutral” nature of emotion recognition ability, which does not necessarily lead to moral behavior but can also be used for pursuing immoral goals. PMID:29131871

  4. Extraction Of Audio Features For Emotion Recognition System Based On Music

    Directory of Open Access Journals (Sweden)

    Kee Moe Han

    2015-08-01

Full Text Available Music is the combination of melody, linguistic information and the vocalist's emotion. Since music is a work of art, analyzing emotion in music by computer is a difficult task. Many approaches have been developed to detect the emotions included in music, but the results are not satisfactory because emotion is very complex. In this paper, evaluations of audio features extracted from music files are presented. The extracted features are used to classify the different emotion classes of the vocalists. Musical feature extraction is done using the Music Information Retrieval (MIR) toolbox. A database of 100 music clips is used to classify the emotions perceived in the clips. Music may contain many emotions according to the vocalist's mood, such as happy, sad, nervous, bored, peaceful, etc. In this paper, the audio features related to the emotions of the vocalists are extracted for use in an emotion recognition system based on music.
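
    As an illustration of this kind of audio feature extraction, the sketch below computes two common descriptors of the sort MIRtoolbox provides, RMS energy (intensity) and spectral centroid (brightness), in plain numpy. The frame and hop sizes are arbitrary choices, not taken from the paper:

    ```python
    import numpy as np

    def frame_features(signal, sr, frame=1024, hop=512):
        # Per-frame RMS energy and spectral centroid over a Hann-windowed
        # short-time spectrum; returns an array of (rms, centroid) rows.
        feats = []
        for start in range(0, len(signal) - frame + 1, hop):
            x = signal[start:start + frame] * np.hanning(frame)
            rms = np.sqrt(np.mean(x ** 2))
            spec = np.abs(np.fft.rfft(x))
            freqs = np.fft.rfftfreq(frame, d=1.0 / sr)
            centroid = np.sum(freqs * spec) / (np.sum(spec) + 1e-12)
            feats.append((rms, centroid))
        return np.array(feats)

    # A pure 440 Hz tone: its spectral centroid should sit near 440 Hz.
    sr = 22050
    t = np.arange(sr) / sr
    tone = np.sin(2 * np.pi * 440 * t)
    feats = frame_features(tone, sr)
    # Clip-level vector for the classifier: mean and std over frames.
    clip_vector = np.concatenate([feats.mean(axis=0), feats.std(axis=0)])
    ```

    Pooling frame statistics into one clip-level vector, as in the last line, is the usual way such features feed an emotion classifier.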

  5. Age differences in right-wing authoritarianism and their relation to emotion recognition.

    Science.gov (United States)

    Ruffman, Ted; Wilson, Marc; Henry, Julie D; Dawson, Abigail; Chen, Yan; Kladnitski, Natalie; Myftari, Ella; Murray, Janice; Halberstadt, Jamin; Hunter, John A

    2016-03-01

This study examined the correlates of right-wing authoritarianism (RWA) in older adults. Participants were given tasks measuring emotion recognition, executive functions and fluid IQ and questionnaires measuring RWA, perceived threat and social dominance orientation. Study 1 established higher age-related RWA across the age span in more than 2,600 New Zealanders. Studies 2 to 4 found that threat, education, social dominance and age all predicted unique variance in older adults' RWA, but the most consistent predictor was emotion recognition, predicting unique variance in older adults' RWA independent of all other variables. We argue that older adults' worse emotion recognition is associated with a more general change in social judgment. Expression of extreme attitudes (right- or left-wing) has the potential to antagonize others, but worse emotion recognition means that subtle signals will not be perceived, making the expression of extreme attitudes more likely. Our findings are consistent with other studies showing that worsening emotion recognition underlies age-related declines in verbosity, understanding of social gaffes, and ability to detect lies. Such results indicate that emotion recognition is a core social insight linked to many aspects of social cognition. (c) 2016 APA, all rights reserved.

  6. Facial emotion recognition in male antisocial personality disorders with or without adult attention deficit hyperactivity disorder.

    Science.gov (United States)

    Bagcioglu, Erman; Isikli, Hasmet; Demirel, Husrev; Sahin, Esat; Kandemir, Eyup; Dursun, Pinar; Yuksek, Erhan; Emul, Murat

    2014-07-01

We aimed to investigate facial emotion recognition abilities in violent individuals with antisocial personality disorder (ASPD), with or without comorbid attention deficit hyperactivity disorder (ADHD). Photos of happy, surprised, fearful, sad, angry, disgusted, and neutral facial expressions and the Wender Utah Rating Scale were administered to all groups. The mean ages were 22.0 ± 1.59 in antisocial personality disorder with ADHD, 21.90 ± 1.80 in pure antisocial individuals, and 22.97 ± 2.85 in controls (p>0.05). The mean Wender Utah Rating Scale score differed significantly between groups. Recognition accuracy did not differ significantly between groups (p>0.05), except for disgusted faces, recognition of which was significantly impaired in the ASPD+ADHD and pure ASPD groups. Antisocial individuals with attention deficit and hyperactivity spent significantly more time on each facial emotion than healthy controls, and pure antisocial individuals took more time to recognize disgusted and neutral faces than healthy controls; recognition times were also compared between pure antisocial individuals and antisocial individuals with ADHD. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Deficits in recognition, identification, and discrimination of facial emotions in patients with bipolar disorder.

    Science.gov (United States)

    Benito, Adolfo; Lahera, Guillermo; Herrera, Sara; Muncharaz, Ramón; Benito, Guillermo; Fernández-Liria, Alberto; Montes, José Manuel

    2013-01-01

    To analyze the recognition, identification, and discrimination of facial emotions in a sample of outpatients with bipolar disorder (BD). Forty-four outpatients with diagnosis of BD and 48 matched control subjects were selected. Both groups were assessed with tests for recognition (Emotion Recognition-40 - ER40), identification (Facial Emotion Identification Test - FEIT), and discrimination (Facial Emotion Discrimination Test - FEDT) of facial emotions, as well as a theory of mind (ToM) verbal test (Hinting Task). Differences between groups were analyzed, controlling the influence of mild depressive and manic symptoms. Patients with BD scored significantly lower than controls on recognition (ER40), identification (FEIT), and discrimination (FEDT) of emotions. Regarding the verbal measure of ToM, a lower score was also observed in patients compared to controls. Patients with mild syndromal depressive symptoms obtained outcomes similar to patients in euthymia. A significant correlation between FEDT scores and global functioning (measured by the Functioning Assessment Short Test, FAST) was found. These results suggest that, even in euthymia, patients with BD experience deficits in recognition, identification, and discrimination of facial emotions, with potential functional implications.

  8. The Relationship between Emotion Recognition Ability and Social Skills in Young Children with Autism

    Science.gov (United States)

    Williams, Beth T.; Gray, Kylie M.

    2013-01-01

    This study assessed the relationship between emotion recognition ability and social skills in 42 young children with autistic disorder aged 4-7 years. The analyses revealed that accuracy in recognition of sadness, but not happiness, anger or fear, was associated with higher ratings on the Vineland-II Socialization domain, above and beyond the…

  9. Teaching emotion recognition skills to young children with autism: a randomised controlled trial of an emotion training programme.

    Science.gov (United States)

    Williams, Beth T; Gray, Kylie M; Tonge, Bruce J

    2012-12-01

    Children with autism have difficulties in emotion recognition and a number of interventions have been designed to target these problems. However, few emotion training interventions have been trialled with young children with autism and co-morbid ID. This study aimed to evaluate the efficacy of an emotion training programme for a group of young children with autism with a range of intellectual ability. Participants were 55 children with autistic disorder, aged 4-7 years (FSIQ 42-107). Children were randomly assigned to an intervention (n = 28) or control group (n = 27). Participants in the intervention group watched a DVD designed to teach emotion recognition skills to children with autism (the Transporters), whereas the control group watched a DVD of Thomas the Tank Engine. Participants were assessed on their ability to complete basic emotion recognition tasks, mindreading and theory of mind (TOM) tasks before and after the 4-week intervention period, and at 3-month follow-up. Analyses controlled for the effect of chronological age, verbal intelligence, gender and DVD viewing time on outcomes. Children in the intervention group showed improved performance in the recognition of anger compared with the control group, with few improvements maintained at 3-month follow-up. There was no generalisation of skills to TOM or social skills. The Transporters programme showed limited efficacy in teaching basic emotion recognition skills to young children with autism with a lower range of cognitive ability. Improvements were limited to the recognition of expressions of anger, with poor maintenance of these skills at follow-up. These findings provide limited support for the efficacy of the Transporters programme for young children with autism of a lower cognitive range. © 2012 The Authors. Journal of Child Psychology and Psychiatry © 2012 Association for Child and Adolescent Mental Health.

  10. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias

    OpenAIRE

Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emo...

  11. Basic and complex emotion recognition in children with autism: cross-cultural findings.

    Science.gov (United States)

    Fridenson-Hayo, Shimrit; Berggren, Steve; Lassalle, Amandine; Tal, Shahar; Pigat, Delia; Bölte, Sven; Baron-Cohen, Simon; Golan, Ofer

    2016-01-01

    Children with autism spectrum conditions (ASC) have emotion recognition deficits when tested in different expression modalities (face, voice, body). However, these findings usually focus on basic emotions, using one or two expression modalities. In addition, cultural similarities and differences in emotion recognition patterns in children with ASC have not been explored before. The current study examined the similarities and differences in the recognition of basic and complex emotions by children with ASC and typically developing (TD) controls across three cultures: Israel, Britain, and Sweden. Fifty-five children with high-functioning ASC, aged 5-9, were compared to 58 TD children. On each site, groups were matched on age, sex, and IQ. Children were tested using four tasks, examining recognition of basic and complex emotions from voice recordings, videos of facial and bodily expressions, and emotional video scenarios including all modalities in context. Compared to their TD peers, children with ASC showed emotion recognition deficits in both basic and complex emotions on all three modalities and their integration in context. Complex emotions were harder to recognize, compared to basic emotions for the entire sample. Cross-cultural agreement was found for all major findings, with minor deviations on the face and body tasks. Our findings highlight the multimodal nature of ER deficits in ASC, which exist for basic as well as complex emotions and are relatively stable cross-culturally. Cross-cultural research has the potential to reveal both autism-specific universal deficits and the role that specific cultures play in the way empathy operates in different countries.

  12. Variability in the impairments of recognition memory in patients with frontal lobe lesions

    OpenAIRE

    Bastin, Christine; Van der Linden, Martial; Lekeu, Françoise; Andrés, Pilar; Salmon, Eric

    2006-01-01

    Fourteen patients with frontal lobe lesions and 14 normal subjects were tested on a recognition memory task that required discriminating between target words, new words that are synonyms of the targets and unrelated distractors. A deficit was found in 12 of the patients. Moreover, three different patterns of recognition impairment were identified: (I) poor memory for targets, (II) normal hits but increased false recognitions for both types of distractors, (III) normal hit rates, but increased...

  13. A small-world network model of facial emotion recognition.

    Science.gov (United States)

    Takehara, Takuma; Ochiai, Fumio; Suzuki, Naoto

    2016-01-01

    Various models have been proposed to increase understanding of the cognitive basis of facial emotions. Despite those efforts, interactions between facial emotions have received minimal attention. If collective behaviours relating to each facial emotion in the comprehensive cognitive system could be assumed, specific facial emotion relationship patterns might emerge. In this study, we demonstrate that the frameworks of complex networks can effectively capture those patterns. We generate 81 facial emotion images (6 prototypes and 75 morphs) and then ask participants to rate degrees of similarity in 3240 facial emotion pairs in a paired comparison task. A facial emotion network constructed on the basis of similarity clearly forms a small-world network, which features an extremely short average network distance and close connectivity. Further, even if two facial emotions have opposing valences, they are connected within only two steps. In addition, we show that intermediary morphs are crucial for maintaining full network integration, whereas prototypes are not at all important. These results suggest the existence of collective behaviours in the cognitive systems of facial emotions and also describe why people can efficiently recognize facial emotions in terms of information transmission and propagation. For comparison, we construct three simulated networks--one based on the categorical model, one based on the dimensional model, and one random network. The results reveal that small-world connectivity in facial emotion networks is apparently different from those networks, suggesting that a small-world network is the most suitable model for capturing the cognitive basis of facial emotions.
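
    The two small-world signatures the study tests for, high clustering and short average path length, can be computed directly from an unweighted graph obtained by thresholding a similarity matrix. The ratings and threshold below are synthetic stand-ins for the 3240 paired-comparison judgments:

    ```python
    import numpy as np
    from collections import deque

    def clustering_coefficient(adj):
        # Average local clustering: the fraction of each node's
        # neighbour pairs that are themselves connected.
        n = len(adj)
        coeffs = []
        for i in range(n):
            nbrs = np.flatnonzero(adj[i])
            k = len(nbrs)
            if k < 2:
                coeffs.append(0.0)
                continue
            links = adj[np.ix_(nbrs, nbrs)].sum() / 2
            coeffs.append(links / (k * (k - 1) / 2))
        return float(np.mean(coeffs))

    def average_path_length(adj):
        # Mean shortest-path length over all connected pairs, via BFS
        # from every node (unweighted, undirected graph).
        n = len(adj)
        total, pairs = 0, 0
        for s in range(n):
            dist = {s: 0}
            q = deque([s])
            while q:
                u = q.popleft()
                for v in np.flatnonzero(adj[u]):
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        q.append(v)
            total += sum(d for node, d in dist.items() if node != s)
            pairs += len(dist) - 1
        return total / pairs

    # Similarity ratings -> unweighted graph: connect emotion pairs whose
    # rated similarity exceeds a threshold, then measure both signatures.
    rng = np.random.default_rng(2)
    sim = rng.random((20, 20)); sim = (sim + sim.T) / 2
    adj = (sim > 0.6).astype(int); np.fill_diagonal(adj, 0)
    C, L = clustering_coefficient(adj), average_path_length(adj)
    ```

    A small-world verdict then comes from comparing C and L against the same metrics on a degree-matched random graph, as the study does with its simulated categorical, dimensional, and random networks.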

  14. Investigating emotion recognition and empathy deficits in Conduct Disorder using behavioural and eye-tracking methods

    OpenAIRE

Martin-Key, Nayra Anna

    2017-01-01

    The aim of this thesis was to characterise the nature of the emotion recognition and empathy deficits observed in male and female adolescents with Conduct Disorder (CD) and varying levels of callous-unemotional (CU) traits. The first two experiments employed behavioural tasks with concurrent eye-tracking methods to explore the mechanisms underlying facial and body expression recognition deficits. Having CD and being male independently predicted poorer facial expression recognition across all ...

  15. Exploring cultural differences in the recognition of the self-conscious emotions

    NARCIS (Netherlands)

    Chung, J.M.H.; Robins, R.W.

    2015-01-01

    Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little

  16. Facial Emotion Recognition in Children with High Functioning Autism and Children with Social Phobia

    Science.gov (United States)

    Wong, Nina; Beidel, Deborah C.; Sarver, Dustin E.; Sims, Valerie

    2012-01-01

    Recognizing facial affect is essential for effective social functioning. This study examines emotion recognition abilities in children aged 7-13 years with High Functioning Autism (HFA = 19), Social Phobia (SP = 17), or typical development (TD = 21). Findings indicate that all children identified certain emotions more quickly (e.g., happy [less…

  17. Abnormal Facial Emotion Recognition in Depression: Serial Testing in an Ultra-Rapid-Cycling Patient.

    Science.gov (United States)

    George, Mark S.; Huggins, Teresa; McDermut, Wilson; Parekh, Priti I.; Rubinow, David; Post, Robert M.

    1998-01-01

    Mood disorder subjects have a selective deficit in recognizing human facial emotion. Whether the facial emotion recognition errors persist during normal mood states (i.e., are state vs. trait dependent) was studied in one male bipolar II patient. Results of five sessions are presented and discussed. (Author/EMK)

  18. A multimodal approach to emotion recognition ability in autism spectrum disorders

    NARCIS (Netherlands)

    Jones, C.R.G.; Pickles, A.; Falcaro, M.; Marsden, A.J.S.; Happé, F.; Scott, S.K.; Sauter, D.; Tregay, J.; Phillips, R.J.; Baird, G.; Simonoff, E.; Charman, T.

    2011-01-01

    Background:  Autism spectrum disorders (ASD) are characterised by social and communication difficulties in day-to-day life, including problems in recognising emotions. However, experimental investigations of emotion recognition ability in ASD have been equivocal, hampered by small sample sizes,

  19. Psilocybin with psychological support improves emotional face recognition in treatment-resistant depression.

    Science.gov (United States)

    Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L

    2018-02-01

Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study is to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions compared with controls; after psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.

  20. Emotion recognition from speech by combining databases and fusion of classifiers

    NARCIS (Netherlands)

    Lefter, I.; Rothkrantz, L.J.M.; Wiggers, P.; Leeuwen, D.A. van

    2010-01-01

We explore possibilities for enhancing the generality, portability and robustness of emotion recognition systems by combining databases and by fusion of classifiers. In a first experiment, we investigate the performance of an emotion detection system tested on a certain database given that it is

  1. Emotion recognition abilities across stimulus modalities in schizophrenia and the role of visual attention.

    Science.gov (United States)

    Simpson, Claire; Pinkham, Amy E; Kelsven, Skylar; Sasson, Noah J

    2013-12-01

    Emotion can be expressed by both the voice and face, and previous work suggests that presentation modality may impact emotion recognition performance in individuals with schizophrenia. We investigated the effect of stimulus modality on emotion recognition accuracy and the potential role of visual attention to faces in emotion recognition abilities. Thirty-one patients who met DSM-IV criteria for schizophrenia (n=8) or schizoaffective disorder (n=23) and 30 non-clinical control individuals participated. Both groups identified emotional expressions in three different conditions: audio only, visual only, combined audiovisual. In the visual only and combined conditions, time spent visually fixating salient features of the face were recorded. Patients were significantly less accurate than controls in emotion recognition during both the audio and visual only conditions but did not differ from controls on the combined condition. Analysis of visual scanning behaviors demonstrated that patients attended less than healthy individuals to the mouth in the visual condition but did not differ in visual attention to salient facial features in the combined condition, which may in part explain the absence of a deficit for patients in this condition. Collectively, these findings demonstrate that patients benefit from multimodal stimulus presentations of emotion and support hypotheses that visual attention to salient facial features may serve as a mechanism for accurate emotion identification. © 2013.

  2. Emotion and memory: a recognition advantage for positive and negative words independent of arousal.

    Science.gov (United States)

    Adelman, James S; Estes, Zachary

    2013-12-01

    Much evidence indicates that emotion enhances memory, but the precise effects of the two primary factors of arousal and valence remain at issue. Moreover, the current knowledge of emotional memory enhancement is based mostly on small samples of extremely emotive stimuli presented in unnaturally high proportions without adequate affective, lexical, and semantic controls. To investigate how emotion affects memory under conditions of natural variation, we tested whether arousal and valence predicted recognition memory for over 2500 words that were not sampled for their emotionality, and we controlled a large variety of lexical and semantic factors. Both negative and positive stimuli were remembered better than neutral stimuli, whether arousing or calming. Arousal failed to predict recognition memory, either independently or interactively with valence. Results support models that posit a facilitative role of valence in memory. This study also highlights the importance of stimulus controls and experimental designs in research on emotional memory. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Self-reported emotional dysregulation but no impairment of emotional intelligence in borderline personality disorder: an explorative study.

    Science.gov (United States)

    Beblo, Thomas; Pastuszak, Anna; Griepenstroh, Julia; Fernando, Silvia; Driessen, Martin; Schütz, Astrid; Rentzsch, Katrin; Schlosser, Nicole

    2010-05-01

    Emotional dysfunction is a key feature of patients with borderline personality disorder (BPD), but emotional intelligence (EI) has rarely been investigated in this sample. This study investigated ability EI, general intelligence, and self-reported emotion regulation in BPD. We included 19 patients with BPD and 20 healthy control subjects in the study. EI was assessed by means of the Mayer-Salovey-Caruso Emotional Intelligence Test and the Test of Emotional Intelligence. For the assessment of general intelligence, we administered the multidimensional "Leistungsprüfsystem-Kurzversion." The Emotion Regulation Questionnaire and the Difficulties in Emotion Regulation Scale were used to assess emotion regulation. The patients with BPD did not exhibit impairments of ability EI or general intelligence but reported severe impairments in emotion regulation. Ability EI was related both to general intelligence (patients and controls) and to self-reported emotion regulation (patients). In conclusion, emotional dysfunction in BPD might primarily affect self-perceived behavior rather than abilities. Intense negative emotions in everyday life may trigger dysfunctional emotion regulation strategies in BPD, although patients possess sufficient theoretical knowledge about optimal regulation strategies.

  4. Emotion Recognition by Body Movement Representation on the Manifold of Symmetric Positive Definite Matrices

    OpenAIRE

    Daoudi, Mohamed; Berretti, Stefano; Pala, Pietro; Delevoye, Yvonne; Bimbo, Alberto

    2017-01-01

    Emotion recognition is attracting great interest for its potential application in a multitude of real-life situations. Much of the Computer Vision research in this field has focused on relating emotions to facial expressions, with investigations rarely including more than the upper body. In this work, we propose a new scenario, in which emotional states are related to the 3D dynamics of whole-body motion. To address the complexity of human body movement, we used covarianc...
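The covariance representation this record alludes to can be sketched as follows: a motion sequence is summarized by the covariance matrix of its per-frame pose features, which lives on the manifold of symmetric positive definite (SPD) matrices, and sequences are compared with a matrix-log distance. The feature dimensionality and the log-Euclidean comparison below are illustrative assumptions, not a reproduction of the paper's pipeline:

```python
import numpy as np
from scipy.linalg import logm

def covariance_descriptor(X, eps=1e-6):
    """X: (T, d) sequence of d-dim body-pose features over T frames.
    Returns a d x d SPD covariance matrix (regularized to stay positive definite)."""
    C = np.cov(X, rowvar=False)
    return C + eps * np.eye(X.shape[1])

def log_euclidean_dist(C1, C2):
    """Log-Euclidean distance between SPD matrices: Frobenius norm of log difference."""
    return np.linalg.norm(logm(C1) - logm(C2), ord='fro')

# Hypothetical sequences: 100 frames of 6-dimensional pose features each
rng = np.random.default_rng(0)
seq_a = rng.normal(size=(100, 6))
seq_b = rng.normal(size=(100, 6)) + 0.5
d = log_euclidean_dist(covariance_descriptor(seq_a),
                       covariance_descriptor(seq_b))
```

A nearest-neighbor or kernel classifier built on such distances is one common way to label sequences with emotional states.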

  5. ERP Correlates of Target-Distracter Differentiation in Repeated Runs of a Continuous Recognition Task with Emotional and Neutral Faces

    Science.gov (United States)

    Treese, Anne-Cecile; Johansson, Mikael; Lindgren, Magnus

    2010-01-01

    The emotional salience of faces has previously been shown to induce memory distortions in recognition memory tasks. This event-related potential (ERP) study used repeated runs of a continuous recognition task with emotional and neutral faces to investigate emotion-induced memory distortions. In the second and third runs, participants made more…

  6. Specific Patterns of Emotion Recognition from Faces in Children with ASD: Results of a Cross-Modal Matching Paradigm

    Science.gov (United States)

    Golan, Ofer; Gordon, Ilanit; Fichman, Keren; Keinan, Giora

    2018-01-01

    Children with ASD show emotion recognition difficulties, as part of their social communication deficits. We examined facial emotion recognition (FER) in intellectually disabled children with ASD and in younger typically developing (TD) controls, matched on mental age. Our emotion-matching paradigm employed three different modalities: facial, vocal…

  7. Spoken Word Recognition in Adolescents with Autism Spectrum Disorders and Specific Language Impairment

    Science.gov (United States)

    Loucas, Tom; Riches, Nick; Baird, Gillian; Pickles, Andrew; Simonoff, Emily; Chandler, Susie; Charman, Tony

    2013-01-01

    Spoken word recognition, during gating, appears intact in specific language impairment (SLI). This study used gating to investigate the process in adolescents with autism spectrum disorders plus language impairment (ALI). Adolescents with ALI, SLI, and typical language development (TLD), matched on nonverbal IQ listened to gated words that varied…

  8. Theory of Mind, Emotion Recognition and Social Perception in Individuals at Clinical High Risk for Psychosis: findings from the NAPLS-2 cohort.

    Science.gov (United States)

    Barbato, Mariapaola; Liu, Lu; Cadenhead, Kristin S; Cannon, Tyrone D; Cornblatt, Barbara A; McGlashan, Thomas H; Perkins, Diana O; Seidman, Larry J; Tsuang, Ming T; Walker, Elaine F; Woods, Scott W; Bearden, Carrie E; Mathalon, Daniel H; Heinssen, Robert; Addington, Jean

    2015-09-01

    Social cognition, the mental operations that underlie social interactions, is a major construct to investigate in schizophrenia. Impairments in social cognition are present before the onset of psychosis, and even in unaffected first-degree relatives, suggesting that social cognition may be a trait marker of the illness. In a large cohort of individuals at clinical high risk for psychosis (CHR) and healthy controls, three domains of social cognition (theory of mind, facial emotion recognition and social perception) were assessed to clarify which domains are impaired in this population. Six hundred and seventy-five CHR individuals and 264 controls, who were part of the multi-site North American Prodrome Longitudinal Study, completed The Awareness of Social Inference Test, the Penn Emotion Recognition task, the Penn Emotion Differentiation task, and the Relationship Across Domains, measures of theory of mind, facial emotion recognition, and social perception, respectively. Social cognition was not related to positive and negative symptom severity, but was associated with age and IQ. CHR individuals demonstrated poorer performance on all measures of social cognition. However, after controlling for age and IQ, the group differences remained significant for measures of theory of mind and social perception, but not for facial emotion recognition. Theory of mind and social perception are impaired in individuals at CHR for psychosis. Age and IQ seem to play an important role in the emergence of deficits in facial affect recognition. Future studies should examine the stability of social cognition deficits over time and their role, if any, in the development of psychosis.

  9. Arousal Rather than Basic Emotions Influence Long-Term Recognition Memory in Humans.

    Science.gov (United States)

    Marchewka, Artur; Wypych, Marek; Moslehi, Abnoos; Riegel, Monika; Michałowski, Jarosław M; Jednoróg, Katarzyna

    2016-01-01

    Emotion can influence various cognitive processes; however, its impact on memory has traditionally been studied over relatively short retention periods and in line with dimensional models of affect. The present study aimed to investigate emotional effects on long-term recognition memory according to a combined framework of affective dimensions and basic emotions. Images selected from the Nencki Affective Picture System were rated on scales of affective dimensions and basic emotions. After 6 months, subjects took part in a surprise recognition test during an fMRI session. The more negative the pictures, the better they were remembered, but also the more false recognitions they provoked. Similar effects were found for the arousal dimension. Recognition success was greater for pictures with lower intensity of happiness and with higher intensity of surprise, sadness, fear, and disgust. Consecutive fMRI analyses showed significant activation for remembered (recognized) vs. forgotten (not recognized) images in the anterior cingulate and bilateral anterior insula, as well as in the bilateral caudate nuclei and right thalamus. Further, arousal was found to be the only subjective rating significantly modulating brain activation. Higher subjective arousal evoked higher activation associated with memory recognition in the right caudate and the left cingulate gyrus. Notably, no significant modulation was observed for other subjective ratings, including basic emotion intensities. These results emphasize the crucial role of arousal for long-term recognition memory and support the hypothesis that the memorized material, over time, becomes stored in a distributed cortical network including the core salience network and basal ganglia.

  10. Adjunctive selective estrogen receptor modulator increases neural activity in the hippocampus and inferior frontal gyrus during emotional face recognition in schizophrenia.

    Science.gov (United States)

    Ji, E; Weickert, C S; Lenroot, R; Kindler, J; Skilleter, A J; Vercammen, A; White, C; Gur, R E; Weickert, T W

    2016-05-03

    Estrogen has been implicated in the development and course of schizophrenia with most evidence suggesting a neuroprotective effect. Treatment with raloxifene, a selective estrogen receptor modulator, can reduce symptom severity, improve cognition and normalize brain activity during learning in schizophrenia. People with schizophrenia are especially impaired in the identification of negative facial emotions. The present study was designed to determine the extent to which adjunctive raloxifene treatment would alter abnormal neural activity during angry facial emotion recognition in schizophrenia. Twenty people with schizophrenia (12 men, 8 women) participated in a 13-week, randomized, double-blind, placebo-controlled, crossover trial of adjunctive raloxifene treatment (120 mg per day orally) and performed a facial emotion recognition task during functional magnetic resonance imaging after each treatment phase. Two-sample t-tests in regions of interest selected a priori were performed to assess activation differences between raloxifene and placebo conditions during the recognition of angry faces. Adjunctive raloxifene significantly increased activation in the right hippocampus and left inferior frontal gyrus compared with the placebo condition (family-wise error corrected) during angry facial emotion recognition in schizophrenia. These findings support the hypothesis that estrogen plays a modifying role in schizophrenia and show that adjunctive raloxifene treatment may reverse abnormal neural activity during facial emotion recognition, which is relevant to impaired social functioning in men and women with schizophrenia.

  11. Perirhinal Cortex Muscarinic Receptor Blockade Impairs Taste Recognition Memory Formation

    OpenAIRE

    Gutiérrez, Ranier; De la Cruz, Vanesa; Rodriguez-Ortiz, Carlos J.; Bermudez-Rattoni, Federico

    2004-01-01

    The relevance of perirhinal cortical cholinergic and glutamatergic neurotransmission for taste recognition memory and learned taste aversion was assessed by microinfusions of muscarinic (scopolamine), NMDA (AP-5), and AMPA (NBQX) receptor antagonists. Infusions of scopolamine, but not AP5 or NBQX, prevented the consolidation of taste recognition memory using attenuation of neophobia as an index. In addition, learned taste aversion in both short- and long-term memory tests was exclusively impa...

  12. A New Fuzzy Cognitive Map Learning Algorithm for Speech Emotion Recognition

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2017-01-01

    Selecting an appropriate recognition method is crucial in speech emotion recognition applications. However, current methods do not consider the relationships between emotions. Thus, in this study, a speech emotion recognition system based on the fuzzy cognitive map (FCM) approach is constructed. Moreover, a new FCM learning algorithm for speech emotion recognition is proposed. This algorithm uses the pleasure-arousal-dominance emotion scale to calculate the weights between emotions and certain mathematical derivations to determine the network structure. The proposed algorithm can handle a large number of concepts, whereas a typical FCM can handle only relatively simple networks (maps). Different acoustic features, including fundamental speech features and a new spectral feature, are extracted to evaluate the performance of the proposed method. Three experiments are conducted in this paper, namely, a single-feature experiment, a feature-combination experiment, and a comparison between the proposed algorithm and typical networks. All experiments are performed on the TYUT2.0 and EMO-DB databases. Results of the feature-combination experiments show that the recognition rates of the combined features are 10%–20% better than those of single features. The proposed FCM learning algorithm yields a 5%–20% performance improvement compared with traditional classification networks.
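The inference step of a fuzzy cognitive map — repeatedly propagating concept activations through a signed weight matrix and squashing with a sigmoid — can be sketched as below. This is not the paper's learning algorithm; the three concepts and their weights are illustrative assumptions:

```python
import numpy as np

def fcm_infer(W, a0, steps=50, lam=1.0):
    """Iterate an FCM to a fixed point: a_{t+1} = sigmoid(lam * W @ a_t).
    W[i, j] is the causal weight from concept j to concept i."""
    a = np.asarray(a0, dtype=float)
    for _ in range(steps):
        a_next = 1.0 / (1.0 + np.exp(-lam * (W @ a)))
        if np.allclose(a_next, a, atol=1e-6):
            break
        a = a_next
    return a

# Hypothetical 3-concept map (e.g. anger, sadness, neutral) with
# mutually inhibitory weights — purely illustrative values
W = np.array([[ 0.0, -0.4, -0.2],
              [-0.3,  0.0, -0.1],
              [-0.2, -0.2,  0.0]])
state = fcm_infer(W, [0.9, 0.1, 0.2])
```

In a speech-emotion setting, the initial activations would come from acoustic-feature classifiers and the converged state would give the final emotion scores.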

  13. Mapping structural covariance networks of facial emotion recognition in early psychosis: A pilot study.

    Science.gov (United States)

    Buchy, Lisa; Barbato, Mariapaola; Makowski, Carolina; Bray, Signe; MacMaster, Frank P; Deighton, Stephanie; Addington, Jean

    2017-11-01

    People with psychosis show deficits recognizing facial emotions and disrupted activation in the underlying neural circuitry. We evaluated associations between facial emotion recognition and cortical thickness using a correlation-based approach to map structural covariance networks across the brain. Fifteen people with an early psychosis provided magnetic resonance scans and completed the Penn Emotion Recognition and Differentiation tasks. Fifteen historical controls provided magnetic resonance scans. Cortical thickness was computed using CIVET and analyzed with linear models. Seed-based structural covariance analysis was done using the mapping anatomical correlations across the cerebral cortex methodology. To map structural covariance networks involved in facial emotion recognition, the right somatosensory cortex and bilateral fusiform face areas were selected as seeds. Statistics were run in SurfStat. Findings showed greater cortical covariance between the right fusiform face area seed and the right orbitofrontal cortex in controls than in early psychosis subjects. Facial emotion recognition scores were not significantly associated with thickness in any region. A negative effect of Penn Differentiation scores on cortical covariance was seen between the left fusiform face area seed and the right superior parietal lobule in early psychosis subjects. Results suggest that facial emotion recognition ability is related to covariance in a temporal-parietal network in early psychosis. Copyright © 2017 Elsevier B.V. All rights reserved.
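Seed-based structural covariance amounts to correlating a seed region's thickness with every other region's thickness across subjects. A minimal sketch, with subject count, region count, and data all hypothetical:

```python
import numpy as np

def structural_covariance(thickness, seed_idx):
    """thickness: (n_subjects, n_regions) cortical-thickness matrix.
    Returns the Pearson correlation of the seed region with every region,
    computed across subjects."""
    seed = thickness[:, seed_idx]
    tc = thickness - thickness.mean(axis=0)   # center each region
    sc = seed - seed.mean()                   # center the seed
    num = tc.T @ sc
    den = np.sqrt((tc ** 2).sum(axis=0) * (sc ** 2).sum())
    return num / den

# Hypothetical data: 15 subjects x 20 cortical regions, thickness in mm
rng = np.random.default_rng(0)
thick = rng.normal(2.5, 0.2, size=(15, 20))
r = structural_covariance(thick, seed_idx=3)
```

Group differences in such covariance maps (e.g. controls vs. early psychosis) are then tested with interaction terms in a linear model rather than by comparing the raw correlations directly.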

  14. Impaired processing of self-face recognition in anorexia nervosa.

    Science.gov (United States)

    Hirot, France; Lesage, Marine; Pedron, Lya; Meyer, Isabelle; Thomas, Pierre; Cottencin, Olivier; Guardia, Dewi

    2016-03-01

    Body image disturbances and massive weight loss are major clinical symptoms of anorexia nervosa (AN). The aim of the present study was to examine the influence of body changes and eating attitudes on self-face recognition ability in AN. Twenty-seven subjects suffering from AN and 27 control participants performed a self-face recognition task (SFRT). During the task, digital morphs between their own face and a gender-matched unfamiliar face were presented in a random sequence. Participants' self-face recognition failures, cognitive flexibility, body concern and eating habits were assessed with the Self-Face Recognition Questionnaire (SFRQ), Trail Making Test (TMT), Body Shape Questionnaire (BSQ) and Eating Disorder Inventory-2 (EDI-2), respectively. Subjects suffering from AN exhibited significantly greater difficulties than control participants in identifying their own face (p = 0.028). No significant difference was observed between the two groups for TMT (all p > 0.1, non-significant). Regarding predictors of self-face recognition skills, there was a negative correlation between SFRT and body mass index (p = 0.01) and a positive correlation between SFRQ and EDI-2, suggesting that eating disorder severity is associated with impaired self-face recognition.

  15. Recognition Memory Is Impaired in Children after Prolonged Febrile Seizures

    Science.gov (United States)

    Martinos, Marina M.; Yoong, Michael; Patil, Shekhar; Chin, Richard F. M.; Neville, Brian G.; Scott, Rod C.; de Haan, Michelle

    2012-01-01

    Children with a history of a prolonged febrile seizure show signs of acute hippocampal injury on magnetic resonance imaging. In addition, animal studies have shown that adult rats that suffered febrile seizures during development reveal memory impairments. Together, these lines of evidence suggest that memory impairments related to hippocampal…

  16. Influence of oxytocin on emotion recognition from body language: A randomized placebo-controlled trial.

    Science.gov (United States)

    Bernaerts, Sylvie; Berra, Emmely; Wenderoth, Nicole; Alaerts, Kaat

    2016-10-01

    The neuropeptide 'oxytocin' (OT) is known to play a pivotal role in a variety of complex social behaviors by promoting a prosocial attitude and interpersonal bonding. One mechanism by which OT is hypothesized to promote prosocial behavior is by enhancing the processing of socially relevant information from the environment. With the present study, we explored to what extent OT can alter the 'reading' of emotional body language as presented by impoverished biological motion point-light displays (PLDs). To do so, a double-blind, between-subjects, randomized placebo-controlled trial was conducted, assessing performance on a bodily emotion recognition task in healthy adult males before and after a single dose of intranasal OT (24 IU). Overall, a single dose of OT had a significant effect of medium size on emotion recognition from body language. OT-induced improvements in emotion recognition were not differentially modulated by the emotional valence of the presented stimuli (positive versus negative), and the overall tendency to label an observed emotional state as 'happy' (positive) or 'angry' (negative) was not modified by the administration of OT. Albeit moderate, the present findings of OT-induced improvements in bodily emotion recognition from whole-body PLDs provide further support for a link between OT and the processing of socio-communicative cues originating from the body of others. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Darwin revisited: The vagus nerve is a causal element in controlling recognition of other's emotions.

    Science.gov (United States)

    Colzato, Lorenza S; Sellaro, Roberta; Beste, Christian

    2017-07-01

    Charles Darwin proposed that, via the vagus nerve, the tenth cranial nerve, emotional facial expressions are evolved, adaptive and serve a crucial communicative function. In line with this idea, the later-developed polyvagal theory assumes that the vagus nerve is the key phylogenetic substrate regulating emotional and social behavior, and that optimal social interaction, which includes the recognition of emotion in faces, is modulated by the vagus nerve. So far, in humans, it has not been demonstrated that the vagus plays a causal role in emotion recognition. To investigate this we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that modulates brain activity via bottom-up mechanisms. A sham/placebo-controlled, randomized, cross-over, within-subjects design was used to infer a causal relation between the stimulated vagus nerve and the related ability to recognize emotions, as indexed by the Reading the Mind in the Eyes Test, in 38 healthy young volunteers. Active tVNS, compared to sham stimulation, enhanced emotion recognition for easy items, suggesting that it promoted the ability to decode salient social cues. Our results confirm that the vagus nerve is causally involved in emotion recognition, supporting Darwin's argumentation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Facial emotion recognition in Chinese with schizophrenia at early and chronic stages of illness.

    Science.gov (United States)

    Leung, Joey Shuk-Yan; Lee, Tatia M C; Lee, Chi-Chiu

    2011-12-30

    Deficits in facial emotion recognition have been recognised in Chinese patients diagnosed with schizophrenia. This study examined the relationship between chronicity of illness and performance of facial emotion recognition in Chinese with schizophrenia. There were altogether four groups of subjects matched for age and gender composition. The first and second groups comprised medically stable outpatients with first-episode schizophrenia (n=50) and their healthy controls (n=26). The third and fourth groups were patients with chronic schizophrenic illness (n=51) and their controls (n=28). The ability to recognise the six prototypical facial emotions was examined using locally validated coloured photographs from the Japanese and Caucasian Facial Expressions of Emotion. Chinese patients with schizophrenia, in both the first-episode and chronic stages, performed significantly worse than their control counterparts on overall facial emotion recognition. The deficit in recognition of facial emotion did not appear to have worsened over the course of disease progression, suggesting that recognition of facial emotion is a rather stable trait of the illness. The emotion-specific deficit may have implications for understanding the social difficulties in schizophrenia. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Emotion recognition in preschool children: associations with maternal depression and early parenting.

    Science.gov (United States)

    Kujawa, Autumn; Dougherty, Lea; Durbin, C Emily; Laptook, Rebecca; Torpey, Dana; Klein, Daniel N

    2014-02-01

    Emotion knowledge in childhood has been shown to predict social functioning and psychological well-being, but relatively little is known about parental factors that influence its development in early childhood. There is some evidence that both parenting behavior and maternal depression are associated with emotion recognition, but previous research has only examined these factors independently. The current study assessed auditory and visual emotion recognition ability among a large sample of preschool children to examine typical emotion recognition skills in children of this age, as well as the independent and interactive effects of maternal and paternal depression and negative parenting (i.e., hostility and intrusiveness). Results indicated that children were most accurate at identifying happy emotional expressions. The lowest accuracy was observed for neutral expressions. A significant interaction was found between maternal depression and negative parenting behavior: children with a maternal history of depression were particularly sensitive to the negative effects of maladaptive parenting behavior on emotion recognition ability. No significant effects were found for paternal depression. These results highlight the importance of examining the effects of multiple interacting factors on children's emotional development and provide suggestions for identifying children for targeted preventive interventions.

  20. The effects of alcohol on the recognition of facial expressions and microexpressions of emotion: enhanced recognition of disgust and contempt.

    Science.gov (United States)

    Felisberti, Fatima; Terry, Philip

    2015-09-01

    The study compared alcohol's effects on the recognition of briefly displayed facial expressions of emotion (so-called microexpressions) with expressions presented for a longer period. Using a repeated-measures design, we tested 18 participants three times (counterbalanced), after (i) a placebo drink, (ii) a low-to-moderate dose of alcohol (0.17 g/kg women; 0.20 g/kg men) and (iii) a moderate-to-high dose of alcohol (0.52 g/kg women; 0.60 g/kg men). On each session, participants were presented with stimuli representing six emotions (happiness, sadness, anger, fear, disgust and contempt) overlaid on a generic avatar in a six-alternative forced-choice paradigm. A neutral expression (1 s) preceded and followed a target expression presented for 200 ms (microexpressions) or 400 ms. Participants mouse clicked the correct answer. The recognition of disgust was significantly better after the high dose of alcohol than after the low dose or placebo drinks at both durations of stimulus presentation. A similar profile of effects was found for the recognition of contempt. There were no effects on response latencies. Alcohol can increase sensitivity to expressions of disgust and contempt. Such effects are not dependent on stimulus duration up to 400 ms and may reflect contextual modulation of alcohol's effects on emotion recognition. Copyright © 2015 John Wiley & Sons, Ltd.

  1. Gender differences in the relationship between social communication and emotion recognition.

    Science.gov (United States)

    Kothari, Radha; Skuse, David; Wakefield, Justin; Micali, Nadia

    2013-11-01

    To investigate the association between autistic traits and emotion recognition in a large community sample of children using facial and social motion cues, additionally stratifying by gender. A general population sample of 3,666 children from the Avon Longitudinal Study of Parents and Children (ALSPAC) were assessed on their ability to correctly recognize emotions using the faces subtest of the Diagnostic Analysis of Non-Verbal Accuracy, and the Emotional Triangles Task, a novel test assessing recognition of emotion from social motion cues. Children with autistic-like social communication difficulties, as assessed by the Social Communication Disorders Checklist, were compared with children without such difficulties. Autistic-like social communication difficulties were associated with poorer recognition of emotion from social motion cues in both genders, but were associated with poorer facial emotion recognition in boys only (odds ratio = 1.9, 95% CI = 1.4, 2.6, p = .0001). This finding must be considered in light of lower power to detect differences in girls. In this community sample of children, greater deficits in social communication skills are associated with poorer discrimination of emotions, implying there may be an underlying continuum of liability to the association between these characteristics. As a similar degree of association was observed in both genders on a novel test of social motion cues, the relatively good performance of girls on the more familiar task of facial emotion discrimination may be due to compensatory mechanisms. Our study might indicate the existence of a cognitive process by which girls with underlying autistic traits can compensate for their covert deficits in emotion recognition, although this would require further investigation. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  2. An exploratory study on emotion recognition in patients with a clinically isolated syndrome and multiple sclerosis.

    Science.gov (United States)

    Jehna, Margit; Neuper, Christa; Petrovic, Katja; Wallner-Blazek, Mirja; Schmidt, Reinhold; Fuchs, Siegrid; Fazekas, Franz; Enzinger, Christian

    2010-07-01

    Multiple sclerosis (MS) is a chronic multifocal CNS disorder which can affect higher-order cognitive processes. Whereas cognitive disturbances in MS are increasingly well characterised, recognition of emotional facial expressions (EFE) has rarely been tested, despite its importance for adequate social behaviour. We tested 20 patients with a clinically isolated syndrome suggestive of MS (CIS) or MS and 23 healthy controls (HC) for the ability to differentiate between emotional facial stimuli, controlling for the influence of depressive mood (ADS-L). We screened for cognitive dysfunction using the Faces Symbol Test (FST). The patients demonstrated significantly decreased reaction times on the emotion recognition tests compared to HC. However, the results also suggested worse cognitive abilities in the patients. Emotional and cognitive test results were correlated. This exploratory pilot study suggests that emotion recognition deficits might be prevalent in MS. However, future studies will be needed to overcome the limitations of this study. Copyright 2010 Elsevier B.V. All rights reserved.

  3. Cross-cultural emotional prosody recognition: evidence from Chinese and British listeners.

    Science.gov (United States)

    Paulmann, Silke; Uskul, Ayse K

    2014-01-01

    This cross-cultural study of emotional tone of voice recognition tests the in-group advantage hypothesis (Elfenbein & Ambady, 2002) employing a quasi-balanced design. Individuals of Chinese and British background were asked to recognise pseudosentences produced by Chinese and British native speakers, displaying one of seven emotions (anger, disgust, fear, happy, neutral tone of voice, sad, and surprise). Findings reveal that emotional displays were recognised at rates higher than predicted by chance; however, members of each cultural group were more accurate in recognising the displays communicated by a member of their own cultural group than those of a member of the other cultural group. Moreover, the evaluation of error matrices indicates that both culture groups relied on similar mechanisms when recognising emotional displays from the voice. Overall, the study reveals evidence for both universal and culture-specific principles in vocal emotion recognition.

  4. Test battery for measuring the perception and recognition of facial expressions of emotion

    Science.gov (United States)

    Wilhelm, Oliver; Hildebrandt, Andrea; Manske, Karsten; Schacht, Annekathrin; Sommer, Werner

    2014-01-01

    Despite the importance of perceiving and recognizing facial expressions in everyday life, there is no comprehensive test battery for the multivariate assessment of these abilities. As a first step toward such a compilation, we present 16 tasks that measure the perception and recognition of facial emotion expressions, and data illustrating each task's difficulty and reliability. The scoring of these tasks focuses on either the speed or accuracy of performance. A sample of 269 healthy young adults completed all tasks. In general, accuracy and reaction time measures for emotion-general scores showed acceptable and high estimates of internal consistency and factor reliability. Emotion-specific scores yielded lower reliabilities, yet high enough to encourage further studies with such measures. Analyses of task difficulty revealed that all tasks are suitable for measuring emotion perception and emotion recognition related abilities in normal populations. PMID:24860528

  5. Processing of emotional facial expressions in Korsakoff's syndrome.

    NARCIS (Netherlands)

    Montagne, B.; Kessels, R.P.C.; Wester, A.J.; Haan, E.H.F. de

    2006-01-01

    Interpersonal contacts depend to a large extent on understanding emotional facial expressions of others. Several neurological conditions may affect proficiency in emotional expression recognition. It has been shown that chronic alcoholics are impaired in labelling emotional expressions.

  6. Emotional valence of stimuli modulates false recognition: Using a modified version of the simplified conjoint recognition paradigm.

    Science.gov (United States)

    Gong, Xianmin; Xiao, Hongrui; Wang, Dahua

    2016-11-01

    False recognition results from the interplay of multiple cognitive processes, including verbatim memory, gist memory, phantom recollection, and response bias. In the current study, we modified the simplified Conjoint Recognition (CR) paradigm to investigate the way in which the valence of emotional stimuli affects the cognitive processes and behavioral outcomes of false recognition. In Study 1, we examined the applicability of the modification to the simplified CR paradigm and model. Twenty-six undergraduate students (13 females, aged 21.00 ± 2.30 years) learned and recognized both large and small categories of photo objects. The applicability of the paradigm and model was confirmed by a fair goodness-of-fit of the model to the observational data and by their competence in detecting the memory differences between the large- and small-category conditions. In Study 2, we recruited another sample of 29 undergraduate students (14 females, aged 22.60 ± 2.74 years) to learn and recognize categories of photo objects that were emotionally provocative. The results showed that negative valence increased false recognition, particularly the rate of false "remember" responses, by facilitating phantom recollection; positive valence did not significantly influence false recognition, though it enhanced gist processing. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Emotion Recognition of Weblog Sentences Based on an Ensemble Algorithm of Multi-label Classification and Word Emotions

    Science.gov (United States)

    Li, Ji; Ren, Fuji

    Weblogs have greatly changed the ways people communicate. Affective analysis of blog posts is valuable for many applications such as text-to-speech synthesis or computer-assisted recommendation. Traditional emotion recognition in text based on single-label classification cannot satisfy the higher requirements of affective computing. In this paper, the automatic identification of sentence emotion in weblogs is modeled as a multi-label text categorization task. Experiments are carried out on 12,273 blog sentences from the Chinese emotion corpus Ren_CECps with 8-dimension emotion annotation. An ensemble algorithm, RAKEL, is used to recognize dominant emotions from the writer's perspective. Our emotion feature, which uses a detailed intensity representation for word emotions, outperforms the other main features such as the word frequency feature and the traditional lexicon-based feature. In order to deal with relatively complex sentences, we integrate grammatical characteristics of punctuation, disjunctive connectives, modification relations and negation into the features. This achieves 13.51% and 12.49% increases in micro-averaged F1 and macro-averaged F1, respectively, compared to the traditional lexicon-based feature. The results show that a multiple-dimension emotion representation with grammatical features can efficiently classify sentence emotion in a multi-label problem.
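As a rough sketch of the multi-label setup and the micro/macro-averaged F1 evaluation this record describes: RAKEL itself ships with scikit-multilearn, so sklearn's OneVsRestClassifier (plain binary relevance) stands in here, and the tiny corpus and three-emotion label set are invented.

```python
# Minimal multi-label text-emotion sketch (binary relevance stands in
# for RAKEL; corpus and labels are invented for illustration).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.metrics import f1_score
import numpy as np

sentences = ["I love this sunny day", "I love it but it scares me",
             "This is terrifying", "So happy and surprised today",
             "What a scary surprise", "Pure joy, nothing else"]
# label columns: joy, fear, surprise (a sentence may carry several)
Y = np.array([[1, 0, 0], [1, 1, 0], [0, 1, 0],
              [1, 0, 1], [0, 1, 1], [1, 0, 0]])

X = TfidfVectorizer().fit_transform(sentences)
clf = OneVsRestClassifier(LogisticRegression()).fit(X, Y)
pred = clf.predict(X)

# the two averaging modes reported in the abstract
print("micro-F1:", f1_score(Y, pred, average="micro"))
print("macro-F1:", f1_score(Y, pred, average="macro"))
```

Micro-averaging pools label decisions before computing F1, while macro-averaging computes F1 per emotion and then averages, so rare emotions weigh more in the macro score.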

  8. The Relationship between Child Maltreatment and Emotion Recognition

    Science.gov (United States)

    Koizumi, Michiko; Takagishi, Haruto

    2014-01-01

    Child abuse and neglect affect the development of social cognition in children and inhibit social adjustment. The purpose of this study was to compare the ability to identify the emotional states of others between abused and non-abused children. The participants, 129 children (44 abused and 85 non-abused children), completed a children’s version of the Reading the Mind in the Eyes Test (RMET). Results showed that the mean accuracy rate on the RMET for abused children was significantly lower than the rate of the non-abused children. In addition, the accuracy rates for positive emotion items (e.g., hoping, interested, happy) were significantly lower for the abused children, but negative emotion and neutral items were not different across the groups. This study found a negative relationship between child abuse and the ability to understand others’ emotions, especially positive emotions. PMID:24465891

  9. The relationship between child maltreatment and emotion recognition.

    Directory of Open Access Journals (Sweden)

    Michiko Koizumi

    Full Text Available Child abuse and neglect affect the development of social cognition in children and inhibit social adjustment. The purpose of this study was to compare the ability to identify the emotional states of others between abused and non-abused children. The participants, 129 children (44 abused and 85 non-abused children), completed a children's version of the Reading the Mind in the Eyes Test (RMET). Results showed that the mean accuracy rate on the RMET for abused children was significantly lower than the rate of the non-abused children. In addition, the accuracy rates for positive emotion items (e.g., hoping, interested, happy) were significantly lower for the abused children, but negative emotion and neutral items were not different across the groups. This study found a negative relationship between child abuse and the ability to understand others' emotions, especially positive emotions.

  10. Emotional Faces in Context: Age Differences in Recognition Accuracy and Scanning Patterns

    Science.gov (United States)

    Noh, Soo Rim; Isaacowitz, Derek M.

    2014-01-01

    While age-related declines in facial expression recognition are well documented, previous research relied mostly on isolated faces devoid of context. We investigated the effects of context on age differences in recognition of facial emotions and in visual scanning patterns of emotional faces. While their eye movements were monitored, younger and older participants viewed facial expressions (i.e., anger, disgust) in contexts that were emotionally congruent, incongruent, or neutral to the facial expression to be identified. Both age groups had highest recognition rates of facial expressions in the congruent context, followed by the neutral context, and recognition rates in the incongruent context were worst. These context effects were more pronounced for older adults. Compared to younger adults, older adults exhibited a greater benefit from congruent contextual information, regardless of facial expression. Context also influenced the pattern of visual scanning characteristics of emotional faces in a similar manner across age groups. In addition, older adults initially attended more to context overall. Our data highlight the importance of considering the role of context in understanding emotion recognition in adulthood. PMID:23163713

  11. Emotional facial expressions differentially influence predictions and performance for face recognition.

    Science.gov (United States)

    Nomi, Jason S; Rhodes, Matthew G; Cleary, Anne M

    2013-01-01

    This study examined how participants' predictions of future memory performance are influenced by emotional facial expressions. Participants made judgements of learning (JOLs) predicting the likelihood that they would correctly identify a face displaying a happy, angry, or neutral emotional expression in a future two-alternative forced-choice recognition test of identity (i.e., recognition that a person's face was seen before). JOLs were higher for studied faces with happy and angry emotional expressions than for neutral faces. However, neutral test faces with studied neutral expressions had significantly higher identity recognition rates than neutral test faces studied with happy or angry expressions. Thus, these data are the first to demonstrate that people believe happy and angry emotional expressions will lead to better identity recognition in the future relative to neutral expressions. This occurred despite the fact that neutral expressions elicited better identity recognition than happy and angry expressions. These findings contribute to the growing literature examining the interaction of cognition and emotion.

  12. Impact of civil war on emotion recognition: the denial of sadness in Sierra Leone.

    Directory of Open Access Journals (Sweden)

    Umiltà, Maria Alessandra

    2013-09-01

    Full Text Available Studies of children with atypical emotional experience demonstrate that childhood exposure to high levels of hostility and threat biases emotion perception. This study investigates emotion processing in former child soldiers and non-combatant civilians. All participants have experienced prolonged violence exposure during childhood. The study, carried out in Sierra Leone, aimed to examine the effects of exposure to and forced participation in acts of extreme violence on the emotion processing of young adult war survivors. A total of 76 young, male adults (38 former child soldier survivors and 38 civilian survivors) were tested in order to assess participants' ability to identify four different facial emotion expressions from photographs and movies. Both groups were able to recognize facial expressions of emotion. However, despite their general ability to correctly identify facial emotions, participants showed a significant response bias in their recognition of sadness. Both former soldiers and civilians made more errors in identifying expressions of sadness than in the other three emotions, and when mislabeling sadness participants most often described it as anger. Conversely, when making erroneous identifications of other emotions, participants were most likely to label the expressed emotion as sadness. In addition, while for three of the four emotions participants were better able to make a correct identification the greater the intensity of the expression, this pattern was not observed for sadness. During movie presentation the recognition of sadness was significantly worse for soldiers. While both former child soldiers and civilians were found to be able to identify facial emotions, a significant response bias in their attribution of negative emotions was observed. Such bias was particularly pronounced in former child soldiers. These findings point to a pervasive long-lasting effect of childhood exposure to violence on emotion processing in later life.

  13. Impact of civil war on emotion recognition: the denial of sadness in Sierra Leone.

    Science.gov (United States)

    Umiltà, Maria Alessandra; Wood, Rachel; Loffredo, Francesca; Ravera, Roberto; Gallese, Vittorio

    2013-01-01

    Studies of children with atypical emotional experience demonstrate that childhood exposure to high levels of hostility and threat biases emotion perception. This study investigates emotion processing in former child soldiers and non-combatant civilians. All participants have experienced prolonged violence exposure during childhood. The study, carried out in Sierra Leone, aimed to examine the effects of exposure to and forced participation in acts of extreme violence on the emotion processing of young adult war survivors. A total of 76 young, male adults (38 former child soldier survivors and 38 civilian survivors) were tested in order to assess participants' ability to identify four different facial emotion expressions from photographs and movies. Both groups were able to recognize facial expressions of emotion. However, despite their general ability to correctly identify facial emotions, participants showed a significant response bias in their recognition of sadness. Both former soldiers and civilians made more errors in identifying expressions of sadness than in the other three emotions, and when mislabeling sadness participants most often described it as anger. Conversely, when making erroneous identifications of other emotions, participants were most likely to label the expressed emotion as sadness. In addition, while for three of the four emotions participants were better able to make a correct identification the greater the intensity of the expression, this pattern was not observed for sadness. During movie presentation the recognition of sadness was significantly worse for soldiers. While both former child soldiers and civilians were found to be able to identify facial emotions, a significant response bias in their attribution of negative emotions was observed. Such bias was particularly pronounced in former child soldiers. These findings point to a pervasive long-lasting effect of childhood exposure to violence on emotion processing in later life.

  14. MDMA enhances "mind reading" of positive emotions and impairs "mind reading" of negative emotions.

    Science.gov (United States)

    Hysek, Cédric M; Domes, Gregor; Liechti, Matthias E

    2012-07-01

    3,4-Methylenedioxymethamphetamine (MDMA, ecstasy) increases sociability. The prosocial effects of MDMA may result from the release of the "social hormone" oxytocin and associated alterations in the processing of socioemotional stimuli. We investigated the effects of MDMA (125 mg) on the ability to infer the mental states of others from social cues of the eye region in the Reading the Mind in the Eyes Test. The study included 48 healthy volunteers (24 men, 24 women) and used a double-blind, placebo-controlled, within-subjects design. A choice reaction time test was used to exclude impairments in psychomotor function. We also measured circulating oxytocin and cortisol levels and subjective drug effects. MDMA differentially affected mind reading depending on the emotional valence of the stimuli. MDMA enhanced the accuracy of mental state decoding for positive stimuli (e.g., friendly), impaired mind reading for negative stimuli (e.g., hostile), and had no effect on mind reading for neutral stimuli (e.g., reflective). MDMA did not affect psychomotor performance, increased circulating oxytocin and cortisol levels, and produced subjective prosocial effects, including feelings of being more open, talkative, and closer to others. The shift in the ability to correctly read socioemotional information toward stimuli associated with positive emotional valence, together with the prosocial feelings elicited by MDMA, may enhance social approach behavior and sociability when MDMA is used recreationally and facilitate therapeutic relationships in MDMA-assisted psychotherapeutic settings.

  15. Perirhinal Cortex Muscarinic Receptor Blockade Impairs Taste Recognition Memory Formation

    Science.gov (United States)

    Gutierrez, Ranier; De la Cruz, Vanesa; Rodriguez-Ortiz, Carlos J.; Bermudez-Rattoni, Federico

    2004-01-01

    The relevance of perirhinal cortical cholinergic and glutamatergic neurotransmission for taste recognition memory and learned taste aversion was assessed by microinfusions of muscarinic (scopolamine), NMDA (AP-5), and AMPA (NBQX) receptor antagonists. Infusions of scopolamine, but not AP5 or NBQX, prevented the consolidation of taste recognition…

  16. Achillea millefolium Aqueous Extract does not Impair Recognition ...

    African Journals Online (AJOL)

    Purpose: To investigate the effect of the aqueous extract of Achillea millefolium on recognition memory in mice. Methods: Male mice (35) were used. The aqueous extract of A. millefolium was prepared using a Soxhlet apparatus and injected intraperitoneally in a dose of 50, 250, 500 or 1000 mg/kg daily for 20 days.

  17. Gender differences in facial emotion recognition in persons with chronic schizophrenia.

    Science.gov (United States)

    Weiss, Elisabeth M; Kohler, Christian G; Brensinger, Colleen M; Bilker, Warren B; Loughead, James; Delazer, Margarete; Nolan, Karen A

    2007-03-01

    The aim of the present study was to investigate possible sex differences in the recognition of facial expressions of emotion and to investigate the pattern of classification errors in schizophrenic males and females. Such an approach provides an opportunity to inspect the degree to which males and females differ in perceiving and interpreting the different emotions displayed to them and to analyze which emotions are most susceptible to recognition errors. Fifty-six chronically hospitalized schizophrenic patients (38 men and 18 women) completed the Penn Emotion Recognition Test (ER40), a computerized emotion discrimination test presenting 40 color photographs of evoked happy, sad, angry, and fearful expressions and neutral expressions, balanced for poser gender and ethnicity. We found a significant sex difference in the patterns of error rates on the Penn Emotion Recognition Test. Neutral faces were more commonly mistaken as angry by schizophrenic men, whereas schizophrenic women misinterpreted neutral faces more frequently as sad. Moreover, female faces were better recognized overall, but fear was better recognized in same-gender photographs, whereas anger was better recognized in different-gender photographs. The findings of the present study lend support to the notion that sex differences in aggressive behavior could be related to a cognitive style characterized by hostile attributions to neutral faces in schizophrenic men.

  18. Identifying at-risk states beyond positive symptoms: a brief task assessing how neurocognitive impairments impact on misrepresentation of the social world through blunted emotional appraisal.

    Science.gov (United States)

    Galdos, Mariana; Simons, Claudia J P; Wichers, Marieke; Fernandez-Rivas, Aranzazu; Martinez-Azumendi, Oscar; Lataster, Tineke; Amer, Guillermo; Myin-Germeys, Inez; Gonzalez-Torres, Miguel Angel; van Os, Jim

    2011-10-01

    Neurocognitive impairments observed in psychotic disorder may impact on emotion recognition and theory of mind, resulting in altered understanding of the social world. Early intervention efforts would be served by further elucidation of this mechanism. Patients with a psychotic disorder (n=30) and a reference control group (n=310) were asked to offer emotional appraisals of images of social situations (EASS task). The degree to which case-control differences in appraisals were mediated by neurocognitive alterations was analyzed. The EASS task displayed convergent and discriminant validity. Compared to controls, patients displayed blunted emotional appraisal of social situations (B=0.52, 95% CI: 0.30, 0.74, P<0.001). Neurocognitive impairments in psychotic disorder may underlie misrepresentation of the social world, mediated by altered emotion recognition. A task assessing the social impact of cognitive alterations in clinical practice may be useful in detecting key alterations very early in the course of psychotic illness.

  19. Emotion Recognition Through Body Language Using RGB-D Sensor

    DEFF Research Database (Denmark)

    Kiforenko, Lilita; Kraft, Dirk

    2016-01-01

    by various visual stimuli. We present the emotion dataset that is recorded using Microsoft Kinect for Windows sensor and body joints rotation angles that are extracted using Microsoft Kinect Software Development Kit 1.6. The classified emotions are curiosity, confusion, joy, boredom and disgust. We show...... joint rotation angles and meta-features that are fed into a Support Vector Machines classifier. The work of Gaber-Barron and Si (2012) is used as inspiration and many of their proposed meta-features are reimplemented or modified. In this work we try to identify ”basic” human emotions, that are triggered...
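A minimal sketch of the kind of pipeline this record implies: joint rotation angles per frame are reduced to simple summary meta-features and fed to a Support Vector Machines classifier. The three meta-features chosen here (mean, standard deviation, range per joint) are illustrative stand-ins for the richer Gaber-Barron and Si feature set, and all data is synthetic.

```python
# Sketch: Kinect-style joint-angle sequences -> meta-features -> SVM.
# Feature choices and the two toy "emotion" classes are assumptions.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def meta_features(seq):
    """seq: (frames, joints) rotation angles -> flat feature vector."""
    return np.concatenate([seq.mean(0), seq.std(0),
                           seq.max(0) - seq.min(0)])

def clip(amplitude):
    # 30 frames x 10 joints of motion around rest pose
    return rng.normal(0.0, amplitude, size=(30, 10))

# two toy classes: low-motion (boredom-like) vs high-motion (joy-like)
X = np.array([meta_features(clip(a)) for a in [0.1] * 20 + [1.0] * 20])
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))  # training accuracy on the toy data
```

The point of the meta-feature step is to turn variable-length motion sequences into fixed-size vectors that a standard SVM can consume.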

  20. Effect of Time Delay on Recognition Memory for Pictures: The Modulatory Role of Emotion

    Science.gov (United States)

    Wang, Bo

    2014-01-01

    This study investigated the modulatory role of emotion in the effect of time delay on recognition memory for pictures. Participants viewed neutral, positive and negative pictures, and took a recognition memory test 5 minutes, 24 hours, or 1 week after learning. The findings are: 1) For neutral, positive and negative pictures, overall recognition accuracy in the 5-min delay did not significantly differ from that in the 24-h delay. For neutral and positive pictures, overall recognition accuracy in the 1-week delay was lower than in the 24-h delay; for negative pictures, overall recognition in the 24-h and 1-week delay did not significantly differ. Therefore negative emotion modulates the effect of time delay on recognition memory, maintaining retention of overall recognition accuracy only within a certain frame of time. 2) For the three types of pictures, recollection and familiarity in the 5-min delay did not significantly differ from that in the 24-h and the 1-week delay. Thus emotion does not appear to modulate the effect of time delay on recollection and familiarity. However, recollection in the 24-h delay was higher than in the 1-week delay, whereas familiarity in the 24-h delay was lower than in the 1-week delay. PMID:24971457

  1. Emotion-Specific Priming: Congruence Effects on Affect and Recognition across Negative Emotions.

    Science.gov (United States)

    Hansen, Christine H.; Shantz, Cynthia A.

    1995-01-01

    Demonstrated the emotion-specific priming effects of negatively valenced emotions (anger, sadness, and fear) in a divided attention task. Results indicated that a negative emotion displayed by a target that matched the emotion induced by a priming manipulation was significantly stronger than an incongruous priming manipulation and displayed…

  2. Cognitive impairment is undetected in medical inpatients: a study of mortality and recognition amongst healthcare professionals

    Directory of Open Access Journals (Sweden)

    Torisson Gustav

    2012-08-01

    Full Text Available Abstract Background Detecting cognitive impairment in medical inpatients is important due to its association with adverse outcomes. Our aim was to study recognition of cognitive impairment and its association with mortality. Methods 200 inpatients aged over 60 years were recruited at the Department of General Internal Medicine at University Hospital MAS in Malmö, Sweden. The MMSE (Mini-Mental State Examination) and the CDT (Clock-Drawing Test) were performed and related to recognition rates by patients, staff physicians, nurses and informants. The impact of abnormal cognitive test results on mortality was studied using a multivariable Cox proportional hazards regression. Results 55 patients (28%) had no cognitive impairment while 68 patients (34%) had 1 abnormal test result (on MMSE or CDT) and 77 patients (39%) had 2 abnormal test results. Recognition by healthcare professionals was 12% in the group with 1 abnormal test and 44-64% in the group with 2 abnormal test results. In our model, cognitive impairment predicted 12-month mortality with a hazard ratio (95% CI) of 2.86 (1.28-6.39) for the group with 1 abnormal cognitive test and 3.39 (1.54-7.45) for the group with 2 abnormal test results. Conclusions Cognitive impairment is frequent in medical inpatients and associated with increased mortality. Recognition rates of cognitive impairment need to be improved in hospitals.

  3. Analysis of physiological signals for recognition of boredom, pain, and surprise emotions.

    Science.gov (United States)

    Jang, Eun-Hye; Park, Byoung-Jun; Park, Mi-Sook; Kim, Sang-Hyeob; Sohn, Jin-Hun

    2015-06-18

    The aim of the study was to examine the differences among boredom, pain, and surprise, and to propose approaches for emotion recognition based on physiological signals. The three emotions were induced through the presentation of emotional stimuli, and electrocardiography (ECG), electrodermal activity (EDA), skin temperature (SKT), and photoplethysmography (PPG) signals were measured to collect a dataset from 217 participants while they experienced the emotions. Twenty-seven physiological features were extracted from the signals to classify the three emotions. Discriminant function analysis (DFA), a statistical method, and five machine learning algorithms (linear discriminant analysis (LDA), classification and regression trees (CART), self-organizing maps (SOM), the Naïve Bayes algorithm, and support vector machines (SVM)) were used for classifying the emotions. The results show that the differences in physiological responses among the emotions are significant for heart rate (HR), skin conductance level (SCL), skin conductance response (SCR), mean skin temperature (meanSKT), blood volume pulse (BVP), and pulse transit time (PTT), and the highest recognition accuracy of 84.7% is obtained by using DFA. This study demonstrates the differences among boredom, pain, and surprise and identifies the best emotion recognizer for the classification of the three emotions using physiological signals.
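The classification step this record describes can be sketched as follows, assuming a 27-dimensional feature vector per trial. The physiological features are simulated Gaussian stand-ins (the real HR/SCL/SCR values come from the study's recordings), and LDA is shown in place of the paper's full comparison of six classifiers.

```python
# Hedged sketch: LDA over 27 simulated "physiological" features
# for three emotion classes (boredom, pain, surprise).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_per_class, n_features = 70, 27
class_means = [0.0, 0.8, 1.6]  # synthetic mean shifts per emotion

X = np.vstack([rng.normal(m, 1.0, size=(n_per_class, n_features))
               for m in class_means])
y = np.repeat([0, 1, 2], n_per_class)  # 0=boredom, 1=pain, 2=surprise

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()
print(f"mean 5-fold CV accuracy: {acc:.3f}")
```

Cross-validated accuracy, rather than training accuracy, is the honest analogue of the recognition rates reported in such studies.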

  4. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    Directory of Open Access Journals (Sweden)

    Kris Evers

    2014-01-01

    Full Text Available Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  5. No differences in emotion recognition strategies in children with autism spectrum disorder: evidence from hybrid faces.

    Science.gov (United States)

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  6. The Effects of Anxiety on the Recognition of Multisensory Emotional Cues with Different Cultural Familiarity

    Directory of Open Access Journals (Sweden)

    Ai Koizumi

    2011-10-01

    Full Text Available Anxious individuals have been shown to interpret others' facial expressions negatively. However, whether this negative interpretation bias depends on the modality and familiarity of emotional cues remains largely unknown. We examined whether trait anxiety affects recognition of multisensory emotional cues (i.e., face and voice), which were expressed by actors from either the same or a different cultural background as the participants (i.e., familiar in-group and unfamiliar out-group). The dynamic face and voice cues of the same actors were synchronized, and conveyed either congruent (e.g., happy face and voice) or incongruent emotions (e.g., happy face and angry voice). Participants were to indicate the perceived emotion in one of the cues, while ignoring the other. The results showed that when recognizing emotions of in-group actors, highly anxious individuals, compared with low anxious ones, were more likely to interpret others' emotions in a negative manner, putting more weight on the to-be-ignored angry cues. This interpretation bias was found regardless of the cue modality. However, when recognizing emotions of out-group actors, low and high anxious individuals showed no difference in the interpretation of emotions irrespective of modality. These results suggest that trait anxiety affects recognition of emotional expressions in a modality-independent yet cultural-familiarity-dependent manner.

  7. Emotion Recognition from Chinese Speech for Smart Affective Services Using a Combination of SVM and DBN

    Science.gov (United States)

    Zhu, Lianzhang; Chen, Leiming; Zhao, Dehai

    2017-01-01

    Accurate emotion recognition from speech is important for applications like smart health care, smart entertainment, and other smart services. High accuracy emotion recognition from Chinese speech is challenging due to the complexities of the Chinese language. In this paper, we explore how to improve the accuracy of speech emotion recognition, including speech signal feature extraction and emotion classification methods. Five types of features are extracted from a speech sample: mel frequency cepstrum coefficient (MFCC), pitch, formant, short-term zero-crossing rate and short-term energy. By comparing statistical features with deep features extracted by a Deep Belief Network (DBN), we attempt to find the best features to identify the emotion status for speech. We propose a novel classification method that combines DBN and SVM (support vector machine) instead of using only one of them. In addition, a conjugate gradient method is applied to train DBN in order to speed up the training process. Gender-dependent experiments are conducted using an emotional speech database created by the Chinese Academy of Sciences. The results show that DBN features can reflect emotion status better than artificial features, and our new classification approach achieves an accuracy of 95.8%, which is higher than using either DBN or SVM separately. Results also show that DBN can work very well for small training databases if it is properly designed. PMID:28737705

  8. Emotion Recognition from Chinese Speech for Smart Affective Services Using a Combination of SVM and DBN.

    Science.gov (United States)

    Zhu, Lianzhang; Chen, Leiming; Zhao, Dehai; Zhou, Jiehan; Zhang, Weishan

    2017-07-24

    Accurate emotion recognition from speech is important for applications like smart health care, smart entertainment, and other smart services. High accuracy emotion recognition from Chinese speech is challenging due to the complexities of the Chinese language. In this paper, we explore how to improve the accuracy of speech emotion recognition, including speech signal feature extraction and emotion classification methods. Five types of features are extracted from a speech sample: mel frequency cepstrum coefficient (MFCC), pitch, formant, short-term zero-crossing rate and short-term energy. By comparing statistical features with deep features extracted by a Deep Belief Network (DBN), we attempt to find the best features to identify the emotion status for speech. We propose a novel classification method that combines DBN and SVM (support vector machine) instead of using only one of them. In addition, a conjugate gradient method is applied to train DBN in order to speed up the training process. Gender-dependent experiments are conducted using an emotional speech database created by the Chinese Academy of Sciences. The results show that DBN features can reflect emotion status better than artificial features, and our new classification approach achieves an accuracy of 95.8%, which is higher than using either DBN or SVM separately. Results also show that DBN can work very well for small training databases if it is properly designed.
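A hedged sketch of the DBN-features-into-SVM combination described in the two records above: a full DBN stacks several RBMs and is fine-tuned, but sklearn's single BernoulliRBM feeding an SVC captures the shape of the pipeline. The 39-dimensional "MFCC-like" inputs and the two-class setup are invented for illustration; real inputs would be MFCC, pitch, formant, zero-crossing-rate, and energy features extracted from speech.

```python
# Rough stand-in for DBN feature learning + SVM classification.
# One BernoulliRBM layer plays the role of the DBN; data is simulated.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n, d = 120, 39  # 39 ~ 13 MFCCs plus deltas (an assumption)
X = np.vstack([rng.normal(0.0, 1.0, (n // 2, d)),   # e.g. "neutral"
               rng.normal(1.0, 1.0, (n // 2, d))])  # e.g. "angry"
y = np.array([0] * (n // 2) + [1] * (n // 2))

model = Pipeline([
    ("scale", MinMaxScaler()),  # BernoulliRBM expects inputs in [0, 1]
    ("rbm", BernoulliRBM(n_components=32, learning_rate=0.05,
                         n_iter=20, random_state=0)),
    ("svm", SVC(kernel="rbf")),
])
model.fit(X, y)
print(f"training accuracy: {model.score(X, y):.2f}")
```

The design point mirrored here is the paper's: learned (RBM/DBN) hidden activations, not the raw hand-crafted features, are what the SVM classifies.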

  9. When familiarity breeds accuracy: cultural exposure and facial emotion recognition.

    Science.gov (United States)

    Elfenbein, Hillary Anger; Ambady, Nalini

    2003-08-01

    Two studies provide evidence for the role of cultural familiarity in recognizing facial expressions of emotion. For Chinese located in China and the United States, Chinese Americans, and non-Asian Americans, accuracy and speed in judging Chinese and American emotions was greater with greater participant exposure to the group posing the expressions. Likewise, Tibetans residing in China and Africans residing in the United States were faster and more accurate when judging emotions expressed by host versus nonhost society members. These effects extended across generations of Chinese Americans, seemingly independent of ethnic or biological ties. Results suggest that the universal affect system governing emotional expression may be characterized by subtle differences in style across cultures, which become more familiar with greater cultural contact.

  10. Emotion recognition using eigenvalues and Levenberg–Marquardt ...

    Indian Academy of Sciences (India)

    Vilas H Gaidhane

    The robustness of the proposed approach is also tested on low-resolution facial ... The visual perception of human emotion is .... Kanade database and the experimental results are explained in .... After detailed study and analysis on image.

  11. General and specific responsiveness of the amygdala during explicit emotion recognition in females and males

    Directory of Open Access Journals (Sweden)

    Windischberger Christian

    2009-08-01

Full Text Available Abstract Background The ability to recognize emotions in facial expressions relies on an extensive neural network with the amygdala as the key node, as has typically been demonstrated for the processing of fearful stimuli. A sufficient characterization of the factors influencing and modulating amygdala function, however, has not yet been reached. Due to lacking or diverging results on its involvement in recognizing all or only certain negative emotions, the influence of gender or ethnicity is still under debate. This high-resolution fMRI study addresses some of the relevant parameters, such as emotional valence, gender and poser ethnicity, on amygdala activation during facial emotion recognition in 50 Caucasian subjects. Stimuli were color photographs of emotional Caucasian and African American faces. Results Bilateral amygdala activation was obtained to all emotional expressions (anger, disgust, fear, happy, and sad) and neutral faces across all subjects. However, only in males was a significant correlation of amygdala activation and behavioral response to fearful stimuli observed, indicating higher amygdala responses with better fear recognition, thus pointing to subtle gender differences. No significant influence of poser ethnicity on amygdala activation occurred, but analysis of recognition accuracy revealed a significant impact of poser ethnicity that was emotion-dependent. Conclusion Applying high-resolution fMRI while subjects were performing an explicit emotion recognition task revealed bilateral amygdala activation to all emotions presented and neutral expressions. This mechanism seems to operate similarly in healthy females and males and for both in-group and out-group ethnicities. Our results support the assumption that an intact amygdala response is fundamental in the processing of these salient stimuli due to its relevance-detection function.

  12. Memory evaluation in mild cognitive impairment using recall and recognition tests.

    Science.gov (United States)

    Bennett, Ilana J; Golob, Edward J; Parker, Elizabeth S; Starr, Arnold

    2006-11-01

    Amnestic mild cognitive impairment (MCI) is a selective episodic memory deficit that often indicates early Alzheimer's disease. Episodic memory function in MCI is typically defined by deficits in free recall, but can also be tested using recognition procedures. To assess both recall and recognition in MCI, MCI (n = 21) and older comparison (n = 30) groups completed the USC-Repeatable Episodic Memory Test. Subjects memorized two verbally presented 15-item lists. One list was used for three free recall trials, immediately followed by yes/no recognition. The second list was used for three-alternative forced-choice recognition. Relative to the comparison group, MCI had significantly fewer hits and more false alarms in yes/no recognition, and were less accurate in forced-choice recognition. Signal detection analysis showed that group differences were not due to response bias. Discriminant function analysis showed that yes/no recognition was a better predictor of group membership than free recall or forced-choice measures. MCI subjects recalled fewer items than comparison subjects, with no group differences in repetitions, intrusions, serial position effects, or measures of recall strategy (subjective organization, recall consistency). Performance deficits on free recall and recognition in MCI suggest a combination of both tests may be useful for defining episodic memory impairment associated with MCI and early Alzheimer's disease.
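The signal detection analysis mentioned above separates sensitivity from response bias; a standard way to do this for yes/no recognition is to compute d′ and the criterion c from z-transformed hit and false-alarm rates. A hedged sketch using only the Python standard library — the rate correction, list lengths, and example numbers are mine, not the study's data:

```python
from statistics import NormalDist

def sdt_indices(hit_rate, fa_rate, n_old, n_new):
    """Return (d', c): sensitivity and response bias for a yes/no recognition test."""
    # Clamp rates away from 0 and 1 so the z-transform stays finite.
    h = min(max(hit_rate, 0.5 / n_old), 1 - 0.5 / n_old)
    f = min(max(fa_rate, 0.5 / n_new), 1 - 0.5 / n_new)
    z = NormalDist().inv_cdf
    d_prime = z(h) - z(f)               # higher = better old/new discrimination
    criterion = -0.5 * (z(h) + z(f))    # 0 = unbiased; > 0 = conservative responding
    return d_prime, criterion

# Illustrative numbers only (15-item lists as in the abstract; rates are invented):
d_ctrl, c_ctrl = sdt_indices(0.80, 0.20, n_old=15, n_new=15)
d_mci, c_mci = sdt_indices(0.67, 0.33, n_old=15, n_new=15)
print(round(d_ctrl, 2), round(d_mci, 2))
```

Because both invented groups have symmetric hit and false-alarm rates, their criteria come out near zero while d′ differs — the pattern the abstract describes, where group differences reflect sensitivity rather than response bias.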

  13. The effect of unimodal affective priming on dichotic emotion recognition.

    Science.gov (United States)

    Voyer, Daniel; Myles, Daniel

    2017-11-15

    The present report concerns two experiments extending to unimodal priming the cross-modal priming effects observed with auditory emotions by Harding and Voyer [(2016). Laterality effects in cross-modal affective priming. Laterality: Asymmetries of Body, Brain and Cognition, 21, 585-605]. Experiment 1 used binaural targets to establish the presence of the priming effect and Experiment 2 used dichotically presented targets to examine auditory asymmetries. In Experiment 1, 82 university students completed a task in which binaural targets consisting of one of 4 English words inflected in one of 4 emotional tones were preceded by binaural primes consisting of one of 4 Mandarin words pronounced in the same (congruent) or different (incongruent) emotional tones. Trials where the prime emotion was congruent with the target emotion showed faster responses and higher accuracy in identifying the target emotion. In Experiment 2, 60 undergraduate students participated and the target was presented dichotically instead of binaurally. Primes congruent with the left ear produced a large left ear advantage, whereas right congruent primes produced a right ear advantage. These results indicate that unimodal priming produces stronger effects than those observed under cross-modal priming. The findings suggest that priming should likely be considered a strong top-down influence on laterality effects.

  14. Effects of Power on Mental Rotation and Emotion Recognition in Women.

    Science.gov (United States)

    Nissan, Tali; Shapira, Oren; Liberman, Nira

    2015-10-01

    Based on construal-level theory (CLT) and its view of power as an instance of social distance, we predicted that high, relative to low power would enhance women's mental-rotation performance and impede their emotion-recognition performance. The predicted effects of power emerged both when it was manipulated via a recall priming task (Study 1) and environmental cues (Studies 2 and 3). Studies 3 and 4 found evidence for mediation by construal level of the effect of power on emotion recognition but not on mental rotation. We discuss potential mediating mechanisms for these effects based on both the social distance/construal level and the approach/inhibition views of power. We also discuss implications for optimizing performance on mental rotation and emotion recognition in everyday life. © 2015 by the Society for Personality and Social Psychology, Inc.

  15. Normal mere exposure effect with impaired recognition in Alzheimer's disease

    OpenAIRE

    Willems, Sylvie; Adam, Stéphane; Van der Linden, Martial

    2002-01-01

We investigated the mere exposure effect and the explicit memory in Alzheimer's disease (AD) patients and elderly control subjects, using unfamiliar faces. During the exposure phase, the subjects estimated the age of briefly flashed faces. The mere exposure effect was examined by presenting pairs of faces (old and new) and asking participants to select the face they liked. The participants were then presented with a forced-choice explicit recognition task. Control subjects exhibited above-ch...

  16. Memory evaluation in mild cognitive impairment using recall and recognition tests

    OpenAIRE

    Bennett, IJ; Golob, EJ; Parker, ES; Starr, A

    2006-01-01

    Amnestic mild cognitive impairment (MCI) is a selective episodic memory deficit that often indicates early Alzheimer's disease. Episodic memory function in MCI is typically defined by deficits in free recall, but can also be tested using recognition procedures. To assess both recall and recognition in MCI, MCI (n = 21) and older comparison (n = 30) groups completed the USC-Repeatable Episodic Memory Test. Subjects memorized two verbally presented 15-item lists. One list was used for three fre...

  17. Emotion Index of Cover Song Music Video Clips based on Facial Expression Recognition

    DEFF Research Database (Denmark)

    Kavallakis, George; Vidakis, Nikolaos; Triantafyllidis, Georgios

    2017-01-01

This paper presents a scheme of creating an emotion index of cover song music video clips by recognizing and classifying facial expressions of the artist in the video. More specifically, it fuses effective and robust algorithms which are employed for expression recognition, along with the use of a neural network system using the features extracted by the SIFT algorithm. Also we support the need of this fusion of different expression recognition algorithms, because of the way that emotions are linked to facial expressions in music video clips.

  18. Gender differences in the recognition of emotional faces: are men less efficient?

    Directory of Open Access Journals (Sweden)

    Ana Ruiz-Ibáñez

    2017-06-01

Full Text Available As research on the recollection of stimuli with emotional valence indicates, emotions influence memory. Many studies of face and emotional facial expression recognition have focused on age-associated (young and old people) and gender-associated (men and women) differences. Nevertheless, this kind of study has produced contradictory results; therefore, gender involvement needs to be studied in greater depth. The main objective of our research was to analyze differences in the recognition of images of faces with emotional facial expressions between two groups of university students aged 18-30, the first composed of men and the second of women. The results showed statistically significant differences in corrected face recognition (hit rate minus false alarm rate): the women demonstrated better recognition than the men. However, other analyzed variables, such as time or efficiency, did not provide conclusive results. Furthermore, a significant negative correlation between the time used and efficiency in doing the task was found in the male group. This information reinforces not only the hypothesis of a gender difference in face recognition, in favor of women, but also those hypotheses that suggest different cognitive processing of facial stimuli in the two sexes. Finally, we argue the necessity of further research on variables such as age or sociocultural level.

  19. Instructions to mimic improve facial emotion recognition in people with sub-clinical autism traits.

    Science.gov (United States)

    Lewis, Michael B; Dunn, Emily

    2017-11-01

    People tend to mimic the facial expression of others. It has been suggested that this helps provide social glue between affiliated people but it could also aid recognition of emotions through embodied cognition. The degree of facial mimicry, however, varies between individuals and is limited in people with autism spectrum conditions (ASC). The present study sought to investigate the effect of promoting facial mimicry during a facial-emotion-recognition test. In two experiments, participants without an ASC diagnosis had their autism quotient (AQ) measured. Following a baseline test, they did an emotion-recognition test again but half of the participants were asked to mimic the target face they saw prior to making their responses. Mimicry improved emotion recognition, and further analysis revealed that the largest improvement was for participants who had higher scores on the autism traits. In fact, recognition performance was best overall for people who had high AQ scores but also received the instruction to mimic. Implications for people with ASC are explored.

  20. Prestimulus default mode activity influences depth of processing and recognition in an emotional memory task.

    Science.gov (United States)

    Soravia, Leila M; Witmer, Joëlle S; Schwab, Simon; Nakataki, Masahito; Dierks, Thomas; Wiest, Roland; Henke, Katharina; Federspiel, Andrea; Jann, Kay

    2016-03-01

    Low self-referential thoughts are associated with better concentration, which leads to deeper encoding and increases learning and subsequent retrieval. There is evidence that being engaged in externally rather than internally focused tasks is related to low neural activity in the default mode network (DMN) promoting open mind and the deep elaboration of new information. Thus, reduced DMN activity should lead to enhanced concentration, comprehensive stimulus evaluation including emotional categorization, deeper stimulus processing, and better long-term retention over one whole week. In this fMRI study, we investigated brain activation preceding and during incidental encoding of emotional pictures and on subsequent recognition performance. During fMRI, 24 subjects were exposed to 80 pictures of different emotional valence and subsequently asked to complete an online recognition task one week later. Results indicate that neural activity within the medial temporal lobes during encoding predicts subsequent memory performance. Moreover, a low activity of the default mode network preceding incidental encoding leads to slightly better recognition performance independent of the emotional perception of a picture. The findings indicate that the suppression of internally-oriented thoughts leads to a more comprehensive and thorough evaluation of a stimulus and its emotional valence. Reduced activation of the DMN prior to stimulus onset is associated with deeper encoding and enhanced consolidation and retrieval performance even one week later. Even small prestimulus lapses of attention influence consolidation and subsequent recognition performance. © 2015 Wiley Periodicals, Inc.

  1. Monetary incentives at retrieval promote recognition of involuntarily learned emotional information.

    Science.gov (United States)

    Yan, Chunping; Li, Yunyun; Zhang, Qin; Cui, Lixia

    2018-03-07

    Previous studies have suggested that the effects of reward on memory processes are affected by certain factors, but it remains unclear whether the effects of reward at retrieval on recognition processes are influenced by emotion. The event-related potential was used to investigate the combined effect of reward and emotion on memory retrieval and its neural mechanism. The behavioral results indicated that the reward at retrieval improved recognition performance under positive and negative emotional conditions. The event-related potential results indicated that there were significant interactions between the reward and emotion in the average amplitude during recognition, and the significant reward effects from the frontal to parietal brain areas appeared at 130-800 ms for positive pictures and at 190-800 ms for negative pictures, but there were no significant reward effects of neutral pictures; the reward effect of positive items appeared relatively earlier, starting at 130 ms, and that of negative pictures began at 190 ms. These results indicate that monetary incentives at retrieval promote recognition of involuntarily learned emotional information.

  2. Comparing Facial Emotional Recognition in Patients with Borderline Personality Disorder and Patients with Schizotypal Personality Disorder with a Normal Group

    Directory of Open Access Journals (Sweden)

    Aida Farsham

    2017-04-01

Full Text Available Objective: No research has been conducted on facial emotional recognition in patients with borderline personality disorder (BPD) and schizotypal personality disorder (SPD). The present study aimed at comparing facial emotion recognition in these patients with the general population. The neurocognitive processing of emotions can show the pathologic style of these 2 disorders. Method: Twenty BPD patients, 16 SPD patients, and 20 healthy individuals were selected by the available sampling method. The Structural Clinical Interview for Axis II, Millon Personality Inventory, Beck Depression Inventory and Facial Emotional Recognition Test were conducted for all participants. Discussion: The results of one-way ANOVA and Scheffe's post hoc test analysis revealed significant differences in the neuropsychological assessment of facial emotional recognition between BPD and SPD patients and the normal group (p = 0.001). A significant difference was found in emotion recognition of fear between the 2 groups of BPD and the normal population (p = 0.008). A significant difference was observed between SPD patients and the control group in emotion recognition of wonder (p = 0.04). The obtained results indicated a deficit in negative emotion recognition, especially the disgust emotion; thus, it can be concluded that these patients have the same neurocognitive profile in the emotion domain.

  3. Comparing Facial Emotional Recognition in Patients with Borderline Personality Disorder and Patients with Schizotypal Personality Disorder with a Normal Group.

    Science.gov (United States)

    Farsham, Aida; Abbaslou, Tahereh; Bidaki, Reza; Bozorg, Bonnie

    2017-04-01

Objective: No research has been conducted on facial emotional recognition in patients with borderline personality disorder (BPD) and schizotypal personality disorder (SPD). The present study aimed at comparing facial emotion recognition in these patients with the general population. The neurocognitive processing of emotions can show the pathologic style of these 2 disorders. Method: Twenty BPD patients, 16 SPD patients, and 20 healthy individuals were selected by the available sampling method. The Structural Clinical Interview for Axis II, Millon Personality Inventory, Beck Depression Inventory and Facial Emotional Recognition Test were conducted for all participants. Discussion: The results of one-way ANOVA and Scheffe's post hoc test analysis revealed significant differences in the neuropsychological assessment of facial emotional recognition between BPD and SPD patients and the normal group (p = 0.001). A significant difference was found in emotion recognition of fear between the 2 groups of BPD and the normal population (p = 0.008). A significant difference was observed between SPD patients and the control group in emotion recognition of wonder (p = 0.04). The obtained results indicated a deficit in negative emotion recognition, especially the disgust emotion; thus, it can be concluded that these patients have the same neurocognitive profile in the emotion domain.

  4. GENDER DIFFERENCES IN THE RECOGNITION OF FACIAL EXPRESSIONS OF EMOTION

    Directory of Open Access Journals (Sweden)

    CARLOS FELIPE PARDO-VÉLEZ

    2003-07-01

Full Text Available Gender differences in the recognition of facial expressions of anger, happiness and sadness were researched in students 18-25 years of age. A reaction time procedure was used, and the percentage of correct answers when recognizing was also measured. Though the work hypothesis expected gender differences in facial expression recognition, results suggest that these differences are not significant at a level of 0.05%. Statistical analysis shows a greater easiness (at a non-significant level) for women to recognize happiness expressions, and for men to recognize anger expressions. The implications of these data are discussed, as are possible extensions of this investigation in terms of sample size and college major of the participants.

  5. Emotion recognition techniques using physiological signals and video games -Systematic review-

    OpenAIRE

    Callejas-Cuervo, Mauro; Martínez-Tejada, Laura Alejandra; Alarcón-Aldana, Andrea Catherine

    2017-01-01

    Abstract Emotion recognition systems from physiological signals are innovative techniques that allow studying the behavior and reaction of an individual when exposed to information that may evoke emotional reactions through multimedia tools, for example, video games. This type of approach is used to identify the behavior of an individual in different fields, such as medicine, education, psychology, etc., in order to assess the effect that the content has on the individual that is interacting ...

  6. Mother's Happiness with Cognitive - Executive Functions and Facial Emotional Recognition in School Children with Down Syndrome.

    OpenAIRE

    Maryam Malmir; Maryam Seifenaraghi; Dariush D Farhud; G Ali Afrooz; Mohammad Khanahmadi

    2015-01-01

Background: According to the mother's key roles in bringing up emotional and cognitive abilities of mentally retarded children and respect to positive psychology in recent decades, this research is administered to assess the relation between mother's happiness level with cognitive- executive functions (i.e. attention, working memory, inhibition and planning) and facial emotional recognition ability as two factors in learning and adjustment skills in mentally retarded children with Down syndro...

  7. ReliefF-Based EEG Sensor Selection Methods for Emotion Recognition.

    Science.gov (United States)

    Zhang, Jianhai; Chen, Ming; Zhao, Shaokai; Hu, Sanqing; Shi, Zhiguo; Cao, Yu

    2016-09-22

    Electroencephalogram (EEG) signals recorded from sensor electrodes on the scalp can directly detect the brain dynamics in response to different emotional states. Emotion recognition from EEG signals has attracted broad attention, partly due to the rapid development of wearable computing and the needs of a more immersive human-computer interface (HCI) environment. To improve the recognition performance, multi-channel EEG signals are usually used. A large set of EEG sensor channels will add to the computational complexity and cause users inconvenience. ReliefF-based channel selection methods were systematically investigated for EEG-based emotion recognition on a database for emotion analysis using physiological signals (DEAP). Three strategies were employed to select the best channels in classifying four emotional states (joy, fear, sadness and relaxation). Furthermore, support vector machine (SVM) was used as a classifier to validate the performance of the channel selection results. The experimental results showed the effectiveness of our methods and the comparison with the similar strategies, based on the F-score, was given. Strategies to evaluate a channel as a unity gave better performance in channel reduction with an acceptable loss of accuracy. In the third strategy, after adjusting channels' weights according to their contribution to the classification accuracy, the number of channels was reduced to eight with a slight loss of accuracy (58.51% ± 10.05% versus the best classification accuracy 59.13% ± 11.00% using 19 channels). In addition, the study of selecting subject-independent channels, related to emotion processing, was also implemented. The sensors, selected subject-independently from frontal, parietal lobes, have been identified to provide more discriminative information associated with emotion processing, and are distributed symmetrically over the scalp, which is consistent with the existing literature. The results will make a contribution to the
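The ReliefF family scores each channel (feature) by contrasting its values at a sample's nearest same-class and nearest different-class neighbors. Below is a minimal numpy sketch of the basic Relief variant on synthetic data — a single nearest hit and miss per sample; the paper's actual ReliefF uses k neighbors over the DEAP EEG channels, and all names here are my own:

```python
import numpy as np

def relief_weights(X, y):
    """Basic Relief: reward features that differ at the nearest miss
    and agree at the nearest hit, averaged over all samples."""
    n, d = X.shape
    span = np.ptp(X, axis=0)                 # per-feature range for normalization
    span = np.where(span == 0, 1.0, span)
    Xn = X / span
    w = np.zeros(d)
    for i in range(n):
        diff = np.abs(Xn - Xn[i])            # per-feature distance to every sample
        dist = diff.sum(axis=1)              # Manhattan distance
        dist[i] = np.inf                     # exclude the sample itself
        same = y == y[i]
        hit = np.argmin(np.where(same, dist, np.inf))   # nearest same-class sample
        miss = np.argmin(np.where(~same, dist, np.inf)) # nearest other-class sample
        w += diff[miss] - diff[hit]
    return w / n

# Synthetic demo: feature 0 separates the classes, feature 1 is pure noise.
rng = np.random.default_rng(0)
y = np.repeat([0, 1], 20)
X = np.column_stack([y + 0.1 * rng.standard_normal(40),
                     rng.standard_normal(40)])
w = relief_weights(X, y)
print(w)
```

Channels (features) with low weights are candidates for removal, which is the spirit of the channel-reduction strategies the abstract describes.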

  8. ReliefF-Based EEG Sensor Selection Methods for Emotion Recognition

    Directory of Open Access Journals (Sweden)

    Jianhai Zhang

    2016-09-01

Full Text Available Electroencephalogram (EEG) signals recorded from sensor electrodes on the scalp can directly detect the brain dynamics in response to different emotional states. Emotion recognition from EEG signals has attracted broad attention, partly due to the rapid development of wearable computing and the needs of a more immersive human-computer interface (HCI) environment. To improve the recognition performance, multi-channel EEG signals are usually used. A large set of EEG sensor channels will add to the computational complexity and cause users inconvenience. ReliefF-based channel selection methods were systematically investigated for EEG-based emotion recognition on a database for emotion analysis using physiological signals (DEAP). Three strategies were employed to select the best channels in classifying four emotional states (joy, fear, sadness and relaxation). Furthermore, support vector machine (SVM) was used as a classifier to validate the performance of the channel selection results. The experimental results showed the effectiveness of our methods and the comparison with the similar strategies, based on the F-score, was given. Strategies to evaluate a channel as a unity gave better performance in channel reduction with an acceptable loss of accuracy. In the third strategy, after adjusting channels’ weights according to their contribution to the classification accuracy, the number of channels was reduced to eight with a slight loss of accuracy (58.51% ± 10.05% versus the best classification accuracy 59.13% ± 11.00% using 19 channels). In addition, the study of selecting subject-independent channels, related to emotion processing, was also implemented. The sensors, selected subject-independently from frontal, parietal lobes, have been identified to provide more discriminative information associated with emotion processing, and are distributed symmetrically over the scalp, which is consistent with the existing literature. The results will make a

  9. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    OpenAIRE

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybr...

  10. End-to-End Multimodal Emotion Recognition Using Deep Neural Networks

    Science.gov (United States)

    Tzirakis, Panagiotis; Trigeorgis, George; Nicolaou, Mihalis A.; Schuller, Bjorn W.; Zafeiriou, Stefanos

    2017-12-01

Automatic affect recognition is a challenging task due to the various modalities emotions can be expressed with. Applications can be found in many domains including multimedia retrieval and human computer interaction. In recent years, deep neural networks have been used with great success in determining emotional states. Inspired by this success, we propose an emotion recognition system using auditory and visual modalities. To capture the emotional content of various styles of speaking, robust features need to be extracted. To this purpose, we utilize a Convolutional Neural Network (CNN) to extract features from the speech, while for the visual modality we use a deep residual network (ResNet) of 50 layers. In addition to the importance of feature extraction, a machine learning algorithm also needs to be insensitive to outliers while being able to model the context. To tackle this problem, Long Short-Term Memory (LSTM) networks are utilized. The system is then trained in an end-to-end fashion where - by also taking advantage of the correlations of each of the streams - we manage to significantly outperform the traditional approaches based on auditory and visual handcrafted features for the prediction of spontaneous and natural emotions on the RECOLA database of the AVEC 2016 research challenge on emotion recognition.
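The recurrent stage of such a pipeline can be sketched in isolation: concatenate the per-frame audio and visual feature vectors and feed them through an LSTM. This is a toy numpy sketch of a single LSTM cell, not the authors' implementation — the dimensions, initialization, and names are all mine:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step; gate pre-activations stacked as [input, forget, cell, output]."""
    d = h.shape[0]
    z = W @ x + U @ h + b
    i, f = sigmoid(z[:d]), sigmoid(z[d:2 * d])
    g, o = np.tanh(z[2 * d:3 * d]), sigmoid(z[3 * d:])
    c_new = f * c + i * g                 # update the memory cell
    h_new = o * np.tanh(c_new)            # expose a squashed view of the cell state
    return h_new, c_new

# Toy fusion: an 8-dim audio feature and a 16-dim visual feature per frame.
rng = np.random.default_rng(1)
audio_dim, visual_dim, hidden = 8, 16, 5
W = 0.1 * rng.standard_normal((4 * hidden, audio_dim + visual_dim))
U = 0.1 * rng.standard_normal((4 * hidden, hidden))
b = np.zeros(4 * hidden)
h = np.zeros(hidden)
c = np.zeros(hidden)
for _ in range(10):                       # run over 10 fused frames
    x = np.concatenate([rng.standard_normal(audio_dim),
                        rng.standard_normal(visual_dim)])
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)
```

In a full system a CNN and a ResNet would supply the two feature vectors and the final hidden state would feed a regression or classification head; here the point is only the gating arithmetic and the concatenation-based fusion.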

  11. Mother's Happiness with Cognitive - Executive Functions and Facial Emotional Recognition in School Children with Down Syndrome.

    Science.gov (United States)

    Malmir, Maryam; Seifenaraghi, Maryam; Farhud, Dariush D; Afrooz, G Ali; Khanahmadi, Mohammad

    2015-05-01

According to the mother's key roles in bringing up the emotional and cognitive abilities of mentally retarded children, and with respect to positive psychology in recent decades, this research was administered to assess the relation between a mother's happiness level and cognitive-executive functions (i.e. attention, working memory, inhibition and planning) and facial emotional recognition ability, two factors in learning and adjustment skills, in mentally retarded children with Down syndrome. This study was an applied research and data were analyzed by the Pearson correlation procedure. The population included all school children with Down syndrome (9-12 yr) from Tehran, Iran. Overall, 30 children were selected as an available sample. After selection and agreement of parents, the Wechsler Intelligence Scale for Children-Revised (WISC-R) was performed to determine each student's IQ, and then mothers were invited to fill out the Oxford Happiness Inventory (OHI). Cognitive-executive functions were evaluated by the following tests: Continuous Performance Test (CPT), N-Back, Stroop test (day and night version) and Tower of London. The Ekman emotion facial expression test was also administered individually to assess facial emotional recognition in children with Down syndrome. Mother's happiness level had a significant positive relation with cognitive-executive functions (attention, working memory, inhibition and planning) and facial emotional recognition in her children with Down syndrome. Parents' happiness (especially mothers') is a powerful predictor of the cognitive and emotional abilities of their children.

  12. Multimodal emotional state recognition using sequence-dependent deep hierarchical features.

    Science.gov (United States)

    Barros, Pablo; Jirak, Doreen; Weber, Cornelius; Wermter, Stefan

    2015-12-01

    Emotional state recognition has become an important topic for human-robot interaction in the past years. By determining emotion expressions, robots can identify important variables of human behavior and use these to communicate in a more human-like fashion and thereby extend the interaction possibilities. Human emotions are multimodal and spontaneous, which makes them hard to be recognized by robots. Each modality has its own restrictions and constraints which, together with the non-structured behavior of spontaneous expressions, create several difficulties for the approaches present in the literature, which are based on several explicit feature extraction techniques and manual modality fusion. Our model uses a hierarchical feature representation to deal with spontaneous emotions, and learns how to integrate multiple modalities for non-verbal emotion recognition, making it suitable to be used in an HRI scenario. Our experiments show that a significant improvement of recognition accuracy is achieved when we use hierarchical features and multimodal information, and our model improves the accuracy of state-of-the-art approaches from 82.5% reported in the literature to 91.3% for a benchmark dataset on spontaneous emotion expressions. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.

  13. Impaired Word and Face Recognition in Older Adults with Type 2 Diabetes.

    Science.gov (United States)

    Jones, Nicola; Riby, Leigh M; Smith, Michael A

    2016-07-01

Older adults with type 2 diabetes mellitus (DM2) exhibit accelerated decline in some domains of cognition including verbal episodic memory. Few studies have investigated the influence of DM2 status in older adults on recognition memory for more complex stimuli such as faces. In the present study we sought to compare recognition memory performance for words, objects and faces under conditions of relatively low and high cognitive load. Healthy older adults with good glucoregulatory control (n = 13) and older adults with DM2 (n = 24) were administered recognition memory tasks in which stimuli (faces, objects and words) were presented under conditions of either i) low (stimulus presented without a background pattern) or ii) high (stimulus presented against a background pattern) cognitive load. In a subsequent recognition phase, the DM2 group recognized fewer faces than healthy controls. Further, the DM2 group exhibited word recognition deficits in the low cognitive load condition. The recognition memory impairment observed in patients with DM2 has clear implications for day-to-day functioning. Although these deficits were not amplified under conditions of increased cognitive load, the present study emphasizes that recognition memory impairment for both words and more complex stimuli such as faces is a feature of DM2 in older adults. Copyright © 2016 IMSS. Published by Elsevier Inc. All rights reserved.

  14. Early life stress impairs social recognition due to a blunted response of vasopressin release within the septum of adult male rats.

    Science.gov (United States)

    Lukas, Michael; Bredewold, Remco; Landgraf, Rainer; Neumann, Inga D; Veenema, Alexa H

    2011-07-01

    Early life stress poses a risk for the development of psychopathologies characterized by disturbed emotional, social, and cognitive performance. We used maternal separation (MS, 3h daily, postnatal days 1-14) to test whether early life stress impairs social recognition performance in juvenile (5-week-old) and adult (16-week-old) male Wistar rats. Social recognition was tested in the social discrimination test and defined by increased investigation by the experimental rat towards a novel rat compared with a previously encountered rat. Juvenile control and MS rats demonstrated successful social recognition at inter-exposure intervals of 30 and 60 min. However, unlike adult control rats, adult MS rats failed to discriminate between a previously encountered and a novel rat after 60 min. The social recognition impairment of adult MS rats was accompanied by a lack of a rise in arginine vasopressin (AVP) release within the lateral septum seen during social memory acquisition in adult control rats. This blunted response of septal AVP release was social stimulus-specific because forced swimming induced a rise in septal AVP release in both control and MS rats. Retrodialysis of AVP (1 μg/ml, 3.3 μl/min, 30 min) into the lateral septum during social memory acquisition restored social recognition in adult MS rats at the 60-min interval. These studies demonstrate that MS impairs social recognition performance in adult rats, which is likely caused by blunted septal AVP activation. Impaired social recognition may be linked to MS-induced changes in other social behaviors like aggression as shown previously. Copyright © 2010 Elsevier Ltd. All rights reserved.

  15. The primacy of perceiving: emotion recognition buffers negative effects of emotional labor

    NARCIS (Netherlands)

    Bechtoldt, M.N.; Rohrmann, S.; de Pater, I.E.; Beersma, B.

    2011-01-01

    There is ample empirical evidence for negative effects of emotional labor (surface acting and deep acting) on workers' well-being. This study analyzed to what extent workers' ability to recognize others' emotions may buffer these effects. In a 4-week study with 85 nurses and police officers, emotion

  16. Child's recognition of emotions in robot's face and body

    NARCIS (Netherlands)

    Cohen, I.; Looije, R.; Neerincx, M.A.

    2011-01-01

    Social robots can comfort and support children who have to cope with chronic diseases. In previous studies, a "facial robot", the iCat, proved to show well-recognized emotional expressions that are important in social interactions. The question is if a mobile robot without a face, the Nao, can

  17. Covert brand recognition engages emotion-specific brain networks.

    Science.gov (United States)

    Casarotto, Silvia; Ricciardi, Emiliano; Romani, Simona; Dalli, Daniele; Pietrini, Pietro

    2012-12-01

    Consumer goods' brands have become a major driver of consumers' choice: they have acquired symbolic, relational and even social properties that add substantial cultural and affective value to goods and services. Therefore, measuring the role of brands in consumers' cognitive and affective processes would be very helpful for better understanding economic decision making. This work aimed at finding the neural correlates of the automatic, spontaneous emotional response to brands, showing how deeply consumption symbols are integrated within the cognitive and affective processes of individuals. Functional magnetic resonance imaging (fMRI) was measured during a visual oddball paradigm consisting of the presentation of scrambled pictures as frequent stimuli, colored squares as targets, and brands and emotional pictures (selected from the International Affective Picture System [IAPS]) as emotionally salient distractors. Affective ratings of brands were assessed individually after scanning with a validated questionnaire. Results showed that, like IAPS pictures, brands activated a well-defined emotional network, including the amygdala and dorsolateral prefrontal cortex, that was highly specific to affective valence. In conclusion, this work identified the neural correlates of brands within the cognitive and affective processes of consumers.

  18. USE OF FACIAL EMOTION RECOGNITION IN E-LEARNING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Uğur Ayvaz

    2017-09-01

    Full Text Available Since personal computer usage and internet bandwidth are increasing, e-learning systems are also spreading widely. Although e-learning has advantages over formal learning in terms of information accessibility and flexibility of time and place, it does not provide enough face-to-face interactivity between an educator and learners. In this study, we propose a hybrid information system combining computer vision and machine learning technologies for visual and interactive e-learning systems. The proposed system detects the emotional states of learners from their facial expressions and gives the educator feedback about their instant and weighted emotional states. In this way, the educator is aware of the overall emotional state of the virtual classroom, and the system creates an interactive environment resembling formal learning. Several classification algorithms were applied to recognize instant emotional states; the best accuracy rates were obtained with the kNN and SVM algorithms.
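    The abstract names kNN (and SVM) as the best-performing classifiers but gives no implementation details. As a rough illustration of the kNN stage only, here is a minimal majority-vote classifier over facial-expression feature vectors; the feature values and emotion labels are made up for the example, and a real system would extract features from face images first.

```python
import numpy as np
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify one facial-expression feature vector by majority vote
    among its k nearest training samples (Euclidean distance)."""
    dists = np.linalg.norm(np.asarray(train_X, dtype=float)
                           - np.asarray(x, dtype=float), axis=1)
    nearest = np.argsort(dists)[:k]          # indices of the k closest samples
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]        # most frequent label wins
```

    In practice one would tune `k` on held-out data and compare against an SVM baseline, as the study reports doing.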

  19. Oxytocin improves facial emotion recognition in young adults with antisocial personality disorder.

    Science.gov (United States)

    Timmermann, Marion; Jeung, Haang; Schmitt, Ruth; Boll, Sabrina; Freitag, Christine M; Bertsch, Katja; Herpertz, Sabine C

    2017-11-01

    Deficient facial emotion recognition has been suggested to underlie aggression in individuals with antisocial personality disorder (ASPD). As the neuropeptide oxytocin (OT) has been shown to improve facial emotion recognition, it might also exert beneficial effects in individuals who cause serious harm to society. In a double-blind, randomized, placebo-controlled crossover trial, 22 individuals with ASPD and 29 healthy control (HC) subjects (matched for age, sex, intelligence, and education) were intranasally administered either OT (24 IU) or a placebo 45 min before participating in an emotion classification paradigm with fearful, angry, and happy faces. We assessed the number of correct classifications and reaction times as indicators of emotion recognition ability. Significant group×substance×emotion interactions were found in correct classifications and reaction times. Compared to HC, individuals with ASPD showed deficits in recognizing fearful and happy faces; these group differences were no longer observable under OT. Additionally, reaction times for angry faces differed significantly between the ASPD and HC groups in the placebo condition. This effect was mainly driven by longer reaction times in HC subjects after placebo administration compared to OT administration, whereas individuals with ASPD descriptively showed the opposite response pattern. Our data indicate an improvement in the recognition of fearful and happy facial expressions under OT in young adults with ASPD. The increased recognition of facial fear is particularly important, since the correct perception of distress signals in others is thought to inhibit aggression. Beneficial effects of OT might be further mediated by improved recognition of facial happiness, probably reflecting increased social reward responsiveness. Copyright © 2017. Published by Elsevier Ltd.

  20. Arousal rather than basic emotions influence long-term recognition memory in humans.

    Directory of Open Access Journals (Sweden)

    Artur Marchewka

    2016-10-01

    Full Text Available Emotion can influence various cognitive processes; however, its impact on memory has traditionally been studied over relatively short retention periods and in line with dimensional models of affect. The present study aimed to investigate emotional effects on long-term recognition memory within a combined framework of affective dimensions and basic emotions. Images selected from the Nencki Affective Picture System were rated on scales of affective dimensions and basic emotions. After six months, subjects took part in a surprise recognition test during an fMRI session. The more negative the pictures, the better they were remembered, but also the more false recognitions they provoked. Similar effects were found for the arousal dimension. Recognition success was grea