WorldWideScience

Sample records for recognized pure emotions

  1. Dogs recognize dog and human emotions.

    Science.gov (United States)

    Albuquerque, Natalia; Guo, Kun; Wilkinson, Anna; Savalli, Carine; Otta, Emma; Mills, Daniel

    2016-01-01

    The perception of emotional expressions allows animals to evaluate the social intentions and motivations of each other. This usually takes place within species; however, in the case of domestic dogs, it might be advantageous to recognize the emotions of humans as well as other dogs. In this sense, the combination of visual and auditory cues to categorize others' emotions facilitates the information processing and indicates high-level cognitive representations. Using a cross-modal preferential looking paradigm, we presented dogs with either human or dog faces with different emotional valences (happy/playful versus angry/aggressive) paired with a single vocalization from the same individual with either a positive or negative valence or Brownian noise. Dogs looked significantly longer at the face whose expression was congruent to the valence of vocalization, for both conspecifics and heterospecifics, an ability previously known only in humans. These results demonstrate that dogs can extract and integrate bimodal sensory emotional information, and discriminate between positive and negative emotions from both humans and dogs. © 2016 The Author(s).
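
    A congruence analysis of the kind implied above is often run by converting raw looking times into a per-trial congruence index and testing it against chance. The sketch below is a hypothetical illustration of that step only; the variable names, looking times, and the 0.5 chance level are assumptions, not values from the published study.

    ```python
    # Hypothetical sketch: congruence index for a preferential-looking design,
    # tested against the 0.5 chance level with a one-sample t-test.
    import numpy as np
    from scipy import stats

    # Assumed per-trial looking times in seconds (congruent vs. incongruent face).
    look_congruent = np.array([3.2, 2.8, 4.1, 3.5, 2.9, 3.8])
    look_incongruent = np.array([2.1, 2.5, 2.6, 2.0, 2.7, 2.2])

    # Congruence index: proportion of total looking time spent on the congruent face.
    congruence_index = look_congruent / (look_congruent + look_incongruent)

    # If subjects match voices to faces, the index should exceed 0.5 on average.
    t_stat, p_value = stats.ttest_1samp(congruence_index, popmean=0.5)
    print(f"mean index = {congruence_index.mean():.3f}, t = {t_stat:.2f}, p = {p_value:.4f}")
    ```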

  2. Recognizing Induced Emotions of Happiness and Sadness from Dance Movement

    Science.gov (United States)

    Van Dyck, Edith; Vansteenkiste, Pieter; Lenoir, Matthieu; Lesaffre, Micheline; Leman, Marc

    2014-01-01

    Recent research revealed that emotional content can be successfully decoded from human dance movement. Most previous studies made use of videos of actors or dancers portraying emotions through choreography. The current study applies emotion induction techniques and free movement in order to examine the recognition of emotional content from dance. Observers (N = 30) watched a set of silent videos showing depersonalized avatars of dancers moving to an emotionally neutral musical stimulus after emotions of either sadness or happiness had been induced. Each of the video clips consisted of two dance performances which were presented side-by-side and were played simultaneously; one of a dancer in the happy condition and one of the same individual in the sad condition. After every film clip, the observers were asked to make forced-choices concerning the emotional state of the dancer. Results revealed that observers were able to identify the emotional state of the dancers with a high degree of accuracy. Moreover, emotions were more often recognized for female dancers than for their male counterparts. In addition, the results of eye tracking measurements unveiled that observers primarily focus on movements of the chest when decoding emotional information from dance movement. The findings of our study show that not merely portrayed emotions, but also induced emotions can be successfully recognized from free dance movement. PMID:24587026

  3. Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals

    Directory of Open Access Journals (Sweden)

    Nasoz Fatma

    2004-01-01

    We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system that aims at recognizing its users' emotions and at responding to them accordingly depending upon the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions, and generalize their learning to recognize emotions from new collections of signals. We finally discuss possible broader impact and potential applications of emotion recognition for multimodal intelligent systems.
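
    As a rough illustration of the final step described above (categorizing physiological features with several supervised learners), the sketch below compares three off-the-shelf classifiers on synthetic data. The feature layout, the classifier choices, and the data are assumptions standing in for the authors' actual algorithms and signals.

    ```python
    # Minimal sketch: comparing supervised classifiers on simple statistical
    # features of physiological signals (GSR, heart rate, temperature).
    # Synthetic data; six emotion classes as in the abstract above.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    n_trials = 120
    # One row per elicitation trial: mean and std of GSR, heart rate, temperature.
    X = rng.normal(size=(n_trials, 6))
    y = rng.integers(0, 6, size=n_trials)  # sadness, anger, fear, surprise, frustration, amusement

    for clf in (KNeighborsClassifier(),
                LinearDiscriminantAnalysis(),
                MLPClassifier(max_iter=1000)):
        scores = cross_val_score(clf, X, y, cv=5)
        print(type(clf).__name__, round(scores.mean(), 3))
    ```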

  4. Expression intensity, gender and facial emotion recognition: Women recognize only subtle facial emotions better than men.

    Science.gov (United States)

    Hoffmann, Holger; Kessler, Henrik; Eppel, Tobias; Rukavina, Stefanie; Traue, Harald C

    2010-11-01

    Two experiments were conducted in order to investigate the effect of expression intensity on gender differences in the recognition of facial emotions. The first experiment compared recognition accuracy between female and male participants when emotional faces were shown with full-blown (100% emotional content) or subtle expressiveness (50%). In a second experiment more finely grained analyses were applied in order to measure recognition accuracy as a function of expression intensity (40%-100%). The results show that although women were more accurate than men in recognizing subtle facial displays of emotion, there was no difference between male and female participants when recognizing highly expressive stimuli. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Effects of Facial Expressions on Recognizing Emotions in Dance Movements

    Directory of Open Access Journals (Sweden)

    Nao Shikanai

    2011-10-01

    Effects of facial expressions on recognizing emotions expressed in dance movements were investigated. Dancers expressed three emotions: joy, sadness, and anger through dance movements. We used digital video cameras and a 3D motion capturing system to record and capture the movements. We then created full-video displays with an expressive face, full-video displays with an unexpressive face, stick figure displays (no face), or point-light displays (no face) from these data using 3D animation software. To make the point-light displays, 13 markers were attached to the body of each dancer. We examined how accurately observers were able to identify the expression that the dancers intended to create through their dance movements. Dance-experienced and inexperienced observers participated in the experiment. They watched the movements and rated the compatibility of each emotion with each movement on a 5-point Likert scale. The results indicated that both experienced and inexperienced observers could identify all the emotions that the dancers intended to express. Identification scores for dance movements with an expressive face were higher than for the other displays. This finding indicates that facial expressions affect the identification of emotions in dance movements, whereas bodily expressions alone provide sufficient information to recognize emotions.

  6. Recognizing emotions from EEG subbands using wavelet analysis.

    Science.gov (United States)

    Candra, Henry; Yuwono, Mitchell; Handojoseno, Ardi; Chai, Rifai; Su, Steven; Nguyen, Hung T

    2015-01-01

    Objectively recognizing emotions is a particularly important task to ensure that patients with emotional symptoms are given the appropriate treatments. The aim of this study was to develop an emotion recognition system using electroencephalogram (EEG) signals to identify four emotions: happy, sad, angry, and relaxed. We approached this objective by first investigating the relevant EEG frequency band and then deciding on the appropriate feature extraction method. Two features were considered, namely: 1. Wavelet Energy, and 2. Wavelet Entropy. EEG channel reduction was then implemented to reduce the complexity of the features. The ground truth emotional states of each subject were inferred using Russell's circumplex model of emotion, that is, by mapping the subjectively reported degrees of valence (pleasure) and arousal to the appropriate emotions - for example, an emotion with high valence and high arousal is equivalent to a 'happy' emotional state, while low valence and low arousal is equivalent to a 'sad' emotional state. A Support Vector Machine (SVM) classifier was then used to map each feature vector to the corresponding discrete emotion. The results presented in this study indicate that wavelet features extracted from the alpha, beta and gamma bands seem to provide the necessary information for describing the aforementioned emotions. Using the DEAP dataset (Dataset for Emotion Analysis using electroencephalogram, Physiological and Video Signals), our proposed method achieved an average sensitivity and specificity of 77.4% ± 14.1% and 69.1% ± 12.8%, respectively.
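
    The feature pipeline summarized above (wavelet energy and entropy per subband, followed by an SVM) can be sketched as follows. This is a minimal, self-contained illustration on synthetic single-channel data; the wavelet choice, decomposition level, and subband-to-frequency-band mapping are assumptions rather than the authors' exact settings.

    ```python
    # Sketch: wavelet energy/entropy features per subband + SVM classification.
    import numpy as np
    import pywt
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def wavelet_features(signal, wavelet="db4", level=4):
        # wavedec returns [cA_level, cD_level, ..., cD1]; each set of
        # coefficients roughly corresponds to one frequency subband.
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        feats = []
        for c in coeffs:
            energy = np.sum(c ** 2)
            p = c ** 2 / (energy + 1e-12)              # normalised energy distribution
            entropy = -np.sum(p * np.log2(p + 1e-12))  # wavelet entropy
            feats.extend([energy, entropy])
        return np.array(feats)

    rng = np.random.default_rng(1)
    eeg = rng.normal(size=(80, 512))        # 80 synthetic single-channel trials
    labels = rng.integers(0, 4, size=80)    # happy / sad / angry / relaxed
    X = np.vstack([wavelet_features(trial) for trial in eeg])
    print(cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean())
    ```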

  7. Do Preschoolers Recognize The Emotional Expressiveness of Colors in Realistic and Abstract Art Paintings?

    Science.gov (United States)

    Pouliou, Dimitra; Bonoti, Fotini; Nikonanou, Niki

    2018-01-01

    The present study was designed to examine preschoolers' ability to recognize the emotional expressiveness of an art painting through its colors. To attain this aim, 78 children aged 3-5 years were presented with realistic and abstract paintings conveying either happiness or sadness and were asked to choose those which matched the appropriate emotion. In total, 16 paintings were used, which varied in color while their subject matter was held as constant as possible; the paintings had previously been rated by a group of adults to ensure that they conveyed the two emotions under investigation. Results showed that children's ability to recognize the emotional expressiveness of a painting through its colors appears at 3 years of age and increases significantly at 4 and 5 years. It was also found that the mood of happiness was more easily recognized than that of sadness, while the style of the art paintings (realistic vs. abstract) did not affect children's ability to recognize emotions.

  8. Effect of Emotion and Personality on Deviation from Purely Rational Decision-Making

    OpenAIRE

    Fiori, M.; Lintas, A.; Mesrobian, S.; Villa, A. E. P.

    2013-01-01

    Human decision-making has consistently demonstrated deviation from "pure" rationality. Emotions are a primary driver of human actions and the current study investigates how perceived emotions and personality traits may affect decision-making during the Ultimatum Game (UG). We manipulated emotions by showing images with emotional connotation while participants decided how to split money with a second player. Event-related potentials (ERPs) from scalp electrodes were recorded during the whole d...

  9. Random Deep Belief Networks for Recognizing Emotions from Speech Signals.

    Science.gov (United States)

    Wen, Guihua; Li, Huihui; Huang, Jubing; Li, Danyang; Xun, Eryang

    2017-01-01

    Human emotions can now be recognized from speech signals using machine learning methods; however, these methods are challenged by low recognition accuracy in real applications due to their lack of rich representation ability. Deep belief networks (DBN) can automatically discover multiple levels of representation in speech signals. To make full use of these advantages, this paper presents an ensemble of random deep belief networks (RDBN) for speech emotion recognition. It first extracts the low-level features of the input speech signal and then uses them to construct many random subspaces. Each random subspace is then provided to a DBN to yield higher-level features, which serve as the input to a classifier that outputs an emotion label. The emotion labels output by all subspaces are then fused through majority voting to decide the final emotion label for the input speech signal. Experimental results on benchmark speech emotion databases show that RDBN achieves better accuracy than the compared methods for speech emotion recognition.
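
    The ensemble idea described above (many random feature subspaces, one deep network per subspace, majority voting over the predicted labels) can be outlined as below. For brevity an MLP stands in for the DBN, and the features and labels are synthetic; this is a sketch of the voting scheme, not the authors' implementation.

    ```python
    # Sketch of a random-subspace ensemble with majority voting over emotion labels.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 60))       # low-level speech features per utterance
    y = rng.integers(0, 5, size=200)     # five emotion labels
    X_new = rng.normal(size=(10, 60))    # utterances to classify

    predictions = []
    for _ in range(15):                  # 15 random subspaces
        idx = rng.choice(X.shape[1], size=20, replace=False)
        base = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000)
        base.fit(X[:, idx], y)           # each base learner sees only its subspace
        predictions.append(base.predict(X_new[:, idx]))

    # Majority vote across the ensemble decides the final emotion label.
    votes = np.vstack(predictions)
    final = np.array([np.bincount(col).argmax() for col in votes.T])
    print(final)
    ```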

  10. Random Deep Belief Networks for Recognizing Emotions from Speech Signals

    Directory of Open Access Journals (Sweden)

    Guihua Wen

    2017-01-01

    Human emotions can now be recognized from speech signals using machine learning methods; however, these methods are challenged by low recognition accuracy in real applications due to their lack of rich representation ability. Deep belief networks (DBN) can automatically discover multiple levels of representation in speech signals. To make full use of these advantages, this paper presents an ensemble of random deep belief networks (RDBN) for speech emotion recognition. It first extracts the low-level features of the input speech signal and then uses them to construct many random subspaces. Each random subspace is then provided to a DBN to yield higher-level features, which serve as the input to a classifier that outputs an emotion label. The emotion labels output by all subspaces are then fused through majority voting to decide the final emotion label for the input speech signal. Experimental results on benchmark speech emotion databases show that RDBN achieves better accuracy than the compared methods for speech emotion recognition.

  11. Older adults' decoding of emotions: age-related differences in interpreting dynamic emotional displays and the well-preserved ability to recognize happiness.

    Science.gov (United States)

    Moraitou, Despina; Papantoniou, Georgia; Gkinopoulos, Theofilos; Nigritinou, Magdalini

    2013-09-01

    Although the ability to recognize emotions through bodily and facial muscular movements is vital to everyday life, numerous studies have found that older adults are less adept at identifying emotions than younger adults. The message gleaned from research has been one of greater decline in abilities to recognize specific negative emotions than positive ones. At the same time, these results raise methodological issues with regard to different modalities in which emotion decoding is measured. The main aim of the present study is to identify the pattern of age differences in the ability to decode basic emotions from naturalistic visual emotional displays. The sample comprised a total of 208 adults from Greece, aged from 18 to 86 years. Participants were examined using the Emotion Evaluation Test, which is the first part of a broader audiovisual tool, The Awareness of Social Inference Test. The Emotion Evaluation Test was designed to examine a person's ability to identify six emotions and discriminate these from neutral expressions, as portrayed dynamically by professional actors. The findings indicate that decoding of basic emotions occurs along the broad affective dimension of uncertainty, and a basic step in emotion decoding involves recognizing whether information presented is emotional or not. Age was found to negatively affect the ability to decode basic negatively valenced emotions as well as pleasant surprise. Happiness decoding is the only ability that was found well-preserved with advancing age. The main conclusion drawn from the study is that the pattern in which emotion decoding from visual cues is affected by normal ageing depends on the rate of uncertainty, which either is related to decoding difficulties or is inherent to a specific emotion. © 2013 The Authors. Psychogeriatrics © 2013 Japanese Psychogeriatric Society.

  12. Recognizing emotional speech in Persian: a validated database of Persian emotional speech (Persian ESD).

    Science.gov (United States)

    Keshtiari, Niloofar; Kuhlmann, Michael; Eslami, Moharram; Klann-Delius, Gisela

    2015-03-01

    Research on emotional speech often requires valid stimuli for assessing perceived emotion through prosody and lexical content. To date, no comprehensive emotional speech database for Persian is officially available. The present article reports the process of designing, compiling, and evaluating a comprehensive emotional speech database for colloquial Persian. The database contains a set of 90 validated novel Persian sentences classified in five basic emotional categories (anger, disgust, fear, happiness, and sadness), as well as a neutral category. These sentences were validated in two experiments by a group of 1,126 native Persian speakers. The sentences were articulated by two native Persian speakers (one male, one female) in three conditions: (1) congruent (emotional lexical content articulated in a congruent emotional voice), (2) incongruent (neutral sentences articulated in an emotional voice), and (3) baseline (all emotional and neutral sentences articulated in neutral voice). The speech materials comprise about 470 sentences. The validity of the database was evaluated by a group of 34 native speakers in a perception test. Utterances recognized better than five times chance performance (71.4 %) were regarded as valid portrayals of the target emotions. Acoustic analysis of the valid emotional utterances revealed differences in pitch, intensity, and duration, attributes that may help listeners to correctly classify the intended emotion. The database is designed to be used as a reliable material source (for both text and speech) in future cross-cultural or cross-linguistic studies of emotional speech, and it is available for academic research purposes free of charge. To access the database, please contact the first author.
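
    The validation criterion quoted above (recognition better than five times chance, 71.4%) implies a seven-alternative response format, since 5 × 1/7 ≈ 0.714. The snippet below simply makes that filtering rule concrete; the item names and recognition rates are invented for illustration.

    ```python
    # Sketch of the "five times chance" validation filter.
    recognition_rates = {"anger_01": 0.91, "fear_07": 0.68, "sadness_12": 0.83}

    n_choices = 7
    threshold = 5 * (1 / n_choices)          # ≈ 0.714, matching the 71.4% quoted above
    valid = {item: r for item, r in recognition_rates.items() if r > threshold}
    print(f"threshold = {threshold:.3f}, valid items: {sorted(valid)}")
    ```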

  13. Recognizing facial expressions of emotion in infancy: A replication and extension.

    Science.gov (United States)

    Safar, Kristina; Moulson, Margaret C

    2017-05-01

    Infants may recognize facial expressions of emotion more readily when familiar faces express the emotions. Studies 1 and 2 investigated whether familiarity influences two metrics of emotion processing: Categorization and spontaneous preference. In Study 1 (n = 32), we replicated previous findings showing an asymmetrical pattern of categorization of happy and fearful faces in 6.5-month-old infants, and extended these findings by demonstrating that infants' categorization did not differ when emotions were expressed by familiar (i.e., caregiver) faces. In Study 2 (n = 34), we replicated the spontaneous preference for fearful over happy expressions in 6.5-month-old infants, and extended these findings by demonstrating that the spontaneous preference for fear was also present for familiar faces. Thus, infants' performance on two metrics of emotion processing did not differ depending on face familiarity. © 2017 Wiley Periodicals, Inc.

  14. Older adults' decoding of emotions: age-related differences in interpreting dynamic emotional displays and the well-preserved ability to recognize happiness

    OpenAIRE

    Moraitou, Despina; Papantoniou, Georgia; Gkinopoulos, Theofilos; Nigritinou, Magdalini

    2013-01-01

    Background: Although the ability to recognize emotions through bodily and facial muscular movements is vital to everyday life, numerous studies have found that older adults are less adept at identifying emotions than younger adults. The message gleaned from research has been one of greater decline for specific negative emotions than positive ones. At the same time, this raises methodological issues with regard to the different modalities in which emotion decoding i...

  15. Recognizing vocal emotions in Mandarin Chinese: a validated database of Chinese vocal emotional stimuli.

    Science.gov (United States)

    Liu, Pan; Pell, Marc D

    2012-12-01

    To establish a valid database of vocal emotional stimuli in Mandarin Chinese, a set of Chinese pseudosentences (i.e., semantically meaningless sentences that resembled real Chinese) were produced by four native Mandarin speakers to express seven emotional meanings: anger, disgust, fear, sadness, happiness, pleasant surprise, and neutrality. These expressions were identified by a group of native Mandarin listeners in a seven-alternative forced choice task, and items reaching a recognition rate of at least three times chance performance in the seven-choice task were selected as a valid database and then subjected to acoustic analysis. The results demonstrated expected variations in both perceptual and acoustic patterns of the seven vocal emotions in Mandarin. For instance, fear, anger, sadness, and neutrality were associated with relatively high recognition, whereas happiness, disgust, and pleasant surprise were recognized less accurately. Acoustically, anger and pleasant surprise exhibited relatively high mean f0 values and large variation in f0 and amplitude; in contrast, sadness, disgust, fear, and neutrality exhibited relatively low mean f0 values and small amplitude variations, and happiness exhibited a moderate mean f0 value and f0 variation. Emotional expressions varied systematically in speech rate and harmonics-to-noise ratio values as well. This validated database is available to the research community and will contribute to future studies of emotional prosody for a number of purposes. To access the database, please contact pan.liu@mail.mcgill.ca.
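
    The acoustic measures mentioned above (mean f0, f0 and amplitude variation, duration) can be approximated with standard tooling; the sketch below uses librosa on a placeholder WAV file. The filename and pitch range are assumptions, and the harmonics-to-noise ratio is omitted here since it is usually computed in Praat.

    ```python
    # Sketch: basic acoustic measurements for one emotional utterance.
    import numpy as np
    import librosa

    y, sr = librosa.load("emotion_clip.wav", sr=None)   # placeholder filename

    f0, voiced_flag, _ = librosa.pyin(y, fmin=75, fmax=500, sr=sr)
    mean_f0 = np.nanmean(f0)                 # mean pitch over voiced frames
    f0_sd = np.nanstd(f0)                    # pitch variability

    rms = librosa.feature.rms(y=y)[0]
    amp_sd = rms.std()                       # amplitude variation
    duration = librosa.get_duration(y=y, sr=sr)

    print(f"f0 = {mean_f0:.1f} Hz (SD {f0_sd:.1f}), amplitude SD = {amp_sd:.4f}, duration = {duration:.2f} s")
    ```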

  16. Selecting pure-emotion materials from the International Affective Picture System (IAPS) by Chinese university students: A study based on intensity-ratings only

    Directory of Open Access Journals (Sweden)

    Zhicha Xu

    2017-08-01

    There is a need to use selected pictures with pure emotion as stimulation or treatment media for basic and clinical research. Pictures from the widely-used International Affective Picture System (IAPS) contain rich emotions, but to date no study has clearly established that an emotion is exclusively expressed in its putative IAPS picture. We hypothesize that the IAPS images contain at least pure vectors of disgust, erotism (or erotica), fear, happiness, sadness and neutral emotions. Accordingly, we selected 108 IAPS images, each with a specific emotion, and invited 219 male and 274 female university students to rate only the intensity of the emotion conveyed in each picture. Their answers were analyzed using exploratory and confirmatory factor analysis. Four first-order factors manifested as disgust-fear, happiness-sadness, erotism, and neutral. Subsequently, ten second-order sub-factors manifested as mutilation-disgust, vomit-disgust, food-disgust, violence-fear, happiness, sadness, couple-erotism, female-erotism, male-erotism, and neutral. Fifty-nine pictures for the ten sub-factors, which had established good model-fit indices, satisfactory sub-factor internal reliabilities, and prominent gender differences in the picture intensity ratings, were ultimately retained. We thus selected a series of pure-emotion IAPS pictures, which together displayed both satisfactorily convergent and discriminant structure validities. We did not intend to evaluate all IAPS items, but instead selected some pictures conveying pure emotions, which might help both basic and clinical research in the future.
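
    The exploratory step described above (extracting a small number of latent factors from a raters-by-pictures matrix of intensity ratings) might look like the sketch below. The ratings matrix is simulated and the four-factor solution is only meant to mirror the first-order structure reported; it is not the authors' analysis.

    ```python
    # Sketch: exploratory factor analysis of picture intensity ratings.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(3)
    # 493 raters (219 male + 274 female) x 108 pictures, simulated ratings.
    ratings = rng.integers(1, 10, size=(493, 108)).astype(float)

    efa = FactorAnalysis(n_components=4, rotation="varimax")
    scores = efa.fit_transform(ratings)      # factor scores per rater
    loadings = efa.components_.T             # picture loadings on the 4 factors

    # Pictures loading most strongly on each factor suggest candidate
    # "pure emotion" groupings to be confirmed with a CFA.
    for k in range(4):
        top = np.argsort(np.abs(loadings[:, k]))[::-1][:5]
        print(f"factor {k}: top pictures {top.tolist()}")
    ```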

  17. Racism and Psychological and Emotional Injury: Recognizing and Assessing Race-Based Traumatic Stress

    Science.gov (United States)

    Carter, Robert T.

    2007-01-01

    The purpose of this article is to discuss the psychological and emotional effects of racism on people of Color. Psychological models and research on racism, discrimination, stress, and trauma will be integrated to promote a model to be used to understand, recognize, and assess race-based traumatic stress to aid counseling and psychological…

  18. The voice of emotion across species: how do human listeners recognize animals' affective states?

    Directory of Open Access Journals (Sweden)

    Marina Scheumann

    Voice-induced cross-taxa emotional recognition is the ability to understand the emotional state of another species based on its voice. In the past, induced affective states, experience-dependent higher cognitive processes or cross-taxa universal acoustic coding and processing mechanisms have been discussed to underlie this ability in humans. The present study sets out to distinguish the influence of familiarity and phylogeny on voice-induced cross-taxa emotional perception in humans. For the first time, two perspectives are taken into account: the self-perspective (i.e. the emotional valence induced in the listener) versus the others-perspective (i.e. correct recognition of the emotional valence of the recording context). Twenty-eight male participants listened to 192 vocalizations of four different species (human infant, dog, chimpanzee and tree shrew). Stimuli were recorded either in an agonistic (negative emotional valence) or affiliative (positive emotional valence) context. Participants rated the emotional valence of the stimuli adopting the self- and others-perspective by using a 5-point version of the Self-Assessment Manikin (SAM). Familiarity was assessed based on subjective rating, objective labelling of the respective stimuli and interaction time with the respective species. Participants reliably recognized the emotional valence of human voices, whereas the results for animal voices were mixed. The correct classification of animal voices depended on the listener's familiarity with the species and the call type/recording context, whereas there was less influence of induced emotional states and phylogeny. Our results provide first evidence that explicit voice-induced cross-taxa emotional recognition in humans is shaped more by experience-dependent cognitive mechanisms than by induced affective states or cross-taxa universal acoustic coding and processing mechanisms.

  19. RECOGNIZING INFANTS' EMOTIONAL EXPRESSIONS: ARE ADOLESCENTS LESS SENSITIVE TO INFANTS' CUES?

    Science.gov (United States)

    Niessen, Anke; Konrad, Kerstin; Dahmen, Brigitte; Herpertz-Dahlmann, Beate; Firk, Christine

    2017-07-01

    Previous studies have shown that adolescent mothers interact less sensitively with their infants than do adult mothers. This difference might be due to developmental difficulties in the recognition of infants' emotional states in adolescents. Therefore, the aim of the current study was to explore differences in the recognition of infant signals between nonparous adolescent girls and boys as compared to female and male adults. To this end, we examined 54 childless adolescents and 54 childless adults (50% female). Participants were shown a series of 20 short videos of infants aged 3 to 6 months presenting different emotional states ranging from very distressed to very happy. In addition, participants were asked to report their own parental experiences using the German version, Fragebogen zum erinnerten elterlichen Erziehungsverhalten (J. Schumacher, M. Eisemann, & E. Brähler), of the Egna Minnen Beträffande Uppfostran (Own Memories of Parental Rearing Experiences in Childhood; C. Perris, L. Jacobsson, H. Lindstrom, L. von Knorring, & H. Perris). Adolescents rated distressed infants as more distressed than did the adults. Furthermore, female participants rated the very distressed infants as more distressed than did male participants. These data suggest that adolescents, in general, are not impaired in recognizing infant emotional states as compared to adults. Thus, we suggest that more extreme ratings of infant signals of discomfort, together with immature sociocognitive regulation processes during adolescence, might contribute to the reduced sensitivity observed in adolescent mothers. © 2017 Michigan Association for Infant Mental Health.

  20. Emotion Locomotion: Promoting the Emotional Health of Elementary School Children by Recognizing Emotions

    Science.gov (United States)

    McLachlan, Debra A.; Burgos, Teresa; Honeycutt, Holly K.; Linam, Eve H.; Moneymaker, Laura D.; Rathke, Meghan K.

    2009-01-01

    Emotion recognition is a critical life skill children need for mental health promotion to meet the complexities and challenges of growing up in the world today. Five nursing students and their instructor designed "Emotion Locomotion," a program for children ages 6-8 during a public health nursing practicum for an inner-city parochial school.…

  1. Understanding Mixed Emotions: Paradigms and Measures

    Science.gov (United States)

    Kreibig, Sylvia D.; Gross, James J.

    2017-01-01

    In this review, we examine the paradigms and measures available for experimentally studying mixed emotions in the laboratory. For eliciting mixed emotions, we describe a mixed emotions film library that allows for the repeated elicitation of a specific homogeneous mixed emotional state and appropriately matched pure positive, pure negative, and neutral emotional states. For assessing mixed emotions, we consider subjective and objective measures that fall into univariate, bivariate, and multivariate measurement categories. As paradigms and measures for objectively studying mixed emotions are still in their early stages, we conclude by outlining future directions that focus on the reliability, temporal dynamics, and response coherence of mixed emotions paradigms and measures. This research will build a strong foundation for future studies and significantly advance our understanding of mixed emotions. PMID:28804752

  2. Wordsworthian Emotion

    Institute of Scientific and Technical Information of China (English)

    张敏

    2010-01-01

    As a great poet of British Romanticism, Wordsworth is not the practitioner of an artistic craft designed to satisfy the "taste" of a literary connoisseur. He is, instead, "a man speaking to men" with his uniqueness in emotion. This paper attempts to demonstrate how Wordsworth conveys emotion with poetic language. Wordsworthian "emotion recollected in tranquility" is simple, pure and genuine, which is the true art in Wordsworth's poems.

  3. A new look at emotion perception: Concepts speed and shape facial emotion recognition.

    Science.gov (United States)

    Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil

    2015-10-01

    Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels (which correspond to discrete emotion concepts) affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia (who have difficulty labeling their own emotions) struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology. (c) 2015 APA, all rights reserved.
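
    The sensitivity measure implied by the signal-detection analysis above is typically d'. The helper below shows one standard way to compute it from hit and false-alarm counts; the counts for the labelled and unlabelled conditions are invented for illustration.

    ```python
    # Sketch: d' (sensitivity) from hits/misses/false alarms/correct rejections.
    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        # Log-linear correction keeps the rates away from 0 and 1.
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    print("with emotion labels:   ", round(d_prime(46, 4, 8, 42), 2))
    print("without emotion labels:", round(d_prime(38, 12, 14, 36), 2))
    ```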

  4. Facial Emotion Recognition and Expression in Parkinson's Disease: An Emotional Mirror Mechanism?

    Science.gov (United States)

    Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J; Kilner, James

    2017-01-01

    Parkinson's disease (PD) patients have impairment of facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants, in order to explore the relationship between these two abilities and any differences between the two groups of participants. Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (Emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer screen in pseudo-random fashion and to identify the emotional label in a six-forced-choice response format (Emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial the participant was asked to rate his/her confidence in his/her perceived accuracy of response. For emotion recognition, PD patients reported lower scores than HC for the Ekman total score and for the emotion sub-scores happiness, fear, anger and sadness … In the facial emotion expressivity task, PD and HC significantly differed in the total score (p = 0.05) and in the sub-scores for happiness, sadness and anger … There was a significant positive correlation between facial emotion recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). PD patients showed difficulties in recognizing emotional …

  5. Facial emotion recognition in male antisocial personality disorders with or without adult attention deficit hyperactivity disorder.

    Science.gov (United States)

    Bagcioglu, Erman; Isikli, Hasmet; Demirel, Husrev; Sahin, Esat; Kandemir, Eyup; Dursun, Pinar; Yuksek, Erhan; Emul, Murat

    2014-07-01

    We aimed to investigate facial emotion recognition abilities in violent individuals with antisocial personality disorder who have comorbid attention deficit hyperactivity disorder (ADHD) or not. Photographs of happy, surprised, fearful, sad, angry, disgusted, and neutral facial expressions and the Wender Utah Rating Scale were administered to all groups. The mean ages were as follows: antisocial personality disorder with ADHD 22.0 ± 1.59, pure antisocial individuals 21.90 ± 1.80, and controls 22.97 ± 2.85 (p>0.05). The mean score on the Wender Utah Rating Scale differed significantly between groups … Recognition accuracy did not differ significantly between groups (p>0.05), excluding disgust faces, which was significantly impaired in the ASPD+ADHD and pure ASPD groups. Antisocial individuals with attention deficit and hyperactivity spent significantly more time on each facial emotion than healthy controls … pure antisocial individuals had more time to recognize disgust and neutral faces than healthy controls … antisocial individuals and antisocial individuals with ADHD. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Emotional availability, understanding emotions, and recognition of facial emotions in obese mothers with young children.

    Science.gov (United States)

    Bergmann, Sarah; von Klitzing, Kai; Keitel-Korndörfer, Anja; Wendt, Verena; Grube, Matthias; Herpertz, Sarah; Schütz, Astrid; Klein, Annette M

    2016-01-01

    Recent research has identified mother-child relationships of low quality as possible risk factors for childhood obesity. However, it remains open how mothers' own obesity influences the quality of mother-child interaction, and particularly emotional availability (EA). Also unclear is the influence of maternal emotional competencies, i.e. understanding emotions and recognizing facial emotions. This study aimed to (1) investigate differences between obese and normal-weight mothers regarding mother-child EA, maternal understanding emotions and recognition of facial emotions, and (2) explore how maternal emotional competencies and maternal weight interact with each other in predicting EA. A better understanding of these associations could inform strategies of obesity prevention especially in children at risk. We assessed EA, understanding emotions and recognition of facial emotions in 73 obese versus 73 normal-weight mothers, and their children aged 6 to 47 months (M child age = 24.49 months, 80 females). Obese mothers showed lower EA and understanding emotions. Mothers' normal weight and their ability to understand emotions were positively associated with EA. The ability to recognize facial emotions was positively associated with EA in obese but not in normal-weight mothers. Maternal weight status indirectly influenced EA through its effect on understanding emotions. Maternal emotional competencies may play an important role for establishing high EA in interaction with the child. Children of obese mothers experience lower EA, which may contribute to overweight development. We suggest including elements that aim to improve maternal emotional competencies and mother-child EA in prevention or intervention programmes targeting childhood obesity. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Parents’ Emotion-Related Beliefs, Behaviors, and Skills Predict Children's Recognition of Emotion

    Science.gov (United States)

    Castro, Vanessa L.; Halberstadt, Amy G.; Lozada, Fantasy T.; Craig, Ashley B.

    2015-01-01

    Children who are able to recognize others’ emotions are successful in a variety of socioemotional domains, yet we know little about how school-aged children's abilities develop, particularly in the family context. We hypothesized that children develop emotion recognition skill as a function of parents’ own emotion-related beliefs, behaviors, and skills. We examined parents’ beliefs about the value of emotion and guidance of children's emotion, parents’ emotion labeling and teaching behaviors, and parents’ skill in recognizing children's emotions in relation to their school-aged children's emotion recognition skills. Sixty-nine parent-child dyads completed questionnaires, participated in dyadic laboratory tasks, and identified their own emotions and emotions felt by the other participant from videotaped segments. Regression analyses indicate that parents’ beliefs, behaviors, and skills together account for 37% of the variance in child emotion recognition ability, even after controlling for parent and child expressive clarity. The findings suggest the importance of the family milieu in the development of children's emotion recognition skill in middle childhood, and add to accumulating evidence suggesting important age-related shifts in the relation between parental emotion socialization and child emotional development. PMID:26005393
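
    The regression reported above (parent predictors explaining additional variance in child emotion recognition after controlling for expressive clarity) is essentially a hierarchical regression. The sketch below shows that structure with simulated data; the column names are assumptions, not the study's actual variables.

    ```python
    # Sketch: hierarchical regression, controls first, then parent predictors.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    df = pd.DataFrame(
        rng.normal(size=(69, 6)),
        columns=["child_recognition", "expressive_clarity",
                 "parent_beliefs", "parent_labeling", "parent_teaching", "parent_skill"],
    )

    step1 = smf.ols("child_recognition ~ expressive_clarity", data=df).fit()
    step2 = smf.ols("child_recognition ~ expressive_clarity + parent_beliefs"
                    " + parent_labeling + parent_teaching + parent_skill", data=df).fit()

    # Increase in explained variance attributable to the parent predictors.
    print(f"delta R^2 = {step2.rsquared - step1.rsquared:.3f}")
    ```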

  8. Aging and emotional expressions: is there a positivity bias during dynamic emotion recognition?

    OpenAIRE

    Di Domenico, Alberto; Palumbo, Rocco; Mammarella, Nicola; Fairfield, Beth

    2015-01-01

    In this study, we investigated whether age-related differences in emotion regulation priorities influence online dynamic emotional facial discrimination. A group of 40 younger and a group of 40 older adults were invited to recognize a positive or negative expression as soon as the expression slowly emerged and subsequently rate it in terms of intensity. Our findings show that older adults recognized happy expressions faster than angry ones, while the direction of emotional expression does not...

  9. The automaticity of emotion recognition.

    Science.gov (United States)

    Tracy, Jessica L; Robins, Richard W

    2008-02-01

    Evolutionary accounts of emotion typically assume that humans evolved to quickly and efficiently recognize emotion expressions because these expressions convey fitness-enhancing messages. The present research tested this assumption in 2 studies. Specifically, the authors examined (a) how quickly perceivers could recognize expressions of anger, contempt, disgust, embarrassment, fear, happiness, pride, sadness, shame, and surprise; (b) whether accuracy is improved when perceivers deliberate about each expression's meaning (vs. respond as quickly as possible); and (c) whether accurate recognition can occur under cognitive load. Across both studies, perceivers quickly and efficiently (i.e., under cognitive load) recognized most emotion expressions, including the self-conscious emotions of pride, embarrassment, and shame. Deliberation improved accuracy in some cases, but these improvements were relatively small. Discussion focuses on the implications of these findings for the cognitive processes underlying emotion recognition.

  10. Facial Emotion Recognition and Expression in Parkinson’s Disease: An Emotional Mirror Mechanism?

    Science.gov (United States)

    Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J.; Kilner, James

    2017-01-01

    Background and aim: Parkinson's disease (PD) patients have impairment of facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants, in order to explore the relationship between these two abilities and any differences between the two groups of participants. Methods: Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (Emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer screen in pseudo-random fashion and to identify the emotional label in a six-forced-choice response format (Emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial the participant was asked to rate his/her confidence in his/her perceived accuracy of response. Results: For emotion recognition, PD patients reported lower scores than HC for the Ekman total score and for the emotion sub-scores happiness, fear, anger and sadness … In the facial emotion expressivity task, PD and HC significantly differed in the total score (p = 0.05) and in the sub-scores for happiness, sadness and anger … There was a significant positive correlation between facial emotion recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). Conclusions: PD …

  11. Comparison of emotion recognition from facial expression and music.

    Science.gov (United States)

    Gaspar, Tina; Labor, Marina; Jurić, Iva; Dumancić, Dijana; Ilakovac, Vesna; Heffer, Marija

    2011-01-01

    The recognition of basic emotions in everyday communication involves interpretation of different visual and auditory clues. The ability to recognize emotions is not clearly determined as their presentation is usually very short (micro expressions), whereas the recognition itself does not have to be a conscious process. We assumed that the recognition from facial expressions is selected over the recognition of emotions communicated through music. In order to compare the success rate in recognizing emotions presented as facial expressions or in classical music works we conducted a survey which included 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music works with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, whereas girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized if presented on human faces than in music, possibly because the understanding of facial emotions is one of the oldest communication skills in human society. Female advantage in emotion recognition was selected due to the necessity of their communication with the newborns during early development. The proficiency in recognizing emotional content of music and mathematical skills probably share some general cognitive skills like attention, memory and motivation. Music pieces were differently processed in brain than facial expressions and consequently, probably differently evaluated as relevant emotional clues.

  12. Using Alba Emoting™ to work with emotions in psychotherapy.

    Science.gov (United States)

    Kalawski, Juan Pablo

    2013-01-01

    Alba Emoting™ is a physical method to help recognize, induce, express and regulate the basic emotions. This is achieved through specific breathing, postural and facial behaviours. Alba Emoting is based on psychophysiological research by Susana Bloch and her collaborators, who have applied this method mainly to train actors. Alba Emoting can be used in psychotherapy to facilitate emotion awareness, regulation and transformation. It can also help therapists better recognize their own and their clients' emotions. The application of Alba Emoting in psychotherapy is illustrated with a case example. Alba Emoting is a physical, scientific method for working with emotions. Alba Emoting can help therapists better recognize their own and their clients' emotions. Alba Emoting can help clients achieve better emotional awareness and regulation. Alba Emoting can also help clients experience and express emotions they may normally inhibit. Copyright © 2011 John Wiley & Sons, Ltd.

  13. Aging and emotional expressions: is there a positivity bias during dynamic emotion recognition?

    Directory of Open Access Journals (Sweden)

    Alberto eDi Domenico

    2015-08-01

    In this study, we investigated whether age-related differences in emotion regulation priorities influence online dynamic emotional facial discrimination. A group of 40 younger and a group of 40 older adults were invited to recognize a positive or negative expression as soon as the expression slowly emerged and subsequently rate it in terms of intensity. Our findings show that older adults recognized happy expressions faster than angry ones, while the direction of emotional expression does not seem to affect younger adults’ performance. Furthermore, older adults rated both negative and positive emotional faces as more intense compared to younger controls. This study detects age-related differences with a dynamic online paradigm and suggests that different regulation strategies may shape emotional face recognition.

  14. The neural fate of neutral information in emotion-enhanced memory.

    Science.gov (United States)

    Watts, Sarah; Buratto, Luciano G; Brotherhood, Emilie V; Barnacle, Gemma E; Schaefer, Alexandre

    2014-07-01

    In this study, we report evidence that neural activity reflecting the encoding of emotionally neutral information in memory is reduced when neutral and emotional stimuli are intermixed during encoding. Specifically, participants studied emotional and neutral pictures organized in mixed lists (in which emotional and neutral pictures were intermixed) or in pure lists (only-neutral or only-emotional pictures) and performed a recall test. To estimate encoding efficiency, we used the Dm effect, measured with event-related potentials. Recall for neutral items was lower in mixed compared to pure lists and posterior Dm activity for neutral items was reduced in mixed lists, whereas it remained robust in pure lists. These findings might be caused by an asymmetrical competition for attentional and working memory resources between emotional and neutral information, which could be a major determinant of emotional memory effects. Copyright © 2014 Society for Psychophysiological Research.

  15. Recognizing Emotions: Testing an Intervention for Children with Autism Spectrum Disorders

    Science.gov (United States)

    Richard, Donna Abely; More, William; Joy, Stephen P.

    2015-01-01

    A severely impaired capacity for social interaction is one of the characteristics of individuals with autism spectrum disorder (ASD). Deficits in facial emotional recognition processing may be associated with this limitation. The Build-a-Face (BAF) art therapy intervention was developed to assist with emotional recognition through the viewing and…

  16. Emotion categorization of body expressions in narrative scenarios.

    Science.gov (United States)

    Volkova, Ekaterina P; Mohler, Betty J; Dodds, Trevor J; Tesch, Joachim; Bülthoff, Heinrich H

    2014-01-01

    Humans can recognize emotions expressed through body motion with high accuracy even when the stimuli are impoverished. However, most of the research on body motion has relied on exaggerated displays of emotions. In this paper we present two experiments where we investigated whether emotional body expressions could be recognized when they were recorded during natural narration. Our actors were free to use their entire body, face, and voice to express emotions, but our resulting visual stimuli used only the upper body motion trajectories in the form of animated stick figures. Observers were asked to perform an emotion recognition task on short motion sequences using a large and balanced set of emotions (amusement, joy, pride, relief, surprise, anger, disgust, fear, sadness, shame, and neutral). Even with only upper body motion available, our results show recognition accuracy significantly above chance level and high consistency rates among observers. In our first experiment, that used more classic emotion induction setup, all emotions were well recognized. In the second study that employed narrations, four basic emotion categories (joy, anger, fear, and sadness), three non-basic emotion categories (amusement, pride, and shame) and the "neutral" category were recognized above chance. Interestingly, especially in the second experiment, observers showed a bias toward anger when recognizing the motion sequences for emotions. We discovered that similarities between motion sequences across the emotions along such properties as mean motion speed, number of peaks in the motion trajectory and mean motion span can explain a large percent of the variation in observers' responses. Overall, our results show that upper body motion is informative for emotion recognition in narrative scenarios.
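
    The motion descriptors named above (mean motion speed, number of peaks in the motion trajectory, motion span) can be computed directly from marker trajectories. The sketch below does this for a single simulated upper-body marker; the frame rate, peak-prominence threshold, and trajectory are assumptions.

    ```python
    # Sketch: simple kinematic descriptors from a 3D marker trajectory.
    import numpy as np
    from scipy.signal import find_peaks

    rng = np.random.default_rng(5)
    fps = 30.0
    trajectory = np.cumsum(rng.normal(scale=0.01, size=(300, 3)), axis=0)  # x, y, z per frame

    velocity = np.diff(trajectory, axis=0) * fps       # per-frame velocity in m/s
    speed = np.linalg.norm(velocity, axis=1)           # scalar speed

    mean_speed = speed.mean()
    peaks, _ = find_peaks(speed, prominence=speed.std())
    motion_span = np.ptp(trajectory, axis=0).max()     # largest spatial extent

    print(f"mean speed {mean_speed:.3f} m/s, peaks {len(peaks)}, span {motion_span:.3f} m")
    ```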

  17. The primacy of perceiving: emotion recognition buffers negative effects of emotional labor

    NARCIS (Netherlands)

    Bechtoldt, M.N.; Rohrmann, S.; de Pater, I.E.; Beersma, B.

    2011-01-01

    There is ample empirical evidence for negative effects of emotional labor (surface acting and deep acting) on workers' well-being. This study analyzed to what extent workers' ability to recognize others' emotions may buffer these effects. In a 4-week study with 85 nurses and police officers, emotion

  18. Negative emotions in cancer care: do oncologists' responses depend on severity and type of emotion?

    Science.gov (United States)

    Kennifer, Sarah L; Alexander, Stewart C; Pollak, Kathryn I; Jeffreys, Amy S; Olsen, Maren K; Rodriguez, Keri L; Arnold, Robert M; Tulsky, James A

    2009-07-01

    To examine how type and severity of patients' negative emotions influence oncologists' responses and subsequent conversations. We analyzed 264 audio-recorded conversations between advanced cancer patients and their oncologists. Conversations were coded for patients' expressions of negative emotion, which were categorized by type of emotion and severity. Oncologists' responses were coded as using either empathic language or blocking and distancing approaches. Patients presented fear more often than anger or sadness; severity of disclosures was most often moderate. Oncologists responded to 35% of these negative emotional disclosures with empathic language. They were most empathic when patients presented intense emotions. Responding empathically to patients' emotional disclosures lengthened discussions by an average of only 21s. Greater response rates to severe emotions suggest oncologists may recognize negative emotions better when patients express them more intensely. Oncologists were least responsive to patient fear and responded with greatest empathy to sadness. Oncologists may benefit from additional training to recognize negative emotions, even when displayed without intensity. Teaching cancer patients to better articulate their emotional concerns may also enhance patient-oncologist communication.

  19. The Primacy of Perceiving: Emotion Recognition Buffers Negative Effects of Emotional Labor

    Science.gov (United States)

    Bechtoldt, Myriam N.; Rohrmann, Sonja; De Pater, Irene E.; Beersma, Bianca

    2011-01-01

    There is ample empirical evidence for negative effects of emotional labor (surface acting and deep acting) on workers' well-being. This study analyzed to what extent workers' ability to recognize others' emotions may buffer these effects. In a 4-week study with 85 nurses and police officers, emotion recognition moderated the relationship between…

  20. Food-Induced Emotional Resonance Improves Emotion Recognition.

    Science.gov (United States)

    Pandolfi, Elisa; Sacripante, Riccardo; Cardini, Flavia

    2016-01-01

    The effect of food substances on emotional states has been widely investigated, showing, for example, that eating chocolate is able to reduce negative mood. Here, for the first time, we have shown that the consumption of specific food substances is not only able to induce particular emotional states, but more importantly, to facilitate recognition of corresponding emotional facial expressions in others. Participants were asked to perform an emotion recognition task before and after eating either a piece of chocolate or a small amount of fish sauce-which we expected to induce happiness or disgust, respectively. Our results showed that being in a specific emotional state improves recognition of the corresponding emotional facial expression. Indeed, eating chocolate improved recognition of happy faces, while disgusted expressions were more readily recognized after eating fish sauce. In line with the embodied account of emotion understanding, we suggest that people are better at inferring the emotional state of others when their own emotional state resonates with the observed one.

  1. Food-Induced Emotional Resonance Improves Emotion Recognition

    Science.gov (United States)

    Pandolfi, Elisa; Sacripante, Riccardo; Cardini, Flavia

    2016-01-01

    The effect of food substances on emotional states has been widely investigated, showing, for example, that eating chocolate is able to reduce negative mood. Here, for the first time, we have shown that the consumption of specific food substances is not only able to induce particular emotional states, but more importantly, to facilitate recognition of corresponding emotional facial expressions in others. Participants were asked to perform an emotion recognition task before and after eating either a piece of chocolate or a small amount of fish sauce—which we expected to induce happiness or disgust, respectively. Our results showed that being in a specific emotional state improves recognition of the corresponding emotional facial expression. Indeed, eating chocolate improved recognition of happy faces, while disgusted expressions were more readily recognized after eating fish sauce. In line with the embodied account of emotion understanding, we suggest that people are better at inferring the emotional state of others when their own emotional state resonates with the observed one. PMID:27973559

  2. Facial emotion recognition deficits following moderate-severe Traumatic Brain Injury (TBI): re-examining the valence effect and the role of emotion intensity.

    Science.gov (United States)

    Rosenberg, Hannah; McDonald, Skye; Dethier, Marie; Kessels, Roy P C; Westbrook, R Frederick

    2014-11-01

    Many individuals who sustain moderate-severe traumatic brain injuries (TBI) are poor at recognizing emotional expressions, with a greater impairment in recognizing negative (e.g., fear, disgust, sadness, and anger) than positive emotions (e.g., happiness and surprise). It has been questioned whether this "valence effect" might be an artifact of the wide use of static facial emotion stimuli (usually full-blown expressions) which differ in difficulty rather than a real consequence of brain impairment. This study aimed to investigate the valence effect in TBI, while examining emotion recognition across different intensities (low, medium, and high). Twenty-seven individuals with TBI and 28 matched control participants were tested on the Emotion Recognition Task (ERT). The TBI group was more impaired in overall emotion recognition, and less accurate recognizing negative emotions. However, examining the performance across the different intensities indicated that this difference was driven by some emotions (e.g., happiness) being much easier to recognize than others (e.g., fear and surprise). Our findings indicate that individuals with TBI have an overall deficit in facial emotion recognition, and that both people with TBI and control participants found some emotions more difficult than others. These results suggest that conventional measures of facial affect recognition that do not examine variance in the difficulty of emotions may produce erroneous conclusions about differential impairment. They also cast doubt on the notion that dissociable neural pathways underlie the recognition of positive and negative emotions, which are differentially affected by TBI and potentially other neurological or psychiatric disorders.

  3. Gently does it: Humans outperform a software classifier in recognizing subtle, nonstereotypical facial expressions.

    Science.gov (United States)

    Yitzhak, Neta; Giladi, Nir; Gurevich, Tanya; Messinger, Daniel S; Prince, Emily B; Martin, Katherine; Aviezer, Hillel

    2017-12-01

    According to dominant theories of affect, humans innately and universally express a set of emotions using specific configurations of prototypical facial activity. Accordingly, thousands of studies have tested emotion recognition using sets of highly intense and stereotypical facial expressions, yet their incidence in real life is virtually unknown. In fact, a commonplace experience is that emotions are expressed in subtle and nonprototypical forms. Such facial expressions are at the focus of the current study. In Experiment 1, we present the development and validation of a novel stimulus set consisting of dynamic and subtle emotional facial displays conveyed without constraining expressers to using prototypical configurations. Although these subtle expressions were more challenging to recognize than prototypical dynamic expressions, they were still well recognized by human raters, and perhaps most importantly, they were rated as more ecological and naturalistic than the prototypical expressions. In Experiment 2, we examined the characteristics of subtle versus prototypical expressions by subjecting them to a software classifier, which used prototypical basic emotion criteria. Although the software was highly successful at classifying prototypical expressions, it performed very poorly at classifying the subtle expressions. Further validation was obtained from human expert face coders: Subtle stimuli did not contain many of the key facial movements present in prototypical expressions. Together, these findings suggest that emotions may be successfully conveyed to human viewers using subtle nonprototypical expressions. Although classic prototypical facial expressions are well recognized, they appear less naturalistic and may not capture the richness of everyday emotional communication. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Does cortisol modulate emotion recognition and empathy?

    Science.gov (United States)

    Duesenberg, Moritz; Weber, Juliane; Schulze, Lars; Schaeuffele, Carmen; Roepke, Stefan; Hellmann-Regen, Julian; Otte, Christian; Wingenfeld, Katja

    2016-04-01

    Emotion recognition and empathy are important aspects of interacting with and understanding other people's behaviors and feelings. The human environment comprises stressful situations that impact social interactions on a daily basis. The aim of this study was to examine the effects of the stress hormone cortisol on emotion recognition and empathy. In this placebo-controlled study, 40 healthy men and 40 healthy women (mean age 24.5 years) received either 10 mg of hydrocortisone or placebo. We used the Multifaceted Empathy Test to measure emotional and cognitive empathy. Furthermore, we examined emotion recognition from facial expressions, which contained two emotions (anger and sadness) and two emotion intensities (40% and 80%). We did not find a main effect of treatment or sex on either empathy or emotion recognition, but we did find a sex × emotion interaction on emotion recognition. The main result was a four-way interaction on emotion recognition involving treatment, sex, emotion and task difficulty. At 40% task difficulty, women recognized angry faces better than men in the placebo condition. Furthermore, in the placebo condition, men recognized sadness better than anger. At 80% task difficulty, men and women performed equally well in recognizing sad faces, but men performed worse than women for angry faces. Our results did not support the hypothesis that increases in cortisol concentration alone influence empathy and emotion recognition in healthy young individuals. However, sex and task difficulty appear to be important variables in emotion recognition from facial expressions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Cultural similarities and differences in perceiving and recognizing facial expressions of basic emotions.

    Science.gov (United States)

    Yan, Xiaoqian; Andrews, Timothy J; Young, Andrew W

    2016-03-01

    The ability to recognize facial expressions of basic emotions is often considered a universal human ability. However, recent studies have suggested that this commonality has been overestimated and that people from different cultures use different facial signals to represent expressions (Jack, Blais, Scheepers, Schyns, & Caldara, 2009; Jack, Caldara, & Schyns, 2012). We investigated this possibility by examining similarities and differences in the perception and categorization of facial expressions between Chinese and white British participants using whole-face and partial-face images. Our results showed no cultural difference in the patterns of perceptual similarity of expressions from whole-face images. When categorizing the same expressions, however, both British and Chinese participants were slightly more accurate with whole-face images of their own ethnic group. To further investigate potential strategy differences, we repeated the perceptual similarity and categorization tasks with presentation of only the upper or lower half of each face. Again, the perceptual similarity of facial expressions was similar between Chinese and British participants for both the upper and lower face regions. However, participants were slightly better at categorizing facial expressions of their own ethnic group for the lower face regions, indicating that the way in which culture shapes the categorization of facial expressions is largely driven by differences in information decoding from this part of the face. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Group Emotions: The Social and Cognitive Functions of Emotions in Argumentation

    Science.gov (United States)

    Polo, Claire; Lund, Kristine; Plantin, Christian; Niccolai, Gerald P.

    2016-01-01

    The learning sciences of today recognize the tri-dimensional nature of learning as involving cognitive, social and emotional phenomena. However, many computer-supported argumentation systems still fail in addressing the socio-emotional aspects of group reasoning, perhaps due to a lack of an integrated theoretical vision of how these three…

  7. Effects of aging on identifying emotions conveyed by point-light walkers.

    Science.gov (United States)

    Spencer, Justine M Y; Sekuler, Allison B; Bennett, Patrick J; Giese, Martin A; Pilz, Karin S

    2016-02-01

    The visual system is able to recognize human motion simply from point lights attached to the major joints of an actor. Moreover, it has been shown that younger adults are able to recognize emotions from such dynamic point-light displays. Previous research has suggested that the ability to perceive emotional stimuli changes with age. For example, it has been shown that older adults are impaired in recognizing emotional expressions from static faces. In addition, it has been shown that older adults have difficulties perceiving visual motion, which might be helpful to recognize emotions from point-light displays. In the current study, 4 experiments were completed in which older and younger adults were asked to identify 3 emotions (happy, sad, and angry) displayed by 4 types of point-light walkers: upright and inverted normal walkers, which contained both local motion and global form information; upright scrambled walkers, which contained only local motion information; and upright random-position walkers, which contained only global form information. Overall, emotion discrimination accuracy was lower in older participants compared with younger participants, specifically when identifying sad and angry point-light walkers. In addition, observers in both age groups were able to recognize emotions from all types of point-light walkers, suggesting that both older and younger adults are able to recognize emotions from point-light walkers on the basis of local motion or global form. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  8. Facial Emotion Recognition Deficits following Moderate-Severe Traumatic Brain Injury (TBI): Re-examining the Valence Effect and the Role of Emotion Intensity

    NARCIS (Netherlands)

    Rosenberg, H.; McDonald, S.; Dethier, M.; Kessels, R.P.C.; Westbrook, R.F.

    2014-01-01

    Many individuals who sustain moderate-severe traumatic brain injuries (TBI) are poor at recognizing emotional expressions, with a greater impairment in recognizing negative (e.g., fear, disgust, sadness, and anger) than positive emotions (e.g., happiness and surprise). It has been questioned whether

  9. A Fragment on the Emotion, “Mathesis” and Time Dimension of the Purely Musical. Marginalia with Prelude to the Afternoon of a Faun by Claude Debussy

    Directory of Open Access Journals (Sweden)

    Tijana Popović Mladjenović

    2007-12-01

    Full Text Available In the dialogue What Is Music? between Carl Dahlhaus and Hans Heinrich Eggebrecht, music is defined as a “mathematized emotion” or an “emotionalized ‘mathesis’”. As emphasized by Marija Bergamo, this is the way of underlining its equal and unavoidable constitution, based on emotion and rational organization in the time dimension. So, Marija Bergamo is continuously searching for those music determinants in a musical work as an “autonomous aesthetic fact”, whose base and real essence lie “within the nature and essence of music itself”. In other words, the starting point of the author’s concern with (art) music is her reflection on that which is “purely musical”, that is, on “the very nature of the musical”. The attempts to determine what the purely musical is and to understand the nature of the sense and inevitability of man’s musical dimension have been made since the beginnings of music and musical thinking. In that context, more recent knowledge and thinking about the phenomenon of music, which are derived from various disciplines, correspond closely to Marija Bergamo’s views. In a narrower sense, the notion of purely musical is closely related to aesthetic autonomy, that is, autonomous music or musical autonomy. From such a viewpoint – and in conformity with Marija Bergamo’s view – I would say that the purely musical in an art music work exists independently of non/autonomy (that is, independently of any function, except an aesthetic one), as well as independently of the origin of its content (musical or extra-musical), and that it always, whenever “one thinks in the sense of music and is seized by it” (in terms of emotion, mathesis and time), creates, brings and possesses its specific (non-conceptual) perceptive musical-semantic stratum. This is shown, at least partly, on a characteristic and (in many respects) paradigmatic example – the music of Prelude to the Afternoon of a Faun by Claude Debussy.

  10. Emotional intelligence and emotions associated with optimal and dysfunctional athletic performance.

    Science.gov (United States)

    Lane, Andrew M; Devonport, Tracey J; Soos, Istvan; Karsai, Istvan; Leibinger, Eva; Hamar, Pal

    2010-01-01

    This study investigated relationships between self-report measures of emotional intelligence and memories of pre-competitive emotions before optimal and dysfunctional athletic performance. Participant-athletes (n = 284) completed a self-report measure of emotional intelligence and two measures of pre-competitive emotions: a) emotions experienced before an optimal performance, and b) emotions experienced before a dysfunctional performance. Consistent with theoretical predictions, repeated MANOVA results demonstrated that pleasant emotions were associated with optimal performance and unpleasant emotions with dysfunctional performance. Emotional intelligence correlated with pleasant emotions in both performances, with individuals reporting low scores on the self-report emotional intelligence scale appearing to experience intense unpleasant emotions before dysfunctional performance. We suggest that future research should investigate relationships between emotional intelligence and emotion-regulation strategies used by athletes. Key points: Athletes reporting high scores on the self-report emotional intelligence scale tend to experience pleasant emotions. Optimal performance is associated with pleasant emotions and dysfunctional performance is associated with unpleasant emotions. Emotional intelligence might help athletes recognize which emotional states help performance.

  11. Emotions and Economic Preference

    OpenAIRE

    Todorova, Tamara; Ramachandran, Bharath

    2005-01-01

    We wish to examine critically the viewpoint that: a) economists take too narrow a view of rationality and do not recognize the role of emotions as a component of rationality and b) do not address the question of whether preferences are rational or not, and instead take them as just given. We trace the relationship between economics and emotions showing some economic dimensions of emotional states. We illustrate them with examples of economic behavior based on emotional reactions.

  12. Dystonia: Emotional and Mental Health

    Science.gov (United States)

    Although dystonia is a movement disorder that impacts ... emotion as well as muscle movement. For years, mental health professionals have recognized that coping with a chronic ...

  13. Emotion recognition and oxytocin in patients with schizophrenia

    Science.gov (United States)

    Averbeck, B. B.; Bobin, T.; Evans, S.; Shergill, S. S.

    2012-01-01

    Background Studies have suggested that patients with schizophrenia are impaired at recognizing emotions. Recently, it has been shown that the neuropeptide oxytocin can have beneficial effects on social behaviors. Method To examine emotion recognition deficits in patients and see whether oxytocin could improve these deficits, we carried out two experiments. In the first experiment we recruited 30 patients with schizophrenia and 29 age- and IQ-matched control subjects, and gave them an emotion recognition task. Following this, we carried out a second experiment in which we recruited 21 patients with schizophrenia for a double-blind, placebo-controlled cross-over study of the effects of oxytocin on the same emotion recognition task. Results In the first experiment we found that patients with schizophrenia had a deficit relative to controls in recognizing emotions. In the second experiment we found that administration of oxytocin improved the ability of patients to recognize emotions. The improvement was consistent and occurred for most emotions, and was present whether patients were identifying morphed or non-morphed faces. Conclusions These data add to a growing literature showing beneficial effects of oxytocin on social–behavioral tasks, as well as clinical symptoms. PMID:21835090

  14. Parents' Emotion-Related Beliefs, Behaviours, and Skills Predict Children's Recognition of Emotion

    Science.gov (United States)

    Castro, Vanessa L.; Halberstadt, Amy G.; Lozada, Fantasy T.; Craig, Ashley B.

    2015-01-01

    Children who are able to recognize others' emotions are successful in a variety of socioemotional domains, yet we know little about how school-aged children's abilities develop, particularly in the family context. We hypothesized that children develop emotion recognition skill as a function of parents' own emotion-related beliefs,…

  15. Emotion Recognition from Congruent and Incongruent Emotional Expressions and Situational Cues in Children with Autism Spectrum Disorder

    Science.gov (United States)

    Tell, Dina; Davidson, Denise

    2015-01-01

    In this research, the emotion recognition abilities of children with autism spectrum disorder and typically developing children were compared. When facial expressions and situational cues of emotion were congruent, accuracy in recognizing emotions was good for both children with autism spectrum disorder and typically developing children. When…

  16. Cross-cultural decoding of positive and negative nonlinguistic emotion vocalizations

    Directory of Open Access Journals (Sweden)

    Petri Laukka

    2013-07-01

    Full Text Available Which emotions are associated with universally recognized nonverbal signals? We address this issue by examining how reliably nonlinguistic vocalizations (affect bursts can convey emotions across cultures. Actors from India, Kenya, Singapore and USA were instructed to produce vocalizations that would convey 9 positive and 9 negative emotions to listeners. The vocalizations were judged by Swedish listeners using a within-valence forced-choice procedure, where positive and negative emotions were judged in separate experiments. Results showed that listeners could recognize a wide range of positive and negative emotions with accuracy above chance. For positive emotions, we observed the highest recognition rates for relief, followed by lust, interest, serenity and positive surprise, with affection and pride receiving the lowest recognition rates. Anger, disgust, fear, sadness and negative surprise received the highest recognition rates for negative emotions, with the lowest rates observed for guilt and shame. By way of summary, results showed that the voice can reveal both basic emotions and several positive emotions other than happiness across cultures, but self-conscious emotions such as guilt, pride, and shame seem not to be well recognized from nonlinguistic vocalizations.

  17. Emotions in economic action and interaction

    OpenAIRE

    Bandelj, Nina

    2009-01-01

    How do emotions influence economic action? Current literature recognizes the importance of emotions for economy because they either help individuals perform economic roles through emotion management or enhancement of emotional intelligence, or because they aid rationality through their influence on preference formation. All these strands of research investigate the link between emotions and economy from an atomistic/individualistic perspective. I argue for a different approach, one that adopt...

  18. On the Time Course of Vocal Emotion Recognition

    Science.gov (United States)

    Pell, Marc D.; Kotz, Sonja A.

    2011-01-01

    How quickly do listeners recognize emotions from a speaker's voice, and does the time course for recognition vary by emotion type? To address these questions, we adapted the auditory gating paradigm to estimate how much vocal information is needed for listeners to categorize five basic emotions (anger, disgust, fear, sadness, happiness) and neutral utterances produced by male and female speakers of English. Semantically-anomalous pseudo-utterances (e.g., The rivix jolled the silling) conveying each emotion were divided into seven gate intervals according to the number of syllables that listeners heard from sentence onset. Participants (n = 48) judged the emotional meaning of stimuli presented at each gate duration interval, in a successive, blocked presentation format. Analyses looked at how recognition of each emotion evolves as an utterance unfolds and estimated the “identification point” for each emotion. Results showed that anger, sadness, fear, and neutral expressions are recognized more accurately at short gate intervals than happiness, and particularly disgust; however, as speech unfolds, recognition of happiness improves significantly towards the end of the utterance (and fear is recognized more accurately than other emotions). When the gate associated with the emotion identification point of each stimulus was calculated, data indicated that fear (M = 517 ms), sadness (M = 576 ms), and neutral (M = 510 ms) expressions were identified from shorter acoustic events than the other emotions. These data reveal differences in the underlying time course for conscious recognition of basic emotions from vocal expressions, which should be accounted for in studies of emotional speech processing. PMID:22087275

  19. Novel acoustic features for speech emotion recognition

    Institute of Scientific and Technical Information of China (English)

    ROH Yong-Wan; KIM Dong-Ju; LEE Woo-Seok; HONG Kwang-Seok

    2009-01-01

    This paper focuses on acoustic features that effectively improve the recognition of emotion in human speech. The novel features in this paper are based on spectral-based entropy parameters such as fast Fourier transform (FFT) spectral entropy, delta FFT spectral entropy, Mel-frequency filter bank (MFB) spectral entropy, and delta MFB spectral entropy. Spectral-based entropy features are simple; they reflect the frequency characteristics of speech and how those characteristics change over time. We implement an emotion rejection module using the probability distributions of recognized-scores and rejected-scores. This reduces the false recognition rate and improves overall performance. Recognized-scores and rejected-scores refer to probabilities of recognized and rejected emotion recognition results, respectively. These scores are first obtained from a pattern recognition procedure. The pattern recognition phase uses the Gaussian mixture model (GMM). We classify the four emotional states as anger, sadness, happiness and neutrality. The proposed method is evaluated using 45 sentences per emotion for 30 subjects, 15 males and 15 females. Experimental results show that the proposed method is superior to existing emotion recognition methods based on GMM using energy, zero crossing rate (ZCR), linear prediction coefficient (LPC), and pitch parameters. One of the proposed features, the combined MFB and delta MFB spectral entropy, improves performance by approximately 10% compared to the existing feature parameters for speech emotion recognition. We also demonstrate a 4% performance improvement from the applied emotion rejection of low-confidence scores.

  20. Novel acoustic features for speech emotion recognition

    Institute of Scientific and Technical Information of China (English)

    ROH Yong-Wan; KIM Dong-Ju; LEE Woo-Seok; HONG Kwang-Seok

    2009-01-01

    This paper focuses on acoustic features that effectively improve the recognition of emotion in human speech. The novel features in this paper are based on spectral-based entropy parameters such as fast Fourier transform (FFT) spectral entropy, delta FFT spectral entropy, Mel-frequency filter bank (MFB) spectral entropy, and delta MFB spectral entropy. Spectral-based entropy features are simple; they reflect the frequency characteristics of speech and how those characteristics change over time. We implement an emotion rejection module using the probability distributions of recognized-scores and rejected-scores. This reduces the false recognition rate and improves overall performance. Recognized-scores and rejected-scores refer to probabilities of recognized and rejected emotion recognition results, respectively. These scores are first obtained from a pattern recognition procedure. The pattern recognition phase uses the Gaussian mixture model (GMM). We classify the four emotional states as anger, sadness, happiness and neutrality. The proposed method is evaluated using 45 sentences per emotion for 30 subjects, 15 males and 15 females. Experimental results show that the proposed method is superior to existing emotion recognition methods based on GMM using energy, zero crossing rate (ZCR), linear prediction coefficient (LPC), and pitch parameters. One of the proposed features, the combined MFB and delta MFB spectral entropy, improves performance by approximately 10% compared to the existing feature parameters for speech emotion recognition. We also demonstrate a 4% performance improvement from the applied emotion rejection of low-confidence scores.
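
    As a rough illustration of the approach outlined in the two records above, the sketch below computes a per-frame FFT spectral entropy together with its delta and scores an utterance against one Gaussian mixture model per emotion. It is a minimal Python sketch using NumPy and scikit-learn; the frame length, hop size, number of mixture components, and the simple maximum-likelihood decision are illustrative assumptions rather than parameters from the paper, which additionally uses Mel filter bank entropies and a dedicated rejection module.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def fft_spectral_entropy(frame, eps=1e-12):
            # Shannon entropy of the normalized FFT power spectrum of one frame.
            power = np.abs(np.fft.rfft(frame)) ** 2
            p = power / (power.sum() + eps)
            return -np.sum(p * np.log2(p + eps))

        def frame_signal(x, frame_len=400, hop=160):
            # Split a 1-D signal into overlapping frames (25 ms / 10 ms at 16 kHz, assumed).
            n = 1 + max(0, (len(x) - frame_len) // hop)
            return np.stack([x[i * hop:i * hop + frame_len] for i in range(n)])

        def entropy_features(x):
            # Per-frame FFT spectral entropy plus its delta (frame-to-frame change).
            ent = np.array([fft_spectral_entropy(f) for f in frame_signal(x)])
            delta = np.diff(ent, prepend=ent[0])
            return np.column_stack([ent, delta])

        def train_gmms(features_by_emotion, n_components=8):
            # One GMM per emotion, fitted on the stacked per-frame feature rows
            # of all training utterances labeled with that emotion.
            return {emotion: GaussianMixture(n_components).fit(np.vstack(feats))
                    for emotion, feats in features_by_emotion.items()}

        def classify(x, gmms):
            # Assign the utterance to the emotion whose GMM yields the highest
            # average log-likelihood; the score gap can drive a rejection rule.
            feats = entropy_features(x)
            scores = {emotion: gmm.score(feats) for emotion, gmm in gmms.items()}
            return max(scores, key=scores.get), scores

    In this sketch the rejection idea could be approximated by withholding a decision whenever the best and second-best log-likelihood scores are close, mirroring the low-confidence rejection described above.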

  1. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    Science.gov (United States)

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expressions (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  2. Cross-cultural decoding of positive and negative non-linguistic emotion vocalizations.

    Science.gov (United States)

    Laukka, Petri; Elfenbein, Hillary Anger; Söder, Nela; Nordström, Henrik; Althoff, Jean; Chui, Wanda; Iraki, Frederick K; Rockstuhl, Thomas; Thingujam, Nutankumar S

    2013-01-01

    Which emotions are associated with universally recognized non-verbal signals? We address this issue by examining how reliably non-linguistic vocalizations (affect bursts) can convey emotions across cultures. Actors from India, Kenya, Singapore, and USA were instructed to produce vocalizations that would convey nine positive and nine negative emotions to listeners. The vocalizations were judged by Swedish listeners using a within-valence forced-choice procedure, where positive and negative emotions were judged in separate experiments. Results showed that listeners could recognize a wide range of positive and negative emotions with accuracy above chance. For positive emotions, we observed the highest recognition rates for relief, followed by lust, interest, serenity and positive surprise, with affection and pride receiving the lowest recognition rates. Anger, disgust, fear, sadness, and negative surprise received the highest recognition rates for negative emotions, with the lowest rates observed for guilt and shame. By way of summary, results showed that the voice can reveal both basic emotions and several positive emotions other than happiness across cultures, but self-conscious emotions such as guilt, pride, and shame seem not to be well recognized from non-linguistic vocalizations.

  3. Role of temperament in early adolescent pure and co-occurring internalizing and externalizing problems using a bifactor model: Moderation by parenting and gender.

    Science.gov (United States)

    Wang, Frances L; Eisenberg, Nancy; Valiente, Carlos; Spinrad, Tracy L

    2016-11-01

    We contribute to the literature on the relations of temperament to externalizing and internalizing problems by considering parental emotional expressivity and child gender as moderators of such relations and examining prediction of pure and co-occurring problem behaviors during early to middle adolescence using bifactor models (which provide unique and continuous factors for pure and co-occurring internalizing and externalizing problems). Parents and teachers reported on children's (4.5- to 8-year-olds; N = 214) and early adolescents' (6 years later; N = 168) effortful control, impulsivity, anger, sadness, and problem behaviors. Parental emotional expressivity was measured observationally and with parents' self-reports. Early-adolescents' pure externalizing and co-occurring problems shared childhood and/or early-adolescent risk factors of low effortful control, high impulsivity, and high anger. Lower childhood and early-adolescent impulsivity and higher early-adolescent sadness predicted early-adolescents' pure internalizing. Childhood positive parental emotional expressivity more consistently related to early-adolescents' lower pure externalizing compared to co-occurring problems and pure internalizing. Lower effortful control predicted changes in externalizing (pure and co-occurring) over 6 years, but only when parental positive expressivity was low. Higher impulsivity predicted co-occurring problems only for boys. Findings highlight the probable complex developmental pathways to adolescent pure and co-occurring externalizing and internalizing problems.

  4. The communication of emotion during conflict in married couples.

    Science.gov (United States)

    Sanford, Keith

    2012-06-01

    This study investigated emotion during interpersonal conflicts between mates. It addressed questions about how clearly couples express emotion (encoding), how accurately they recognize each other's emotion (decoding), and how well they distinguish between types of negative emotion. It was theorized that couples express and perceive both: (a) event-specific emotions, which are unique to particular people on particular occasions, and (b) contextual-couple emotions, which reflect the additive effect of emotions across different events and across both partners. Eighty-three married couples engaged in a series of two conflict conversations. Self-report ratings, observer ratings, and partner ratings were used to assess two types of negative emotion: hard emotion (e.g., angry or annoyed) and soft emotion (e.g., sad or hurt). Couples were reasonably accurate in encoding, decoding, and in distinguishing between types of emotion. Emotion expression was strongly associated with general levels of contextual-couple emotion summed across two conversations, whereas emotion perception was more closely tied to specific events. Hard emotion was readily perceived when it was overtly expressed, and soft emotion could sometimes be recognized even when it was not expressed clearly. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  5. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    Directory of Open Access Journals (Sweden)

    Martin Wegrzyn

    Full Text Available Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing us to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed us to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.

  6. Mapping the emotional face. How individual face parts contribute to successful emotion recognition

    Science.gov (United States)

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing us to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed us to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921
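
    To make the tile analysis described above more concrete, the sketch below shows one way a per-tile importance map could be derived from trial data: for each of the 48 tiles, recognition accuracy on the trials where that tile was uncovered at the moment of response is contrasted with overall accuracy. The data layout, the 6 x 8 grid, and this simple contrast are assumptions made for illustration and are not the authors' analysis code.

        import numpy as np

        def tile_importance(visible, correct, grid=(6, 8)):
            # visible: (n_trials, 48) booleans, True if a tile was uncovered at response.
            # correct: (n_trials,) booleans, True if the expression was recognized.
            visible = np.asarray(visible, dtype=bool)
            correct = np.asarray(correct, dtype=float)
            # Recognition rate over the trials in which each tile was visible ...
            seen = np.maximum(visible.sum(axis=0), 1)
            acc_when_visible = (visible * correct[:, None]).sum(axis=0) / seen
            # ... contrasted with overall accuracy, so positive values mark tiles
            # whose visibility coincided with better recognition.
            return (acc_when_visible - correct.mean()).reshape(grid)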

  7. Subject-independent emotion recognition based on physiological signals: a three-stage decision method.

    Science.gov (United States)

    Chen, Jing; Hu, Bin; Wang, Yue; Moore, Philip; Dai, Yongqiang; Feng, Lei; Ding, Zhijie

    2017-12-20

    Collaboration between humans and computers has become pervasive and ubiquitous, however current computer systems are limited in that they fail to address the emotional component. An accurate understanding of human emotions is necessary for these computers to trigger proper feedback. Among multiple emotional channels, physiological signals are synchronous with emotional responses; therefore, analyzing physiological changes is a recognized way to estimate human emotions. In this paper, a three-stage decision method is proposed to recognize four emotions based on physiological signals in the multi-subject context. Emotion detection is achieved by using a stage-divided strategy in which each stage deals with a fine-grained goal. The decision method consists of three stages. During the training process, the initial stage transforms mixed training subjects to separate groups, thus eliminating the effect of individual differences. The second stage categorizes four emotions into two emotion pools in order to reduce recognition complexity. The third stage trains a classifier based on emotions in each emotion pool. During the testing process, a test case or test trial will be initially classified to a group followed by classification into an emotion pool in the second stage. An emotion will be assigned to the test trial in the final stage. In this paper we consider two different ways of allocating four emotions into two emotion pools. A comparative analysis is also carried out between the proposal and other methods. An average recognition accuracy of 77.57% was achieved on the recognition of four emotions with the best accuracy of 86.67% to recognize the positive and excited emotion. Using differing ways of allocating four emotions into two emotion pools, we found there is a difference in the effectiveness of a classifier on learning each emotion. When compared to other methods, the proposed method demonstrates a significant improvement in recognizing four emotions in the
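
    A minimal sketch of such a stage-divided pipeline is given below: stage one assigns a trial to a subject group, stage two picks one of two emotion pools, and stage three resolves the emotion within the chosen pool. The concrete choices here (k-means for grouping, support vector classifiers for the later stages, the pool labels, and the assumption that every group contains training examples of every emotion) are illustrative and do not reproduce the paper's own classifiers or pool allocations.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.svm import SVC

        class ThreeStageEmotionClassifier:
            def __init__(self, n_groups=3, pools=(("joy", "excited"), ("sad", "fear"))):
                self.n_groups = n_groups
                self.pools = pools

            def fit(self, X, y_emotion):
                X, y_emotion = np.asarray(X), np.asarray(y_emotion)
                # Stage 1: group training trials by physiological feature profile,
                # which stands in for separating subjects with similar responses.
                self.grouper = KMeans(n_clusters=self.n_groups, n_init=10).fit(X)
                groups = self.grouper.labels_
                pool_of = {e: i for i, pool in enumerate(self.pools) for e in pool}
                y_pool = np.array([pool_of[e] for e in y_emotion])
                # Stages 2 and 3 are trained separately inside each group.
                self.pool_clf, self.emotion_clf = {}, {}
                for g in range(self.n_groups):
                    in_group = groups == g
                    self.pool_clf[g] = SVC().fit(X[in_group], y_pool[in_group])
                    for p, pool in enumerate(self.pools):
                        sel = in_group & np.isin(y_emotion, pool)
                        self.emotion_clf[(g, p)] = SVC().fit(X[sel], y_emotion[sel])
                return self

            def predict(self, X):
                X = np.asarray(X)
                groups = self.grouper.predict(X)               # stage 1: group
                labels = []
                for x, g in zip(X, groups):
                    p = int(self.pool_clf[g].predict([x])[0])  # stage 2: emotion pool
                    labels.append(self.emotion_clf[(g, p)].predict([x])[0])  # stage 3
                return np.array(labels)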

  8. Emotion in reinforcement learning agents and robots : A survey

    NARCIS (Netherlands)

    Moerland, T.M.; Broekens, D.J.; Jonker, C.M.

    2018-01-01

    This article provides the first survey of computational models of emotion in reinforcement learning (RL) agents. The survey focuses on agent/robot emotions, and mostly ignores human user emotions. Emotions are recognized as functional in decision-making by influencing motivation and action

  9. Emotion Recognition in Children with Down Syndrome: Influence of Emotion Label and Expression Intensity

    Science.gov (United States)

    Cebula, Katie R.; Wishart, Jennifer G.; Willis, Diane S.; Pitcairn, Tom K.

    2017-01-01

    Some children with Down syndrome may experience difficulties in recognizing facial emotions, particularly fear, but it is not clear why, nor how such skills can best be facilitated. Using a photo-matching task, emotion recognition was tested in children with Down syndrome, children with nonspecific intellectual disability and cognitively matched,…

  10. What is the effect of basic emotions on directed forgetting? Investigating the role of basic emotions in memory.

    Directory of Open Access Journals (Sweden)

    Artur Marchewka

    2016-08-01

    Full Text Available Studies presenting memory-facilitating effect of emotions typically focused on affective dimensions of arousal and valence. Little is known, however, about the extent to which stimulus-driven basic emotions could have distinct effects on memory. In the present paper we sought to examine the modulatory effect of disgust, fear and sadness on intentional remembering and forgetting using widely used item-method directed forgetting paradigm. Eighteen women underwent fMRI scanning during encoding phase in which they were asked either to remember (R) or to forget (F) pictures. In the test phase all previously used stimuli were re-presented together with the same number of new pictures and participants had to categorize them as old or new, irrespective of the F/R instruction. On the behavioral level we found a typical directed forgetting effect, i.e. higher recognition rates for to-be-remembered (TBR) items than to-be-forgotten (TBF) ones for both neutral and emotional categories. Emotional stimuli had higher recognition rate than neutral ones, while among emotional those eliciting disgust produced highest recognition, but at the same time induced more false alarms. Therefore when false alarm corrected recognition was examined the directed forgetting effect was equally strong irrespective of emotion. Additionally, even though subjects rated disgusting pictures as more arousing and negative than other picture categories, logistic regression on the item level showed that the effect of disgust on recognition memory was stronger than the effect of arousal or valence. On the neural level, ROI analyses (with valence and arousal covariates) revealed that correctly recognized disgusting stimuli evoked the highest activity in the left amygdala compared to all other categories. This structure was also more activated for remembered vs. forgotten stimuli, but only in case of disgust or fear eliciting pictures. Our findings, despite several limitations, suggest that

  11. What Is the Effect of Basic Emotions on Directed Forgetting? Investigating the Role of Basic Emotions in Memory.

    Science.gov (United States)

    Marchewka, Artur; Wypych, Marek; Michałowski, Jarosław M; Sińczuk, Marcin; Wordecha, Małgorzata; Jednoróg, Katarzyna; Nowicka, Anna

    2016-01-01

    Studies presenting memory-facilitating effect of emotions typically focused on affective dimensions of arousal and valence. Little is known, however, about the extent to which stimulus-driven basic emotions could have distinct effects on memory. In the present paper we sought to examine the modulatory effect of disgust, fear, and sadness on intentional remembering and forgetting using widely used item-method directed forgetting (DF) paradigm. Eighteen women underwent fMRI scanning during encoding phase in which they were asked either to remember (R) or to forget (F) pictures. In the test phase all previously used stimuli were re-presented together with the same number of new pictures and participants had to categorize them as old or new, irrespective of the F/R instruction. On the behavioral level we found a typical DF effect, i.e., higher recognition rates for to-be-remembered (TBR) items than to-be-forgotten (TBF) ones for both neutral and emotional categories. Emotional stimuli had higher recognition rate than neutral ones, while among emotional those eliciting disgust produced highest recognition, but at the same time induced more false alarms. Therefore, when false alarm corrected recognition was examined the DF effect was equally strong irrespective of emotion. Additionally, even though subjects rated disgusting pictures as more arousing and negative than other picture categories, logistic regression on the item level showed that the effect of disgust on recognition memory was stronger than the effect of arousal or valence. On the neural level, ROI analyses (with valence and arousal covariates) revealed that correctly recognized disgusting stimuli evoked the highest activity in the left amygdala compared to all other categories. This structure was also more activated for remembered vs. forgotten stimuli, but only in case of disgust or fear eliciting pictures. Our findings, despite several limitations, suggest that disgust has a special salience in memory

  12. Literature Review: Is the Emotional Expression of Contempt Recognized Universally or Culturally?

    OpenAIRE

    Phoukhao, Julianna

    2017-01-01

    The universal facial expression of contempt is often described as one lip corner raised and tightened. This literature review examines whether this expression is recognized universally. After examining theories and methods, low agreement across cultures was found in recognizing this expression as contempt. Evidence so far is not sufficient to support the unilateral lip corner as a universal expression of contempt. The expression and recognition of contempt is highly dependent on cultur...

  13. Emotion Recognition in Children With Down Syndrome: Influence of Emotion Label and Expression Intensity.

    Science.gov (United States)

    Cebula, Katie R; Wishart, Jennifer G; Willis, Diane S; Pitcairn, Tom K

    2017-03-01

    Some children with Down syndrome may experience difficulties in recognizing facial emotions, particularly fear, but it is not clear why, nor how such skills can best be facilitated. Using a photo-matching task, emotion recognition was tested in children with Down syndrome, children with nonspecific intellectual disability and cognitively matched, typically developing children (all groups N = 21) under four conditions: veridical vs. exaggerated emotions and emotion-labelling vs. generic task instructions. In all groups, exaggerating emotions facilitated recognition accuracy and speed, with emotion labelling facilitating recognition accuracy. Overall accuracy and speed did not differ in the children with Down syndrome, although recognition of fear was poorer than in the typically developing children and unrelated to emotion label use. Implications for interventions are considered.

  14. Impaired emotion recognition in music in Parkinson's disease.

    Science.gov (United States)

    van Tricht, Mirjam J; Smeding, Harriet M M; Speelman, Johannes D; Schmand, Ben A

    2010-10-01

    Music has the potential to evoke strong emotions and plays a significant role in the lives of many people. Music might therefore be an ideal medium to assess emotion recognition. We investigated emotion recognition in music in 20 patients with idiopathic Parkinson's disease (PD) and 20 matched healthy volunteers. The role of cognitive dysfunction and other disease characteristics in emotion recognition was also evaluated. We used 32 musical excerpts that expressed happiness, sadness, fear or anger. PD patients were impaired in recognizing fear and anger in music. Fear recognition was associated with executive functions in PD patients and in healthy controls, but the emotion recognition impairments of PD patients persisted after adjusting for executive functioning. We found no differences in the recognition of happy or sad music. Emotion recognition was not related to depressive symptoms, disease duration or severity of motor symptoms. We conclude that PD patients are impaired in recognizing complex emotions in music. Although this impairment is related to executive dysfunction, our findings most likely reflect an additional primary deficit in emotional processing. 2010 Elsevier Inc. All rights reserved.

  15. Emotion through Locomotion: Gender Impact

    OpenAIRE

    Krüger, Samuel; Sokolov, Alexander N.; Enck, Paul; Krägeloh-Mann, Ingeborg; Pavlova, Marina A.

    2013-01-01

    Body language reading is of significance for daily life social cognition and successful social interaction, and constitutes a core component of social competence. Yet it is unclear whether our ability for body language reading is gender specific. In the present work, female and male observers had to visually recognize emotions through point-light human locomotion performed by female and male actors with different emotional expressions. For subtle emotional expressions only, males surpass fema...

  16. Processing environmental stimuli in paranoid schizophrenia: recognizing facial emotions and performing executive functions.

    Science.gov (United States)

    Yu, Shao Hua; Zhu, Jun Peng; Xu, You; Zheng, Lei Lei; Chai, Hao; He, Wei; Liu, Wei Bo; Li, Hui Chun; Wang, Wei

    2012-12-01

    To study the contribution of executive function to abnormal recognition of facial expressions of emotion in schizophrenia patients. Facial emotion recognition, executive function, psychiatric symptoms, and anxiety and depression were assessed with the Japanese and Caucasian Facial Expressions of Emotion (JACFEE), the Wisconsin Card Sorting Test (WCST), the Positive and Negative Symptom Scale, and the Hamilton Anxiety and Depression Scales, respectively, in 88 paranoid schizophrenia patients and 75 healthy volunteers. Patients scored higher on the Positive and Negative Symptom Scale and the Hamilton Anxiety and Depression Scales, and displayed lower JACFEE recognition accuracies and poorer WCST performances. The JACFEE recognition accuracy of contempt and disgust was negatively correlated with the negative symptom scale score, while the recognition accuracy of fear was positively correlated with the positive symptom scale score and the recognition accuracy of surprise was negatively correlated with the general psychopathology score in patients. Moreover, the WCST could predict the JACFEE recognition accuracy of contempt, disgust, and sadness in patients, and the perseverative errors negatively predicted the recognition accuracy of sadness in healthy volunteers. The JACFEE recognition accuracy of sadness could predict the WCST categories in paranoid schizophrenia patients. Recognition accuracy of social/moral emotions, such as contempt, disgust and sadness, is related to executive function in paranoid schizophrenia patients, especially regarding sadness. Copyright © 2012 The Editorial Board of Biomedical and Environmental Sciences. Published by Elsevier B.V. All rights reserved.

  17. Development of Facial Emotion Recognition in Childhood : Age-related Differences in a Shortened Version of the Facial Expressions of Emotion - Stimuli and Tests

    NARCIS (Netherlands)

    Coenen, Maraike; Aarnoudse, Ceciel; Huitema, Rients; Braams, Olga; Veenstra, Wencke S.

    2013-01-01

    Introduction Facial emotion recognition is essential for social interaction. The development of emotion recognition abilities is not yet entirely understood (Tonks et al. 2007). Facial emotion recognition emerges gradually, with happiness recognized earliest (Herba & Phillips, 2004). The recognition

  18. 'Emotional Intelligence': Lessons from Lesions.

    Science.gov (United States)

    Hogeveen, J; Salvi, C; Grafman, J

    2016-10-01

    'Emotional intelligence' (EI) is one of the most highly used psychological terms in popular nomenclature, yet its construct, divergent, and predictive validities are contentiously debated. Despite this debate, the EI construct is composed of a set of emotional abilities - recognizing emotional states in the self and others, using emotions to guide thought and behavior, understanding how emotions shape behavior, and emotion regulation - that undoubtedly influence important social and personal outcomes. In this review, evidence from human lesion studies is reviewed in order to provide insight into the necessary brain regions for each of these core emotional abilities. Critically, we consider how this neuropsychological evidence might help to guide efforts to define and measure EI. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Emotional Communication in Finger Braille

    Directory of Open Access Journals (Sweden)

    Yasuhiro Matsuda

    2010-01-01

    Full Text Available We describe analyses of the features of emotions (neutral, joy, sadness, and anger) expressed by Finger Braille interpreters and subsequently examine the effectiveness of emotional expression and emotional communication between people unskilled in Finger Braille. The goal is to develop a Finger Braille system to teach emotional expression and a system to recognize emotion. The results indicate the following features of emotional expression by interpreters. The durations of the code of joy were significantly shorter than the durations of the other emotions, the durations of the code of sadness were significantly longer, and the finger loads of anger were significantly larger. The features of emotional expression by unskilled subjects were very similar to those of the interpreters, and the coincidence ratio of emotional communication was 75.1%. Therefore, it was confirmed that people unskilled in Finger Braille can express and communicate emotions using this communication medium.

  20. Adolescents' emotional competence is associated with parents' neural sensitivity to emotions.

    Science.gov (United States)

    Telzer, Eva H; Qu, Yang; Goldenberg, Diane; Fuligni, Andrew J; Galván, Adriana; Lieberman, Matthew D

    2014-01-01

    An essential component of youths' successful development is learning to appropriately respond to emotions, including the ability to recognize, identify, and describe one's feelings. Such emotional competence is thought to arise through the parent-child relationship. Yet, the mechanisms by which parents transmit emotional competence to their children are difficult to measure because they are often implicit, idiosyncratic, and not easily articulated by parents or children. In the current study, we used a multifaceted approach that went beyond self-report measures and examined whether parental neural sensitivity to emotions predicted their child's emotional competence. Twenty-two adolescent-parent dyads completed an fMRI scan during which they labeled the emotional expressions of negatively valenced faces. Results indicate that parents who recruited the amygdala, VLPFC, and brain regions involved in mentalizing (i.e., inferring others' emotional states) had adolescent children with greater emotional competence. These results held after controlling for parents' self-reports of emotional expressivity and adolescents' self-reports of the warmth and support of their parent relationships. In addition, adolescents recruited neural regions involved in mentalizing during affect labeling, which significantly mediated the association between parental neural sensitivity and adolescents' emotional competence, suggesting that youth are modeling or referencing their parents' emotional profiles, thereby contributing to better emotional competence.

  1. Adolescents’ emotional competence is associated with parents’ neural sensitivity to emotions

    Directory of Open Access Journals (Sweden)

    Eva H Telzer

    2014-07-01

    Full Text Available An essential component of youths’ successful development is learning to appropriately respond to emotions, including the ability to recognize, identify, and describe one’s feelings. Such emotional competence is thought to arise through the parent-child relationship. Yet, the mechanisms by which parents transmit emotional competence to their children are difficult to measure because they are often implicit, idiosyncratic, and not easily articulated by parents or children. In the current study, we used a multifaceted approach that went beyond self-report measures and examined whether parental neural sensitivity to emotions predicted their child’s emotional competence. Twenty-two adolescent-parent dyads completed an fMRI scan during which they labeled the emotional expressions of negatively valenced faces. Results indicate that parents who recruited the amygdala, VLPFC, and brain regions involved in mentalizing (i.e., inferring others’ emotional states) had adolescent children with greater emotional competence. These results held after controlling for parents’ self-reports of emotional expressivity and adolescents’ self-reports of the warmth and support of their parent relationships. In addition, adolescents recruited neural regions involved in mentalizing during affect labeling, which significantly mediated the association between parental neural sensitivity and adolescents’ emotional competence, suggesting that youth are modeling or referencing their parents’ emotional profiles, thereby contributing to better emotional competence.

  2. Operator error and emotions. Operator error and emotions - a major cause of human failure

    International Nuclear Information System (INIS)

    Patterson, B.K.; Bradley, M.; Artiss, W.G.

    2000-01-01

    This paper proposes the idea that a large proportion of the incidents attributed to operator and maintenance error in a nuclear or industrial plant are actually founded in our human emotions. Basic psychological theory of emotions is briefly presented and then the authors present situations and instances that can cause emotions to swell and lead to operator and maintenance error. Since emotional information is not recorded in industrial incident reports, the challenge is extended to industry, to review incident source documents for cases of emotional involvement and to develop means to collect emotion related information in future root cause analysis investigations. Training must then be provided to operators and maintainers to enable them to know one's emotions, manage emotions, motivate one's self, recognize emotions in others and handle relationships. Effective training will reduce the instances of human error based in emotions and enable a cooperative, productive environment in which to work. (author)

  3. Operator error and emotions. Operator error and emotions - a major cause of human failure

    Energy Technology Data Exchange (ETDEWEB)

    Patterson, B.K. [Human Factors Practical Incorporated (Canada); Bradley, M. [Univ. of New Brunswick, Saint John, New Brunswick (Canada); Artiss, W.G. [Human Factors Practical (Canada)

    2000-07-01

    This paper proposes the idea that a large proportion of the incidents attributed to operator and maintenance error in a nuclear or industrial plant are actually founded in our human emotions. Basic psychological theory of emotions is briefly presented, and the authors then describe situations and instances that can cause emotions to swell and lead to operator and maintenance error. Since emotional information is not recorded in industrial incident reports, the challenge is extended to industry to review incident source documents for cases of emotional involvement and to develop means to collect emotion-related information in future root cause analysis investigations. Training must then be provided to operators and maintainers to enable them to know their own emotions, manage emotions, motivate themselves, recognize emotions in others, and handle relationships. Effective training will reduce the instances of human error rooted in emotions and enable a cooperative, productive environment in which to work. (author)

  4. Inventions on expressing emotions In Graphical User Interface

    OpenAIRE

    Mishra, Umakant

    2014-01-01

    The conventional GUI is more mechanical and does not recognize or communicate emotions. Modern GUIs are trying to infer the likely emotional state and personality of the user and communicate through a corresponding emotional state. Emotions are expressed in graphical icons, sounds, pictures and other means. Emotions are found to be especially useful in communication software, interactive learning systems, robotics and other adaptive environments. Various mechanisms have been develo...

  5. ‘Emotional Intelligence’: Lessons from Lesions

    Science.gov (United States)

    Hogeveen, J.; Salvi, C.; Grafman, J.

    2018-01-01

    ‘Emotional intelligence’ (EI) is one of the most highly used psychological terms in popular nomenclature, yet its construct, divergent, and predictive validities are contentiously debated. Despite this debate, the EI construct is composed of a set of emotional abilities – recognizing emotional states in the self and others, using emotions to guide thought and behavior, understanding how emotions shape behavior, and emotion regulation – that undoubtedly influence important social and personal outcomes. In this review, evidence from human lesion studies is reviewed in order to provide insight into the necessary brain regions for each of these core emotional abilities. Critically, we consider how this neuropsychological evidence might help to guide efforts to define and measure EI. PMID:27647325

  6. Feeling something without knowing why : measuring emotions toward archetypal content

    NARCIS (Netherlands)

    Chang, H.M.; Ivonin, L.; Chen, W.; Rauterberg, G.W.M.; Mancs, M.; d'Alessandro, N.; Siebert, X.; Gosselin, B.; Valderrama, C.; Dutoit, T.

    2013-01-01

    To enhance communication among users through technology, we propose a framework that communicates ‘pure experience.’ This framework can be achieved by providing emotionally charged communication. To initiate this undertaking, we propose to explore materials for communicating human emotions. Research

  7. Emotion of Physiological Signals Classification Based on TS Feature Selection

    Institute of Scientific and Technical Information of China (English)

    Wang Yujing; Mo Jianlin

    2015-01-01

    This paper proposes a TS-MLP method for emotion recognition from physiological signals. It recognizes emotions by using Tabu search to select features from the emotions' physiological signals and a multilayer perceptron to classify the emotions. Simulations show that the method achieves good emotion classification performance.
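
    As a rough illustration of the general TS-MLP idea described above (a tabu search over binary feature masks scored by a multilayer perceptron's cross-validated accuracy), the following Python sketch uses scikit-learn and synthetic data; the tabu-list length, neighbourhood sampling, and network size are arbitrary assumptions rather than the paper's settings.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier

        # Synthetic stand-in for physiological-signal features (not the paper's data)
        X, y = make_classification(n_samples=200, n_features=30, n_informative=8, random_state=0)
        rng = np.random.default_rng(0)

        def score(mask):
            """Cross-validated accuracy of an MLP restricted to the selected features."""
            if not mask.any():
                return 0.0
            clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=300, random_state=0)
            return cross_val_score(clf, X[:, mask], y, cv=3).mean()

        current = rng.random(X.shape[1]) < 0.5            # random initial feature subset
        best_mask, best_score = current.copy(), score(current)
        tabu = []                                         # recently flipped features are tabu

        for _ in range(15):                               # short tabu-search run
            candidates = [f for f in range(X.shape[1]) if f not in tabu]
            sample = rng.choice(candidates, size=min(8, len(candidates)), replace=False)
            moves = []
            for f in sample:                              # evaluate single-feature flips
                neighbour = current.copy()
                neighbour[f] = not neighbour[f]
                moves.append((score(neighbour), f, neighbour))
            s, f, neighbour = max(moves, key=lambda m: m[0])
            current = neighbour                           # accept best sampled non-tabu move
            tabu = (tabu + [int(f)])[-5:]                 # fixed-length tabu list
            if s > best_score:
                best_mask, best_score = current.copy(), s

        print("selected features:", np.flatnonzero(best_mask))
        print("cross-validated accuracy:", round(best_score, 3))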

  8. Computer-mediated communication preferences predict biobehavioral measures of social-emotional functioning.

    Science.gov (United States)

    Babkirk, Sarah; Luehring-Jones, Peter; Dennis-Tiwary, Tracy A

    2016-12-01

    The use of computer-mediated communication (CMC) as a form of social interaction has become increasingly prevalent, yet few studies examine individual differences that may shed light on implications of CMC for adjustment. The current study examined neurocognitive individual differences associated with preferences to use technology in relation to social-emotional outcomes. In Study 1 (N = 91), a self-report measure, the Social Media Communication Questionnaire (SMCQ), was evaluated as an assessment of preferences for communicating positive and negative emotions on a scale ranging from purely via CMC to purely face-to-face. In Study 2, SMCQ preferences were examined in relation to event-related potentials (ERPs) associated with early emotional attention capture and reactivity (the frontal N1) and later sustained emotional processing and regulation (the late positive potential (LPP)). Electroencephalography (EEG) was recorded while 22 participants passively viewed emotional and neutral pictures and completed an emotion regulation task with instructions to increase, decrease, or maintain their emotional responses. A greater preference for CMC was associated with reduced size of and satisfaction with social support, greater early (N1) attention capture by emotional stimuli, and reduced LPP amplitudes to unpleasant stimuli in the increase emotion regulatory task. These findings are discussed in the context of possible emotion- and social-regulatory functions of CMC.
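
    The two ERP measures named above (the frontal N1 and the late positive potential) are typically quantified as mean amplitudes within fixed time windows of baseline-corrected epochs. The sketch below is a minimal NumPy illustration on simulated data; the sampling rate, channel indices, and window boundaries are assumptions, not the parameters used in the study.

        import numpy as np

        # epochs: (n_trials, n_channels, n_samples) baseline-corrected EEG in microvolts.
        # Everything here is simulated; sfreq, channel picks and windows are assumptions.
        sfreq = 250                              # Hz
        times = np.arange(-0.2, 1.0, 1 / sfreq)  # epoch from -200 ms to 1000 ms
        rng = np.random.default_rng(0)
        epochs = rng.normal(size=(40, 32, times.size))

        frontal = [0, 1, 2]                      # assumed frontal channel indices
        centroparietal = [15, 16, 17]            # assumed centro-parietal channel indices

        def mean_amplitude(data, channels, tmin, tmax):
            """Average voltage over trials, channels and a time window."""
            window = (times >= tmin) & (times < tmax)
            return data[:, channels, :][:, :, window].mean()

        n1 = mean_amplitude(epochs, frontal, 0.10, 0.15)          # early frontal window
        lpp = mean_amplitude(epochs, centroparietal, 0.40, 0.80)  # later sustained window
        print(f"N1 mean amplitude ~ {n1:.2f} uV, LPP mean amplitude ~ {lpp:.2f} uV")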

  9. The Effect of Emotional Intelligence on Student Success

    Science.gov (United States)

    Chapin, Krysta

    2015-01-01

    Emotional intelligence (EI) is the ability to recognize, assess, and control one's emotions, as well as the emotions of others, and even groups. It also allows people to handle added pressures, as they often experience in higher education. Occasionally clinicians report a small number of senior veterinary medicine students lack the ability to…

  10. Emotion in reinforcement learning agents and robots: A survey

    OpenAIRE

    Moerland, T.M.; Broekens, D.J.; Jonker, C.M.

    2018-01-01

    This article provides the first survey of computational models of emotion in reinforcement learning (RL) agents. The survey focuses on agent/robot emotions, and mostly ignores human user emotions. Emotions are recognized as functional in decision-making by influencing motivation and action selection. Therefore, computational emotion models are usually grounded in the agent's decision making architecture, of which RL is an important subclass. Studying emotions in RL-based agents is useful for ...

  11. A New Concept of Marketing: The Emotional Marketing

    Directory of Open Access Journals (Sweden)

    Domenico Consoli

    2010-03-01

    Full Text Available Nowadays, in the marketing area, a new concept of marketing is emerging: emotional marketing. Emotional marketing studies how to arouse emotions in people to induce them to buy a particular product/service. Recent studies have shown that purchasing choices and decisions are the result of a careful analysis of rational and emotional aspects. Psychological literature recognizes that emotional conditions influence every stage of decision-making in the purchasing process. Emotions play a key role in any kind of social or business decision. Emotions are manifested in verbal, facial and textual expressions; when people speak, interact and write, they convey emotions. Keywords: emotions, emotional marketing, emotional brand, emotional intelligence, emotions measurement.

  12. Differential emotional processing in concrete and abstract words.

    Science.gov (United States)

    Yao, Bo; Keitel, Anne; Bruce, Gillian; Scott, Graham G; O'Donnell, Patrick J; Sereno, Sara C

    2018-02-12

    Emotion (positive and negative) words are typically recognized faster than neutral words. Recent research suggests that emotional valence, while often treated as a unitary semantic property, may be differentially represented in concrete and abstract words. Studies that have explicitly examined the interaction of emotion and concreteness, however, have demonstrated inconsistent patterns of results. Moreover, these findings may be limited as certain key lexical variables (e.g., familiarity, age of acquisition) were not taken into account. We investigated the emotion-concreteness interaction in a large-scale, highly controlled lexical decision experiment. A 3 (Emotion: negative, neutral, positive) × 2 (Concreteness: abstract, concrete) design was used, with 45 items per condition and 127 participants. We found a significant interaction between emotion and concreteness. Although positive and negative valenced words were recognized faster than neutral words, this emotion advantage was significantly larger in concrete than in abstract words. We explored potential contributions of participant alexithymia level and item imageability to this interactive pattern. We found that only word imageability significantly modulated the emotion-concreteness interaction. While both concrete and abstract emotion words are advantageously processed relative to comparable neutral words, the mechanisms of this facilitation are paradoxically more dependent on imageability in abstract words. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. Associations between feeling and judging the emotions of happiness and fear: findings from a large-scale field experiment.

    Directory of Open Access Journals (Sweden)

    Tony W Buchanan

    2010-05-01

    Full Text Available How do we recognize emotions from other people? One possibility is that our own emotional experiences guide us in the online recognition of emotion in others. A distinct but related possibility is that emotion experience helps us to learn how to recognize emotions in childhood. We explored these ideas in a large sample of people (N = 4,608) ranging from 5 to over 50 years old. Participants were asked to rate the intensity of emotional experience in their own lives, as well as to perform a task of facial emotion recognition. Those who reported more intense experience of fear and happiness were significantly more accurate (closer to prototypical) in recognizing facial expressions of fear and happiness, respectively, and intense experience of fear was associated also with more accurate recognition of surprised and happy facial expressions. The associations held across all age groups. These results suggest that the intensity of one's own emotional experience of fear and happiness correlates with the ability to recognize these emotions in others, and demonstrate such an association as early as age 5.
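
    The association reported above amounts to correlating self-reported emotion intensity with recognition accuracy within age groups. The following sketch shows that kind of analysis on simulated data using SciPy; the sample size, age bins, and variable names are illustrative assumptions only.

        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(0)
        n = 300                                         # illustrative sample, not the study's N
        age = rng.integers(5, 55, n)
        fear_intensity = rng.normal(size=n)             # self-reported intensity (simulated)
        fear_accuracy = 0.3 * fear_intensity + rng.normal(size=n)

        for low, high in [(5, 12), (13, 17), (18, 29), (30, 54)]:   # illustrative age bins
            in_bin = (age >= low) & (age <= high)
            r, p = pearsonr(fear_intensity[in_bin], fear_accuracy[in_bin])
            print(f"age {low}-{high}: r = {r:+.2f}, p = {p:.3f}, n = {in_bin.sum()}")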

  14. A qualitative thematic review: emotional labour in healthcare settings.

    Science.gov (United States)

    Riley, Ruth; Weiss, Marjorie C

    2016-01-01

    To identify the range of emotional labour employed by healthcare professionals in a healthcare setting and implications of this for staff and organisations. In a healthcare setting, emotional labour is the act or skill involved in the caring role, in recognizing the emotions of others and in managing our own. A thematic synthesis of qualitative studies which included emotion work theory in their design, employed qualitative methods and were situated in a healthcare setting. The reporting of the review was informed by the ENTREQ framework. Six databases were searched between 1979 and 2014. Studies were included if they were qualitative, employed emotion work theory and were written in English. Papers were appraised and themes identified. Thirteen papers were included. The reviewed studies identified four key themes: (1) The professionalization of emotion and gendered aspects of emotional labour; (2) Intrapersonal aspects of emotional labour - how healthcare workers manage their own emotions in the workplace; (3) Collegial and organisational sources of emotional labour; (4) Support and training needs of professionals. This review identified gendered, personal, organisational, collegial and socio-cultural sources of and barriers to emotional labour in healthcare settings. The review highlights the importance of ensuring emotional labour is recognized and valued, ensuring support and supervision is in place to enable staff to cope with the varied emotional demands of their work. © 2015 John Wiley & Sons Ltd.

  15. Recognizing emotions in faces : effects of acute tryptophan depletion and bright light

    NARCIS (Netherlands)

    aan het Rot, Marije; Coupland, Nicholas; Boivin, Diane B.; Benkelfat, Chawki; Young, Simon N.

    2010-01-01

    In healthy never-depressed individuals, acute tryptophan depletion (ATD) may selectively decrease the accurate recognition of fearful facial expressions. Here we investigated the perception of facial emotions after ATD in more detail. We also investigated whether bright light, which can reverse

  16. Rational emotions.

    Science.gov (United States)

    Meshulam, Meir; Winter, Eyal; Ben-Shakhar, Gershon; Aharon, Itzhak

    2012-01-01

    We present here the concept of rational emotions: Emotions may be directly controlled and utilized in a conscious, analytic fashion, enabling an individual to size up a situation, to determine that a certain "mental state" is strategically advantageous and adjust accordingly. Building on the growing body of literature recognizing the vital role of emotions in determining decisions, we explore the complementary role of rational choice in choosing emotional states. Participants played the role of "recipient" in the dictator game, in which an anonymous "dictator" decides how to split an amount of money between himself and the recipient. A subset of recipients was given a monetary incentive to be angry at low-split offers. That subset demonstrated increased physiological arousal at low offers relative to high offers as well as more anger than other participants. These results provide a fresh outlook on human decision-making and contribute to the continuing effort to build more complete models of rational behavior.

  17. The Cultivation of Pure Altruism via Gratitude: A Functional MRI Study of Change with Gratitude Practice.

    Science.gov (United States)

    Karns, Christina M; Moore, William E; Mayr, Ulrich

    2017-01-01

    Gratitude is an emotion and a trait linked to well-being and better health, and welcoming benefits to oneself is instrumentally valuable. However, theoretical and empirical work highlights that gratitude is more fully understood as an intrinsically valuable moral emotion. To understand the role of neural reward systems in the association between gratitude and altruistic motivations we tested two hypotheses: First, whether self-reported propensity toward gratitude relates to fMRI-derived indicators of "pure altruism," operationalized as the neural valuation of passive, private transfers to a charity versus to oneself. In young adult female participants, self-reported gratitude and altruism were associated with "neural pure altruism" in ventromedial prefrontal cortex (VMPFC) and nucleus accumbens. Second, whether neural pure altruism can be increased through practicing gratitude. In a double-blind study, we randomly assigned participants to either a gratitude-journal or active-neutral control journal group for 3 weeks. Relative to pre-test levels, gratitude journaling increased the neural pure altruism response in the VMPFC. We posit that as a context-dependent value-sensitive cortical region, the VMPFC supports change with gratitude practice, a change that is larger for benefits to others versus oneself.
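
    The operationalization of "neural pure altruism" described above (the neural valuation of charity versus self transfers) can be sketched as a simple ROI contrast correlated with self-reported gratitude. The Python snippet below uses simulated beta estimates and hypothetical array names; it is an illustration of the general idea, not the authors' fMRI pipeline.

        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(1)
        n_subj, n_vox = 30, 500                        # illustrative sizes
        # Hypothetical GLM beta estimates within a reward-region ROI (simulated)
        beta_charity = rng.normal(loc=0.2, size=(n_subj, n_vox))
        beta_self = rng.normal(loc=0.0, size=(n_subj, n_vox))
        gratitude = rng.normal(size=n_subj)            # self-reported gratitude (simulated)

        # "Neural pure altruism": ROI-average valuation of charity minus self transfers
        pure_altruism = (beta_charity - beta_self).mean(axis=1)

        r, p = pearsonr(gratitude, pure_altruism)
        print(f"gratitude vs neural pure altruism: r = {r:+.2f}, p = {p:.3f}")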

  18. Pure and Public, Popular and personal

    DEFF Research Database (Denmark)

    Eriksson, Birgit

    2013-01-01

    In the article I reexamine the traditional aesthetical and political critiques of popular culture and reevaluate the social and communicative potential of bestselling cultural artifacts such as highly popular television series. First, I sketch the alleged aesthetic and social problems of popular ... and the exclusions of the public sphere. I argue that the ideals of a pure aesthetic and a public sphere neglect issues that are crucial to the type of commonality at stake in popular cultural artifacts: personal issues, social conflicts, and what is pleasurable to the senses or has to do with emotions. Third, I...

  19. Speech Emotion Feature Selection Method Based on Contribution Analysis Algorithm of Neural Network

    International Nuclear Information System (INIS)

    Wang Xiaojia; Mao Qirong; Zhan Yongzhao

    2008-01-01

    There are many emotion features. If all of these features are employed to recognize emotions, redundant features may exist; furthermore, the recognition results are unsatisfactory and the cost of feature extraction is high. In this paper, a method for selecting speech emotion features based on the contribution analysis algorithm of a neural network (NN) is presented. The emotion features are selected from the 95 extracted features by using the contribution analysis algorithm of the NN. Cluster analysis is applied to analyze the effectiveness of the selected features, and the time required for feature extraction is evaluated. Finally, the 24 selected emotion features are used to recognize six speech emotions. The experiments show that this method can improve the recognition rate and reduce the time of feature extraction
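
    The abstract does not spell out the contribution analysis algorithm, so the sketch below illustrates one common variant (a Garson-style decomposition of a trained network's weights) for selecting 24 of 95 features; the synthetic data, the scikit-learn MLP, and the weight-product rule are assumptions about the general approach, not the authors' exact method.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.neural_network import MLPClassifier

        # Synthetic stand-in for 95 speech-emotion features over 6 emotion classes
        X, y = make_classification(n_samples=600, n_features=95, n_informative=24,
                                   n_classes=6, n_clusters_per_class=1, random_state=0)

        net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0).fit(X, y)
        W_in, W_out = net.coefs_[0], net.coefs_[1]      # input->hidden and hidden->output weights

        # Garson-style contribution: share of |input->hidden| * |hidden->output| per feature
        prod = np.abs(W_in)[:, :, None] * np.abs(W_out)[None, :, :]   # (features, hidden, outputs)
        contrib = prod.sum(axis=(1, 2))
        contrib /= contrib.sum()

        selected = np.argsort(contrib)[::-1][:24]       # keep the 24 highest-contribution features
        print("selected feature indices:", np.sort(selected))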

  20. Toward Emotional Internet of Things for Smart Industry

    OpenAIRE

    Gómez Jáuregui , David Antonio

    2017-01-01

    International audience; In this paper, an approach to design and implement non-invasive and wearable emotion recognition technologies in smart industries is proposed. The proposed approach benefits from the interconnectivity of Internet of Things (IoT) to recognize and adapt to complex negative emotional states of employees (e.g., stress, frustration, etc.). Two types of connected objects are proposed: emotional detectors and emotional actors. The steps to design and implement these connected...

  1. Neural basis of emotion recognition deficits in first-episode major depression

    NARCIS (Netherlands)

    van Wingen, G. A.; van Eijndhoven, P.; Tendolkar, I.; Buitelaar, J.; Verkes, R. J.; Fernández, G.

    2011-01-01

    Depressed individuals demonstrate a poorer ability to recognize the emotions of others, which could contribute to difficulties in interpersonal behaviour. This emotion recognition deficit appears related to the depressive state and is particularly pronounced when emotions are labelled semantically.

  2. Neural basis of emotion recognition deficits in first-episode major depression

    NARCIS (Netherlands)

    Wingen, G.A. van; Eijndhoven, P.F.P. van; Tendolkar, I.; Buitelaar, J.K.; Verkes, R.J.; Fernandez, G.S.E.

    2011-01-01

    BACKGROUND: Depressed individuals demonstrate a poorer ability to recognize the emotions of others, which could contribute to difficulties in interpersonal behaviour. This emotion recognition deficit appears related to the depressive state and is particularly pronounced when emotions are labelled

  3. Age-related differences in emotion recognition ability: a cross-sectional study.

    Science.gov (United States)

    Mill, Aire; Allik, Jüri; Realo, Anu; Valk, Raivo

    2009-10-01

    Experimental studies indicate that recognition of emotions, particularly negative emotions, decreases with age. However, there is no consensus at which age the decrease in emotion recognition begins, how selective this is to negative emotions, and whether this applies to both facial and vocal expression. In the current cross-sectional study, 607 participants ranging in age from 18 to 84 years (mean age = 32.6 +/- 14.9 years) were asked to recognize emotions expressed either facially or vocally. In general, older participants were found to be less accurate at recognizing emotions, with the most distinctive age difference pertaining to a certain group of negative emotions. Both modalities revealed an age-related decline in the recognition of sadness and -- to a lesser degree -- anger, starting at about 30 years of age. Although age-related differences in the recognition of expression of emotion were not mediated by personality traits, 2 of the Big 5 traits, openness and conscientiousness, made an independent contribution to emotion-recognition performance. Implications of age-related differences in facial and vocal emotion expression and early onset of the selective decrease in emotion recognition are discussed in terms of previous findings and relevant theoretical models.

  4. On Emotional Intelligence: A Conversation with Daniel Goleman.

    Science.gov (United States)

    O'Neil, John

    1996-01-01

    Emotional intelligence involves a cluster of skills, including self-control, zeal, persistence, and self-motivation. Every child must be taught the essentials of handling anger, managing conflicts, developing empathy, and controlling impulses. Schools must help children recognize and manage their emotions. Educators should model emotional…

  5. Body Emotion Recognition Disproportionately Depends on Vertical Orientations during Childhood

    Science.gov (United States)

    Balas, Benjamin; Auen, Amanda; Saville, Alyson; Schmidt, Jamie

    2018-01-01

    Children's ability to recognize emotional expressions from faces and bodies develops during childhood. However, the low-level features that support accurate body emotion recognition during development have not been well characterized. This is in marked contrast to facial emotion recognition, which is known to depend upon specific spatial frequency…

  6. Do Dynamic Facial Expressions Convey Emotions to Children Better than Do Static Ones?

    Science.gov (United States)

    Widen, Sherri C.; Russell, James A.

    2015-01-01

    Past research has shown that children recognize emotions from facial expressions poorly and improve only gradually with age, but the stimuli in such studies have been static faces. Because dynamic faces include more information, it may well be that children more readily recognize emotions from dynamic facial expressions. The current study of…

  7. Perception of emotion in frontotemporal dementia and Alzheimer disease.

    Science.gov (United States)

    Lavenu, I; Pasquier, F; Lebert, F; Petit, H; Van der Linden, M

    1999-01-01

    Frontotemporal dementia (FTD) is the second most common cause of degenerative dementia. Behavioral changes occur before the cognitive decline and remain the major feature. A poor perception of emotion could account for some behavioral symptoms. The aim of this study was to assess the perception of emotion in patients with FTD and to compare it with that of patients with Alzheimer disease (AD). Fifty subjects performed the tests: 20 patients with probable AD, 18 patients with FTD, and 12 matched controls. The two patient groups did not differ in age, sex, severity of dementia, duration of the disease, or language test performance. Subjects had to recognize and point out the name of one of seven basic emotions (anger, disgust, happiness, fear, sadness, surprise, and contempt) on a set of 28 faces presented on slides. The three groups were equally able to distinguish a face displaying affect from one not displaying affect. Naming of emotion was worse in patients with FTD than in patients with AD (correct answers 46% vs. 62%; p = 0.0006), who did not differ significantly from controls (72%). Anger, sadness, and disgust were less well recognized in FTD than in AD patients, who did not differ from controls, whereas fear and contempt were poorly recognized in both groups of patients compared with controls. These findings argue for different neural substrates underlying the recognition of various basic emotions. Behavioral disorders in FTD may be partly due to an impaired interpretation of the emotional environment.

  8. Emotion-affected decision making in human simulation

    OpenAIRE

    Zhao, Y; Kang, J; Wright, D K

    2006-01-01

    Human modelling is an interdisciplinary research field. The topic, emotion-affected decision making, was originally a cognitive psychology issue, but is now recognized as an important research direction for both computer science and biomedical modelling. The main aim of this paper is to attempt to bridge the gap between psychology and bioengineering in emotion-affected decision making. The work is based on Ortony's theory of emotions and bounded rationality theory, and attempts to connect the...

  9. Emotional Intelligence as Assessed by Situational Judgment and Emotion Recognition Tests: Building the Nomological Net

    Directory of Open Access Journals (Sweden)

    Carolyn MacCann

    2011-12-01

    Full Text Available Recent research on emotion recognition ability (ERA) suggests that the capacity to process emotional information may differ for disparate emotions. However, little research has examined whether this finding holds for emotional understanding and emotion management, as well as emotion recognition. Moreover, little research has examined whether the abilities to recognize emotions, understand emotions, and manage emotions form a distinct emotional intelligence (EI) construct that is independent from traditional cognitive ability factors. The current study addressed these issues. Participants (N=118) completed two ERA measures, two situational judgment tests assessing emotional understanding and emotion management, and three cognitive ability tests. Exploratory and confirmatory factor analyses of both the understanding and management item parcels showed that a three-factor model relating to fear, sadness, and anger content was a better fit than a one-factor model, supporting an emotion-specific view of EI. In addition, an EI factor composed of emotion recognition, emotional understanding, and emotion management was distinct from a cognitive ability factor composed of a matrices task, general knowledge test, and reading comprehension task. Results are discussed in terms of their potential implications for theory and practice, as well as the integration of EI research with known models of cognitive ability.
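
    The model comparison described above (a three-factor versus a one-factor solution over emotion-specific item parcels) can be sketched with an exploratory factor analysis on simulated parcels. scikit-learn's FactorAnalysis and the rough BIC computed below stand in for the study's CFA software and fit indices, so this is an illustration of the comparison's logic only.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        n = 118                                           # sample size reported above
        # Simulated fear/sadness/anger parcels: three correlated triplets of parcels
        latent = rng.normal(size=(n, 3))
        loadings = np.zeros((3, 9))
        for k in range(3):
            loadings[k, 3 * k:3 * k + 3] = 0.8
        parcels = latent @ loadings + rng.normal(scale=0.6, size=(n, 9))

        for k in (1, 3):
            fa = FactorAnalysis(n_components=k, random_state=0).fit(parcels)
            avg_ll = fa.score(parcels)                    # average log-likelihood per observation
            n_params = 9 * k + 9                          # loadings + unique variances (rough count)
            bic = -2 * avg_ll * n + n_params * np.log(n)
            print(f"{k}-factor model: avg log-likelihood = {avg_ll:.2f}, BIC ~ {bic:.1f}")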

  10. Gender differences in facial emotion recognition in persons with chronic schizophrenia.

    Science.gov (United States)

    Weiss, Elisabeth M; Kohler, Christian G; Brensinger, Colleen M; Bilker, Warren B; Loughead, James; Delazer, Margarete; Nolan, Karen A

    2007-03-01

    The aim of the present study was to investigate possible sex differences in the recognition of facial expressions of emotion and to investigate the pattern of classification errors in schizophrenic males and females. Such an approach provides an opportunity to inspect the degree to which males and females differ in perceiving and interpreting the different emotions displayed to them and to analyze which emotions are most susceptible to recognition errors. Fifty-six chronically hospitalized schizophrenic patients (38 men and 18 women) completed the Penn Emotion Recognition Test (ER40), a computerized emotion discrimination test presenting 40 color photographs of evoked happy, sad, angry, and fearful expressions as well as neutral expressions, balanced for poser gender and ethnicity. We found a significant sex difference in the patterns of error rates in the Penn Emotion Recognition Test. Neutral faces were more commonly mistaken as angry by schizophrenic men, whereas schizophrenic women misinterpreted neutral faces more frequently as sad. Moreover, female faces were better recognized overall, but fear was better recognized in same-gender photographs, whereas anger was better recognized in different-gender photographs. The findings of the present study lend support to the notion that sex differences in aggressive behavior could be related to a cognitive style characterized by hostile attributions to neutral faces in schizophrenic men.

  11. Evaluating music emotion recognition

    DEFF Research Database (Denmark)

    Sturm, Bob L.

    2013-01-01

    A fundamental problem with nearly all work in music genre recognition (MGR) is that evaluation lacks validity with respect to the principal goals of MGR. This problem also occurs in the evaluation of music emotion recognition (MER). Standard approaches to evaluation, though easy to implement, do not reliably differentiate between recognizing genre or emotion from music and recognizing them by virtue of confounding factors in signals (e.g., equalization). We demonstrate such problems for evaluating an MER system, and conclude with recommendations.

  12. Emotion perception accuracy and bias in face-to-face versus cyberbullying.

    Science.gov (United States)

    Ciucci, Enrica; Baroncelli, Andrea; Nowicki, Stephen

    2014-01-01

    The authors investigated the association of traditional and cyber forms of bullying and victimization with emotion perception accuracy and emotion perception bias. Four basic emotions were considered (i.e., happiness, sadness, anger, and fear); 526 middle school students (280 females; M age = 12.58 years, SD = 1.16 years) were recruited, and emotionality was controlled. Results indicated no significant findings for girls. Boys with higher levels of traditional bullying did not show any deficit in perception accuracy of emotions, but they were prone to identify happiness and fear in faces when a different emotion was expressed; in addition, male cyberbullying was related to greater accuracy in recognizing fear. In terms of the victims, cyber victims had a global problem in recognizing emotions and a specific problem in processing anger and fear. It was concluded that emotion perception accuracy and bias were associated with bullying and victimization for boys not only in traditional settings but also in the electronic ones. Implications of these findings for possible intervention are discussed.

  13. Children's developing ability to depict emotions in their drawings.

    Science.gov (United States)

    Bonoti, F; Misailidi, P

    2006-10-01

    Fifty-five children aged 5 to 9 years were asked to draw pictures depicting happiness, sadness, anger, surprise, and fear as well as pictures that did not express any emotion. These pictures were then scored by nonexpert adults for their overall emotional expressiveness, that is, how well they depicted the intended emotion. The results showed that drawings were generally regarded by adults as emotionally expressive. Happiness was the emotion most easily recognized in children's drawings, closely followed by sadness. The results also showed a linear increase in ratings of emotional expressiveness with age.

  14. Repetition and brain potentials when recognizing natural scenes: task and emotion differences

    Science.gov (United States)

    Bradley, Margaret M.; Codispoti, Maurizio; Karlsson, Marie; Lang, Peter J.

    2013-01-01

    Repetition has long been known to facilitate memory performance, but its effects on event-related potentials (ERPs), measured as an index of recognition memory, are less well characterized. In Experiment 1, effects of both massed and distributed repetition on old–new ERPs were assessed during an immediate recognition test that followed incidental encoding of natural scenes that also varied in emotionality. Distributed repetition at encoding enhanced both memory performance and the amplitude of an old–new ERP difference over centro-parietal sensors. To assess whether these repetition effects reflect encoding or retrieval differences, the recognition task was replaced with passive viewing of old and new pictures in Experiment 2. In the absence of an explicit recognition task, ERPs were completely unaffected by repetition at encoding, and only emotional pictures prompted a modestly enhanced old–new difference. Taken together, the data suggest that repetition facilitates retrieval processes and that, in the absence of an explicit recognition task, differences in old–new ERPs are only apparent for affective cues. PMID:22842817

  15. The Cultivation of Pure Altruism via Gratitude: A Functional MRI Study of Change with Gratitude Practice

    Directory of Open Access Journals (Sweden)

    Christina M. Karns

    2017-12-01

    Full Text Available Gratitude is an emotion and a trait linked to well-being and better health, and welcoming benefits to oneself is instrumentally valuable. However, theoretical and empirical work highlights that gratitude is more fully understood as an intrinsically valuable moral emotion. To understand the role of neural reward systems in the association between gratitude and altruistic motivations we tested two hypotheses: First, whether self-reported propensity toward gratitude relates to fMRI-derived indicators of “pure altruism,” operationalized as the neural valuation of passive, private transfers to a charity versus to oneself. In young adult female participants, self-reported gratitude and altruism were associated with “neural pure altruism” in ventromedial prefrontal cortex (VMPFC) and nucleus accumbens. Second, whether neural pure altruism can be increased through practicing gratitude. In a double-blind study, we randomly assigned participants to either a gratitude-journal or active-neutral control journal group for 3 weeks. Relative to pre-test levels, gratitude journaling increased the neural pure altruism response in the VMPFC. We posit that as a context-dependent value-sensitive cortical region, the VMPFC supports change with gratitude practice, a change that is larger for benefits to others versus oneself.

  16. Emotion-affected decision making in human simulation.

    Science.gov (United States)

    Zhao, Y; Kang, J; Wright, D K

    2006-01-01

    Human modelling is an interdisciplinary research field. The topic, emotion-affected decision making, was originally a cognitive psychology issue, but is now recognized as an important research direction for both computer science and biomedical modelling. The main aim of this paper is to attempt to bridge the gap between psychology and bioengineering in emotion-affected decision making. The work is based on Ortony's theory of emotions and bounded rationality theory, and attempts to connect the emotion process with decision making. A computational emotion model is proposed, and the initial framework of this model in virtual human simulation within the platform of Virtools is presented.

  17. Finding Emotional-Laden Resources on the World Wide Web

    Directory of Open Access Journals (Sweden)

    Diane Rasmussen Neal

    2011-03-01

    Full Text Available Some content in multimedia resources can depict or evoke certain emotions in users. The aim of Emotional Information Retrieval (EmIR) and of our research is to identify knowledge about emotional-laden documents and to use these findings in a new kind of World Wide Web information service that allows users to search and browse by emotion. Our prototype, called Media EMOtion SEarch (MEMOSE), is largely based on the results of research regarding emotive music pieces, images and videos. In order to index both evoked and depicted emotions in these three media types and to make them searchable, we work with a controlled vocabulary, slide controls to adjust the emotions’ intensities, and broad folksonomies to identify and separate the correct resource-specific emotions. This separation of so-called power tags is based on a tag distribution which follows either an inverse power law (only one emotion was recognized) or an inverse-logistic shape (two or three emotions were recognized). Both distributions are well known in information science. MEMOSE consists of a tool for tagging basic emotions with the help of slide controls, a processing device to separate power tags, a retrieval component consisting of a search interface (for any topic in combination with one or more emotions), and a results screen. The latter shows two separately ranked lists of items for each media type (depicted and felt emotions), displaying thumbnails of resources, ranked by the mean values of intensity. In the evaluation of the MEMOSE prototype, study participants described our EmIR system as an enjoyable Web 2.0 service.
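
    The power-tag separation described above keys on the shape of the tag distribution (inverse power law for one dominant emotion, inverse-logistic for two or three). The sketch below is a crude heuristic stand-in for that rule: the tag names, the drop-off ratio, and the 25% relative-frequency threshold are invented for illustration and are not the MEMOSE algorithm.

        from collections import Counter

        # Hypothetical folksonomy tags assigned to one video (names are invented)
        tags = ["joy"] * 42 + ["surprise"] * 31 + ["sadness"] * 6 + ["fear"] * 3 + ["anger"] * 2

        counts = Counter(tags)
        ranked = counts.most_common()                   # [(tag, count), ...] by frequency
        total = sum(counts.values())

        # Crude shape check: a steep drop after rank 1 suggests a power-law-like profile
        # (one power tag); a plateau over the top ranks suggests an inverse-logistic
        # profile (two or three power tags).
        if ranked[0][1] / ranked[1][1] >= 2.0:
            power_tags = [ranked[0][0]]
        else:
            power_tags = [t for t, c in ranked[:3] if c / total >= 0.25]

        print("power tags:", power_tags)                # -> ['joy', 'surprise'] here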

  18. Emotion as the amplifier and the primary motive: Some theories of emotion with relevance to language learning

    Directory of Open Access Journals (Sweden)

    Rebecca L. Oxford

    2015-01-01

    Full Text Available Emotion is crucial to living and learning. The powerful intertwining of emotion and cognition ignites learning within a complex dynamic system, which, as several sections of this paper show, also includes societal and cultural influences. As “the primary human motive” (MacIntyre, 2002a, p. 61), emotion operates as an amplifier, which provides energetic intensity to all human behavior, including language learning. This chapter explains major theories of emotion drawn from positive psychology, social psychology, social constructivism, social constructionism, and existential psychotherapy. It also offers implications for language learning related to understanding and managing emotions; expressing emotions appropriately despite cultural and linguistic differences; viewing emotions as transitory social roles; enhancing positive emotions and developing resilience; and recognizing, perhaps paradoxically, both the negative and the positive aspects of anxiety. The chapter concludes with the statement that language learners can become more agentic in dealing with their emotions. This form of self-regulation can lead to greater success in language learning.

  19. Impaired Attribution of Emotion to Facial Expressions in Anxiety and Major Depression

    NARCIS (Netherlands)

    Demenescu, Liliana R.; Kortekaas, Rudie; den Boer, Johan A.; Aleman, Andre

    2010-01-01

    Background: Recognition of others' emotions is an important aspect of interpersonal communication. In major depression, a significant emotion recognition impairment has been reported. It remains unclear whether the ability to recognize emotion from facial expressions is also impaired in anxiety

  20. Intelligent Tutor with Emotion Recognition and Student Emotion Management for Math Performance

    Directory of Open Access Journals (Sweden)

    María Lucía Barron Estrada

    2014-10-01

    Full Text Available This research presents the development, implementation, and testing of an Intelligent Tutoring System for third-grade elementary school math that identifies and manages the emotional state of the student and produces affective feedback for the student during the course; the system is also part of a social network. Emotions are recognized from facial expressions by means of an artificial neural network. The social network and the intelligent tutoring system with affective management have been tested in public and private elementary schools with very satisfying results.

  1. COGNITIVE STYLE OF A PERSON AS A FACTOR OF EFFECTIVE EMOTION RECOGNITION

    Directory of Open Access Journals (Sweden)

    E V Belovol

    2015-12-01

    Full Text Available Facial expression is one of the most informative sources of non-verbal information. Early studies on the ability to recognize emotions from the face pointed to the universality of emotion expression and recognition. More recent studies have shown a combination of universal mechanisms and culture-specific patterns. The process of emotion recognition is based on face perception, which is why the in-group effect should be taken into consideration. The in-group advantage hypothesis posits that observers are more accurate at recognizing facial expressions displayed by members of their own culture than by members of other cultures. On the other hand, the process of emotion recognition is determined by cognitive features such as cognitive style. This article describes approaches to emotion expression and recognition and culture-specific features of basic emotion expression. It also describes factors related to the recognition of basic emotions by people from different cultures. It was found that field-independent people are more accurate in emotion recognition than field-dependent people because they are better able to distinguish markers of emotions. No correlation was found between successful emotion recognition and the observers’ gender or race.

  2. Emotional Intelligence Tests: Potential Impacts on the Hiring Process for Accounting Students

    Science.gov (United States)

    Nicholls, Shane; Wegener, Matt; Bay, Darlene; Cook, Gail Lynn

    2012-01-01

    Emotional intelligence is increasingly recognized as being important for professional career success. Skills related to emotional intelligence (e.g. organizational commitment, public speaking, teamwork, and leadership) are considered essential. Human resource professionals have begun including tests of emotional intelligence (EI) in job applicant…

  3. Emotion Recognition in Adolescents with Down Syndrome: A Nonverbal Approach

    Directory of Open Access Journals (Sweden)

    Régis Pochon

    2017-05-01

    Full Text Available Several studies have reported that persons with Down syndrome (DS have difficulties recognizing emotions; however, there is insufficient research to prove that a deficit of emotional knowledge exists in DS. The aim of this study was to evaluate the recognition of emotional facial expressions without making use of emotional vocabulary, given the language problems known to be associated with this syndrome. The ability to recognize six emotions was assessed in 24 adolescents with DS. Their performance was compared to that of 24 typically developing children with the same nonverbal-developmental age, as assessed by Raven’s Progressive Matrices. Analysis of the results revealed no global difference; only marginal differences in the recognition of different emotions appeared. Study of the developmental trajectories revealed a developmental difference: the nonverbal reasoning level assessed by Raven’s matrices did not predict success on the experimental tasks in the DS group, contrary to the typically developing group. These results do not corroborate the hypothesis that there is an emotional knowledge deficit in DS and emphasize the importance of using dynamic, strictly nonverbal tasks in populations with language disorders.

  4. Comparing the Recognition of Emotional Facial Expressions in Patients with

    Directory of Open Access Journals (Sweden)

    Abdollah Ghasempour

    2014-05-01

    Full Text Available Background: Recognition of emotional facial expressions is one of the psychological factors involved in obsessive-compulsive disorder (OCD) and major depressive disorder (MDD). The aim of the present study was to compare the ability to recognize emotional facial expressions in patients with obsessive-compulsive disorder and major depressive disorder. Materials and Methods: The present study is a cross-sectional, ex-post facto investigation (causal-comparative method). Forty participants (20 patients with OCD, 20 patients with MDD) were selected through an availability sampling method from the clients referred to the Tabriz Bozorgmehr clinic. Data were collected through a Structured Clinical Interview and the Recognition of Emotional Facial States test. The data were analyzed using MANOVA. Results: The obtained results showed that there is no significant difference between the groups in the mean scores for recognizing the emotional states of surprise, sadness, happiness and fear, but the groups differed significantly in the mean scores for recognizing disgust and anger (p<0.05). Conclusion: Patients suffering from OCD and MDD show equal ability to recognize surprise, sadness, happiness and fear. However, the former are less competent than the latter in recognizing disgust and anger.

  5. Classification of emotions by multivariate analysis and individual differences of nuclear power plant operators' emotion

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, Naoko; Yoshimura, Seiichi [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1999-03-01

    The purpose of this study is the development of a simulation model which expresses operators' emotions during a plant emergency. This report presents a classification of emotions by multivariate analysis and the results of an investigation conducted to clarify individual differences in activated emotion as influenced by personal traits. Although a former investigation had classified emotions into the five basic emotions proposed by Johnson-Laird, these basic emotions were not based on real data. For the development of a more realistic and accurate simulation model, it is necessary to identify basic emotions and to classify emotions into them. As a result of analysis by qualification method 3 and cluster analysis, four basic clusters were identified: Emotion expressed towards objects, Emotion affected by objects, Pleasant emotion, and Surprise. Moreover, 51 emotions were ranked in order according to their similarities within each cluster. An investigation was conducted with 87 plant operators to clarify individual differences in the emotion process. The results showed differences in emotion depending on operators' foresight, cognitive style, experience in operation, and consciousness of attribution to an operating team. For example, operators with low self-efficacy, short experience, or low consciousness of attribution to a team feel more intense emotions during a plant emergency and are more affected by severe plant conditions. A model which can express these individual differences will be developed by utilizing and converting these data. (author)
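
    Of the two analysis steps described above, the sketch below illustrates only the clustering step: hierarchical (Ward) clustering of emotions cut into four clusters, using SciPy on a small made-up emotion-by-dimension matrix. The emotion names, coordinates, and cluster count are placeholders, not the study's data or its multivariate quantification analysis.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        emotions = ["anger", "contempt", "anxiety", "fear", "relief", "joy", "surprise", "shame"]
        rng = np.random.default_rng(0)
        # Hypothetical coordinates for each emotion on a few derived dimensions
        # (stand-ins for scores from a multivariate quantification analysis)
        coords = rng.normal(size=(len(emotions), 3))

        Z = linkage(coords, method="ward")               # agglomerative (Ward) clustering
        labels = fcluster(Z, t=4, criterion="maxclust")  # cut the tree into four clusters

        for k in sorted(set(labels)):
            members = [e for e, lab in zip(emotions, labels) if lab == k]
            print(f"cluster {k}: {members}")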

  6. Classification of emotions by multivariate analysis and individual differences of nuclear power plant operators' emotion

    International Nuclear Information System (INIS)

    Hasegawa, Naoko; Yoshimura, Seiichi

    1999-01-01

    The purpose of this study is the development of a simulation model which expresses operators' emotions during a plant emergency. This report presents a classification of emotions by multivariate analysis and the results of an investigation conducted to clarify individual differences in activated emotion as influenced by personal traits. Although a former investigation had classified emotions into the five basic emotions proposed by Johnson-Laird, these basic emotions were not based on real data. For the development of a more realistic and accurate simulation model, it is necessary to identify basic emotions and to classify emotions into them. As a result of analysis by qualification method 3 and cluster analysis, four basic clusters were identified: Emotion expressed towards objects, Emotion affected by objects, Pleasant emotion, and Surprise. Moreover, 51 emotions were ranked in order according to their similarities within each cluster. An investigation was conducted with 87 plant operators to clarify individual differences in the emotion process. The results showed differences in emotion depending on operators' foresight, cognitive style, experience in operation, and consciousness of attribution to an operating team. For example, operators with low self-efficacy, short experience, or low consciousness of attribution to a team feel more intense emotions during a plant emergency and are more affected by severe plant conditions. A model which can express these individual differences will be developed by utilizing and converting these data. (author)

  7. Electrodermal Reactivity to Emotion Processing in Adults with Autistic Spectrum Disorders

    Science.gov (United States)

    Hubert, B. E.; Wicker, B.; Monfardini, E.; Deruelle, C.

    2009-01-01

    Although alterations of emotion processing are recognized as a core component of autism, the level at which alterations occur is still debated. Discrepant results suggest that overt assessment of emotion processing is not appropriate. In this study, skin conductance response (SCR) was used to examine covert emotional processes. Both behavioural…

  8. T cells recognizing a peptide contaminant undetectable by mass spectrometry

    DEFF Research Database (Denmark)

    Brezar, Vedran; Culina, Slobodan; Østerbye, Thomas

    2011-01-01

    Synthetic peptides are widely used in immunological research as epitopes to stimulate their cognate T cells. These preparations are never completely pure, and trace contaminants are commonly revealed by mass spectrometry quality controls. In an effort to characterize novel major histocompatibility complex (MHC) Class I-restricted β-cell epitopes in non-obese diabetic (NOD) mice, we identified islet-infiltrating CD8+ T cells recognizing a contaminating peptide. The amount of this contaminant was so small as to be undetectable by direct mass spectrometry. Only after concentration by liquid chromatography did we observe a mass peak corresponding to an immunodominant islet-specific glucose-6-phosphatase catalytic subunit-related protein (IGRP)(206-214) epitope described in the literature. Generation of CD8+ T-cell clones recognizing IGRP(206-214) using a novel method confirmed the identity...

  9. Investigating emotional contagion in dogs (Canis familiaris) to emotional sounds of humans and conspecifics.

    Science.gov (United States)

    Huber, Annika; Barber, Anjuli L A; Faragó, Tamás; Müller, Corsin A; Huber, Ludwig

    2017-07-01

    Emotional contagion, a basic component of empathy defined as emotional state-matching between individuals, has previously been shown in dogs even upon solely hearing negative emotional sounds of humans or conspecifics. The current investigation further sheds light on this phenomenon by directly contrasting emotional sounds of both species (humans and dogs) as well as opposed valences (positive and negative) to gain insights into intra- and interspecies empathy as well as differences between positively and negatively valenced sounds. Different types of sounds were played back to measure the influence of three dimensions on the dogs' behavioural response. We found that dogs behaved differently after hearing non-emotional sounds of their environment compared to emotional sounds of humans and conspecifics ("Emotionality" dimension), but the subjects responded similarly to human and conspecific sounds ("Species" dimension). However, dogs expressed more freezing behaviour after conspecific sounds, independent of the valence. Comparing positively with negatively valenced sounds of both species ("Valence" dimension), we found that, independent of the species from which the sound originated, dogs expressed more behavioural indicators for arousal and negatively valenced states after hearing negative emotional sounds. This response pattern indicates emotional state-matching or emotional contagion for negative sounds of humans and conspecifics. It furthermore indicates that dogs recognized the different valences of the emotional sounds, which is a promising finding for future studies on empathy for positive emotional states in dogs.

  10. Intelligibility of emotional speech in younger and older adults.

    Science.gov (United States)

    Dupuis, Kate; Pichora-Fuller, M Kathleen

    2014-01-01

    Little is known about the influence of vocal emotions on speech understanding. Word recognition accuracy for stimuli spoken to portray seven emotions (anger, disgust, fear, sadness, neutral, happiness, and pleasant surprise) was tested in younger and older listeners. Emotions were presented in either mixed (heterogeneous emotions mixed in a list) or blocked (homogeneous emotion blocked in a list) conditions. Three main hypotheses were tested. First, vocal emotion affects word recognition accuracy; specifically, portrayals of fear enhance word recognition accuracy because listeners orient to threatening information and/or distinctive acoustical cues such as high pitch mean and variation. Second, older listeners recognize words less accurately than younger listeners, but the effects of different emotions on intelligibility are similar across age groups. Third, blocking emotions in a list results in better word recognition accuracy, especially for older listeners, and reduces the effect of emotion on intelligibility because, as listeners develop expectations about vocal emotion, the allocation of processing resources can shift from emotional to lexical processing. Emotion was the within-subjects variable: all participants heard speech stimuli consisting of a carrier phrase followed by a target word spoken by either a younger or an older talker, with an equal number of stimuli portraying each of seven vocal emotions. The speech was presented in multi-talker babble at signal-to-noise ratios adjusted for each talker and each listener age group. Listener age (younger, older), condition (mixed, blocked), and talker (younger, older) were the main between-subjects variables. Fifty-six students (M age = 18.3 years) were recruited from an undergraduate psychology course; 56 older adults (M age = 72.3 years) were recruited from a volunteer pool. All participants had clinically normal pure-tone audiometric thresholds at frequencies ≤3000 Hz. There were significant main effects of

  11. Notes on the conceptual construction of emotions and bodies

    Directory of Open Access Journals (Sweden)

    Rafael Andrés Sánchez Aguirre

    2013-12-01

    Full Text Available This paper presents a reflection on the process-sociological study of emotions in order to recognize their corporal-social dynamics. This conceptual exercise has two stages. First, it reviews Norbert Elias’ analytical proposal developed in his text On human beings and their emotions: a process-sociological essay, highlighting key ideas and research hypotheses suggested for the consideration of emotional phenomena. Second, it revisits some ideas of Marx and Scribano in order to problematize the place of bodies-emotions in capitalist society.

  12. Measuring emotion regulation and emotional expression in breast cancer patients: A systematic review.

    Science.gov (United States)

    Brandão, Tânia; Tavares, Rita; Schulz, Marc S; Matos, Paula Mena

    2016-02-01

    The important role of emotion regulation and expression in adaptation to breast cancer is now widely recognized. Studies have shown that optimal emotion regulation strategies, including less constrained emotional expression, are associated with better adaptation. Our objective was to systematically review measures used to assess the way women with breast cancer regulate their emotions. This systematic review was conducted in accordance with PRISMA guidelines. Nine different databases were searched. Data were independently extracted and assessed by two researchers. English-language articles that used at least one instrument to measure strategies to regulate emotions in women with breast cancer were included. Of 679 abstracts identified, 59 studies were deemed eligible for inclusion. Studies were coded regarding their objectives, methods, and results. We identified 16 instruments used to measure strategies of emotion regulation and expression. The most frequently employed instrument was the Courtauld Emotional Control Scale. Few psychometric properties other than internal consistency were reported for most instruments. Many studies did not include important information regarding descriptive characteristics and psychometric properties of the instruments used. The instruments used tap different aspects of emotion regulation. Specific instruments should be explored further with regard to content, validity, and reliability in the context of breast cancer. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Embodying Emotional Disorders: New Hypotheses about Possible Emotional Consequences of Motor Disorders in Parkinson's Disease and Tourette's Syndrome.

    Science.gov (United States)

    Mermillod, Martial; Vermeulen, Nicolas; Droit-Volet, Sylvie; Jalenques, Isabelle; Durif, Franck; Niedenthal, Paula

    2011-01-01

    Parkinson's disease (PD) and Tourette's syndrome (TS) lead to important motor disorders in patients, such as possible facial amimia in PD and tics in TS. Under the grounded cognition framework, which shows the importance of motor embodiment in emotional feeling (Niedenthal, 2007), both types of pathology with motor symptoms should be sufficient to induce potential impairments for these patients when recognizing emotional facial expressions (EFE). In this opinion paper, we describe a theoretical framework that assumes potential emotional disorders in Parkinson's disease and Tourette's syndrome based on the motor disorders characterizing these two pathologies. We also review methodological limitations in previous experimental designs that could have enabled the identification of emotional facial expressions despite emotional disorders in PD and TS.

  14. Automated Facial Coding Software Outperforms People in Recognizing Neutral Faces as Neutral from Standardized Datasets

    Directory of Open Access Journals (Sweden)

    Peter eLewinski

    2015-09-01

    Full Text Available Little is known about people’s accuracy of recognizing neutral faces as neutral. In this paper, I demonstrate the importance of knowing how well people recognize neutral faces. I contrasted human recognition scores of 100 typical, neutral front-up facial images with scores of an arguably objective judge – automated facial coding (AFC) software. I hypothesized that the software would outperform humans in recognizing neutral faces because of the inherently objective nature of computer algorithms. Results confirmed this hypothesis. I provided the first-ever evidence that computer software (90%) was more accurate in recognizing neutral faces than people were (59%). I posited two theoretical mechanisms, i.e. smile-as-a-baseline and false recognition of emotion, as possible explanations for my findings.

  15. Recognition of facial emotion and perceived parental bonding styles in healthy volunteers and personality disorder patients.

    Science.gov (United States)

    Zheng, Leilei; Chai, Hao; Chen, Wanzhen; Yu, Rongrong; He, Wei; Jiang, Zhengyan; Yu, Shaohua; Li, Huichun; Wang, Wei

    2011-12-01

    Early parental bonding experiences play a role in emotion recognition and expression in later adulthood, and patients with personality disorder frequently experience inappropriate parental bonding styles, therefore the aim of the present study was to explore whether parental bonding style is correlated with recognition of facial emotion in personality disorder patients. The Parental Bonding Instrument (PBI) and the Matsumoto and Ekman Japanese and Caucasian Facial Expressions of Emotion (JACFEE) photo set tests were carried out in 289 participants. Patients scored lower on parental Care but higher on parental Freedom Control and Autonomy Denial subscales, and they displayed less accuracy when recognizing contempt, disgust and happiness than the healthy volunteers. In healthy volunteers, maternal Autonomy Denial significantly predicted accuracy when recognizing fear, and maternal Care predicted the accuracy of recognizing sadness. In patients, paternal Care negatively predicted the accuracy of recognizing anger, paternal Freedom Control predicted the perceived intensity of contempt, maternal Care predicted the accuracy of recognizing sadness, and the intensity of disgust. Parenting bonding styles have an impact on the decoding process and sensitivity when recognizing facial emotions, especially in personality disorder patients. © 2011 The Authors. Psychiatry and Clinical Neurosciences © 2011 Japanese Society of Psychiatry and Neurology.

  16. Impaired recognition of social emotion in patients with complex regional pain syndrome.

    Science.gov (United States)

    Shin, Na Young; Kang, Do-Hyung; Jang, Joon Hwan; Park, Soo Young; Hwang, Jae Yeon; Kim, Sung Nyun; Byun, Min Soo; Park, Hye Youn; Kim, Yong Chul

    2013-11-01

    Multiple brain areas involved in nociceptive, autonomic, and social-emotional processing are disproportionally changed in patients with complex regional pain syndrome (CRPS). Little empirical evidence is available involving social cognitive functioning in patients with chronic pain conditions. We investigated the ability of patients with CRPS to recognize the mental/emotional states of other people. Forty-three patients with CRPS and 30 healthy controls performed the Reading Mind in the Eyes Test, which consists of photos in which human eyes express various emotional and mental states. Neuropsychological tests, including the Wisconsin Card Sorting Test, the stop-signal test, and the reaction time test, were administered to evaluate other cognitive functions. Patients with CRPS were significantly less accurate at recognizing emotional states in other persons, but not on other cognitive tests, compared with control subjects. We found a significant association between the deficit in social-emotion recognition and the affective dimension of pain, whereas this deficit was not related to the sensory dimension of pain. Our findings suggest a disrupted ability to recognize others' mental/emotional states in patients with CRPS. This article demonstrated a deficit in inferring mental/emotional states of others in patients with CRPS that was related to pain affect. Our study suggests that additional interventions directed toward reducing distressful affective pain may be helpful to restore social cognitive processing in patients with CRPS. Copyright © 2013 American Pain Society. Published by Elsevier Inc. All rights reserved.

  17. Emotions as pragmatic and epistemic actions

    Science.gov (United States)

    Wilutzky, Wendy

    2015-01-01

    This paper explores the idea that emotions in social contexts and their intentionality may be conceived of as pragmatic or epistemic actions. That is, emotions are often aimed at achieving certain goals within a social context, so that they resemble pragmatic actions; and in other cases emotions can be plausibly construed as acts of probing the social environment so as to extract or uncover important information, thus complying with the functions of epistemic actions (cf. Kirsh and Maglio, 1994). This view of emotions stands at odds with the widely held conception that emotions' intentionality can be cashed out in terms of representations of value. On such a position, emotions' intentionality has only a mind-to-world direction of fit while any world-to-mind direction of fit is deemed secondary or is even outright denied. However, acknowledging that emotions (qua actions) also have a world-to-mind direction of fit has several advantages over the typical rendition of emotions as representations of value, such as accounting for emotions' sensitivity to contextual factors, variations in emotion expression and, importantly, assessing the appropriateness of emotional reactions. To substantiate this claim, several cases of emotions in social contexts are discussed, as the social dimension of emotions highlights that emotions are inherently ways of interacting with one's social environment. In sum, the construal of emotions in social contexts as pragmatic or epistemic actions yields a more fine-grained and accurate understanding of emotions' intentionality and their roles in social contexts than the insistence on a purely mind-to-world direction of fit. PMID:26578999

  18. The study of emotional intelligence at preadolescents from different environment

    Directory of Open Access Journals (Sweden)

    Racu Iulia

    2016-09-01

    Full Text Available The issue of emotional intelligence is an important one in the sphere of human resources, management, education and psychology. Emotional intelligence is the capability of individuals to recognize their own and other people’s emotions, to discriminate between different feelings and label them appropriately, and to use emotional information to guide thinking and behaviour. The present research focused on emotional intelligence in preadolescents. We established that a high level of emotional intelligence is characteristic of 23.46% of the preadolescents. Girls manifest a high level of emotional intelligence, and a high level of emotional intelligence is also characteristic of 13- to 14-year-old preadolescents. Emotional intelligence is more developed among preadolescents from the rural environment.

  19. The relationship between high-frequency pure-tone hearing loss, hearing in noise test (HINT) thresholds, and the articulation index.

    Science.gov (United States)

    Vermiglio, Andrew J; Soli, Sigfrid D; Freed, Daniel J; Fisher, Laurel M

    2012-01-01

    Speech recognition in noise testing has been conducted at least since the 1940s (Dickson et al, 1946). The ability to recognize speech in noise is a distinct function of the auditory system (Plomp, 1978). According to Kochkin (2002), difficulty recognizing speech in noise is the primary complaint of hearing aid users. However, speech recognition in noise testing has not found widespread use in the field of audiology (Mueller, 2003; Strom, 2003; Tannenbaum and Rosenfeld, 1996). The audiogram has been used as the "gold standard" for hearing ability. However, the audiogram is a poor indicator of speech recognition in noise ability. This study investigates the relationship between pure-tone thresholds, the articulation index, and the ability to recognize speech in quiet and in noise. Pure-tone thresholds were measured for audiometric frequencies 250-6000 Hz. Pure-tone threshold groups were created. These included a normal threshold group and slight, mild, severe, and profound high-frequency pure-tone threshold groups. Speech recognition thresholds in quiet and in noise were obtained using the Hearing in Noise Test (HINT) (Nilsson et al, 1994; Vermiglio, 2008). The articulation index was determined by using Pavlovic's method with pure-tone thresholds (Pavlovic, 1989, 1991). Two hundred seventy-eight participants were tested. All participants were native speakers of American English. Sixty-three of the original participants were removed in order to create groups of participants with normal low-frequency pure-tone thresholds and relatively symmetrical high-frequency pure-tone threshold groups. The final set of 215 participants had a mean age of 33 yr with a range of 17-59 yr. Pure-tone threshold data were collected using the Hughson-Westlake procedure. Speech recognition data were collected using a Windows-based HINT software system. Statistical analyses were conducted using descriptive, correlational, and multivariate analysis of covariance (MANCOVA) statistics. The
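
    The abstract above computes an articulation index from pure-tone thresholds using Pavlovic's method. As a rough illustration of the general idea only, the sketch below sums importance-weighted band audibilities; the band-importance weights, speech peak levels and 30-dB dynamic range are round placeholder values chosen for the example, not Pavlovic's published parameters.

```python
# Illustrative articulation-index-style calculation from pure-tone thresholds.
# The weights, speech peak levels and dynamic range are placeholders for this
# sketch, not Pavlovic's published parameters.
FREQS_HZ      = [250, 500, 1000, 2000, 4000, 6000]    # band order used below
IMPORTANCE    = [0.10, 0.15, 0.25, 0.25, 0.15, 0.10]  # placeholder band-importance weights (sum to 1)
SPEECH_PEAKS  = [50, 55, 50, 45, 40, 35]              # placeholder speech peak levels (dB HL)
DYNAMIC_RANGE = 30.0                                  # dB of speech assumed usable per band


def articulation_index(thresholds_db_hl):
    """Sum of importance-weighted band audibilities (0 = inaudible, 1 = fully audible)."""
    ai = 0.0
    for threshold, peak, weight in zip(thresholds_db_hl, SPEECH_PEAKS, IMPORTANCE):
        audibility = (peak - threshold) / DYNAMIC_RANGE
        ai += weight * min(max(audibility, 0.0), 1.0)
    return ai


# Normal thresholds vs. a sloping high-frequency loss at the frequencies above.
print(articulation_index([5, 5, 10, 10, 15, 15]))    # close to 1.0
print(articulation_index([10, 10, 20, 45, 70, 75]))  # reduced by the high-frequency loss
```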

  20. Degraded Impairment of Emotion Recognition in Parkinson's Disease Extends from Negative to Positive Emotions.

    Science.gov (United States)

    Lin, Chia-Yao; Tien, Yi-Min; Huang, Jong-Tsun; Tsai, Chon-Haw; Hsu, Li-Chuan

    2016-01-01

    Because of dopaminergic neurodegeneration, patients with Parkinson's disease (PD) show impairment in the recognition of negative facial expressions. In the present study, we aimed to determine whether PD patients with more advanced motor problems would show a much greater deficit in recognition of emotional facial expressions than a control group and whether impairment of emotion recognition would extend to positive emotions. Twenty-nine PD patients and 29 age-matched healthy controls were recruited. Participants were asked to discriminate emotions in Experiment  1 and identify gender in Experiment  2. In Experiment  1, PD patients demonstrated a recognition deficit for negative (sadness and anger) and positive faces. Further analysis showed that only PD patients with high motor dysfunction performed poorly in recognition of happy faces. In Experiment  2, PD patients showed an intact ability for gender identification, and the results eliminated possible abilities in the functions measured in Experiment  2 as alternative explanations for the results of Experiment  1. We concluded that patients' ability to recognize emotions deteriorated as the disease progressed. Recognition of negative emotions was impaired first, and then the impairment extended to positive emotions.

  1. Emotional handicaps to learning in two cultures | Buchan | South ...

    African Journals Online (AJOL)

    Deteriorating academic performance in schoolchildren may be due to emotional rather than intellectual causes, but these are not always recognized. The problem is likely to be of increasing importance in African children in Rhodesia as the general level of education rises. Emotional problems in 11 European and 16 African ...

  2. Functional architecture of visual emotion recognition ability: A latent variable approach.

    Science.gov (United States)

    Lewis, Gary J; Lefevre, Carmen E; Young, Andrew W

    2016-05-01

    Emotion recognition has been a focus of considerable attention for several decades. However, despite this interest, the underlying structure of individual differences in emotion recognition ability has been largely overlooked and thus is poorly understood. For example, limited knowledge exists concerning whether recognition ability for one emotion (e.g., disgust) generalizes to other emotions (e.g., anger, fear). Furthermore, it is unclear whether emotion recognition ability generalizes across modalities, such that those who are good at recognizing emotions from the face, for example, are also good at identifying emotions from nonfacial cues (such as cues conveyed via the body). The primary goal of the current set of studies was to address these questions through establishing the structure of individual differences in visual emotion recognition ability. In three independent samples (Study 1: n = 640; Study 2: n = 389; Study 3: n = 303), we observed that the ability to recognize visually presented emotions is based on different sources of variation: a supramodal emotion-general factor, supramodal emotion-specific factors, and face- and within-modality emotion-specific factors. In addition, we found evidence that general intelligence and alexithymia were associated with supramodal emotion recognition ability. Autism-like traits, empathic concern, and alexithymia were independently associated with face-specific emotion recognition ability. These results (a) provide a platform for further individual differences research on emotion recognition ability, (b) indicate that differentiating levels within the architecture of emotion recognition ability is of high importance, and (c) show that the capacity to understand expressions of emotion in others is linked to broader affective and cognitive processes. (c) 2016 APA, all rights reserved).

  3. Emotional Listening: How Students Can Measure and Eliminate This Barrier to Learning.

    Science.gov (United States)

    Pearce, C. Glenn

    1995-01-01

    Emotional responses affect interpretation of messages heard and raise barriers to effective listening. Teaching students to listen objectively and recognize emotional triggers will help them develop clearer understanding and result in better learning. (SK)

  4. Psychometric Properties of the Spanish Version of the Cognitive Emotion Regulation Questionnaire

    Science.gov (United States)

    Dominguez-Sanchez, Francisco J.; Lasa-Aristu, Amaia; Amor, Pedro J.; Holgado-Tello, Francisco P.

    2013-01-01

    The aim of this study was to validate a Spanish version of the Cognitive Emotion Regulation Questionnaire (CERQ-S), originally developed by Garnefski, Kraaij, and Spinhoven. To date, it is the only available instrument that permits a conceptually pure quantification of cognitive strategies of emotional regulation. A sample of 615 students (25…

  5. The list-composition effect in memory for emotional and neutral pictures: Differential contribution of ventral and dorsal attention networks to successful encoding

    OpenAIRE

    Barnacle, Gemma; Montaldi, Daniela; Talmi, Deborah; Sommer, Tobias

    2016-01-01

    The Emotional enhancement of memory (EEM) is observed in immediate free-recall memory tests when emotional and neutral stimuli are encoded and tested together (“mixed lists”), but surprisingly, not when they are encoded and tested separately (“pure lists”). Here our aim was to investigate whether the effect of list-composition (mixed versus pure lists) on the EEM is due to differential allocation of attention. We scanned participants with fMRI during encoding of semantically-related emotional...

  6. Age, gender, and puberty influence the development of facial emotion recognition.

    Science.gov (United States)

    Lawrence, Kate; Campbell, Ruth; Skuse, David

    2015-01-01

    Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age-range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children's ability to recognize simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6-16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modeled these cross-sectional data in terms of competence in accurate recognition of the six emotions studied, when the positive correlation between emotion recognition and IQ was controlled. Significant linear trends were seen in children's ability to recognize facial expressions of happiness, surprise, fear, and disgust; there was improvement with increasing age. In contrast, for sad and angry expressions there is little or no change in accuracy over the age range 6-16 years; near-adult levels of competence are established by middle-childhood. In a sampled subset, pubertal status influenced the ability to recognize facial expressions of disgust and anger; there was an increase in competence from mid to late puberty, which occurred independently of age. A small female advantage was found in the recognition of some facial expressions. The normative data provided in this study will aid clinicians and researchers in assessing the emotion recognition abilities of children and will facilitate the identification of abnormalities in a skill that is often impaired in neurodevelopmental disorders. If emotion recognition abilities are a good model with which to understand adolescent development, then these results could have implications for the education, mental health provision and legal treatment of teenagers.

  7. Age, gender, and puberty influence the development of facial emotion recognition

    Science.gov (United States)

    Lawrence, Kate; Campbell, Ruth; Skuse, David

    2015-01-01

    Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age-range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children’s ability to recognize simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6–16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modeled these cross-sectional data in terms of competence in accurate recognition of the six emotions studied, when the positive correlation between emotion recognition and IQ was controlled. Significant linear trends were seen in children’s ability to recognize facial expressions of happiness, surprise, fear, and disgust; there was improvement with increasing age. In contrast, for sad and angry expressions there is little or no change in accuracy over the age range 6–16 years; near-adult levels of competence are established by middle-childhood. In a sampled subset, pubertal status influenced the ability to recognize facial expressions of disgust and anger; there was an increase in competence from mid to late puberty, which occurred independently of age. A small female advantage was found in the recognition of some facial expressions. The normative data provided in this study will aid clinicians and researchers in assessing the emotion recognition abilities of children and will facilitate the identification of abnormalities in a skill that is often impaired in neurodevelopmental disorders. If emotion recognition abilities are a good model with which to understand adolescent development, then these results could have implications for the education, mental health provision and legal treatment of teenagers. PMID:26136697

  8. Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder

    Science.gov (United States)

    Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.

    2015-01-01

    Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum…

  9. Embodying Emotional Disorders: New Hypotheses about Possible Emotional Consequences of Motor Disorders in Parkinson's Disease and Tourette's Syndrome

    OpenAIRE

    Mermillod, Martial; Vermeulen, Nicolas; Droit-Volet, Sylvie; Jalenques, Isabelle; Durif, Franck; Niedenthal, Paula

    2011-01-01

    Parkinson's disease (PD) and Tourette's syndrome (TS) lead to important motor disorders among patients such as possible facial amimia in PD and tics in Tourette's syndrome. Under the grounded cognition framework that shows the importance of motor embodiment in emotional feeling (Niedenthal, 2007), both types of pathology with motor symptoms should be sufficient to induce potential impairments for these patients when recognizing emotional facial expressions (EFE). In this opinion paper, we des...

  10. A 3-stage model of patient-centered communication for addressing cancer patients' emotional distress.

    Science.gov (United States)

    Dean, Marleah; Street, Richard L

    2014-02-01

    To describe pathways through which clinicians can more effectively respond to patients' emotions in ways that contribute to betterment of the patient's health and well-being. A representative review of literature on managing emotions in clinical consultations was conducted. A three-stage, conceptual model for assisting clinicians to more effectively address the challenges of recognizing, exploring, and managing cancer patients' emotional distress in the clinical encounter was developed. To enhance and enact recognition of patients' emotions, clinicians can engage in mindfulness, self-situational awareness, active listening, and facilitative communication. To enact exploration, clinicians can acknowledge and validate emotions and provide empathy. Finally, clinicians can provide information empathetically, identify therapeutic resources, and give referrals and interventions as needed to help lessen patients' emotional distress. This model serves as a framework for future research examining pathways that link clinicians' emotional cue recognition to patient-centered responses exploring a patient's emotional distress to therapeutic actions that contribute to improved psychological and emotional health. Specific communicative and cognitive strategies are presented that can help clinicians better recognize a patient's emotional distress and respond in ways that have therapeutic value. Published by Elsevier Ireland Ltd.

  11. Degraded Impairment of Emotion Recognition in Parkinson’s Disease Extends from Negative to Positive Emotions

    Directory of Open Access Journals (Sweden)

    Chia-Yao Lin

    2016-01-01

    Full Text Available Because of dopaminergic neurodegeneration, patients with Parkinson’s disease (PD) show impairment in the recognition of negative facial expressions. In the present study, we aimed to determine whether PD patients with more advanced motor problems would show a much greater deficit in recognition of emotional facial expressions than a control group and whether impairment of emotion recognition would extend to positive emotions. Twenty-nine PD patients and 29 age-matched healthy controls were recruited. Participants were asked to discriminate emotions in Experiment 1 and identify gender in Experiment 2. In Experiment 1, PD patients demonstrated a recognition deficit for negative (sadness and anger) and positive faces. Further analysis showed that only PD patients with high motor dysfunction performed poorly in recognition of happy faces. In Experiment 2, PD patients showed an intact ability for gender identification, and the results eliminated possible abilities in the functions measured in Experiment 2 as alternative explanations for the results of Experiment 1. We concluded that patients’ ability to recognize emotions deteriorated as the disease progressed. Recognition of negative emotions was impaired first, and then the impairment extended to positive emotions.

  12. Emotion Recognition of Weblog Sentences Based on an Ensemble Algorithm of Multi-label Classification and Word Emotions

    Science.gov (United States)

    Li, Ji; Ren, Fuji

    Weblogs have greatly changed the ways in which people communicate. Affective analysis of blog posts is valuable for many applications such as text-to-speech synthesis or computer-assisted recommendation. Traditional emotion recognition in text based on single-label classification cannot satisfy the higher requirements of affective computing. In this paper, the automatic identification of sentence emotion in weblogs is modeled as a multi-label text categorization task. Experiments are carried out on 12273 blog sentences from the Chinese emotion corpus Ren_CECps with 8-dimension emotion annotation. An ensemble algorithm, RAKEL, is used to recognize dominant emotions from the writer's perspective. Our emotion feature using detailed intensity representation for word emotions outperforms the other main features such as the word frequency feature and the traditional lexicon-based feature. In order to deal with relatively complex sentences, we integrate grammatical characteristics of punctuation, disjunctive connectives, modification relations and negation into the features. This achieves 13.51% and 12.49% increases in Micro-averaged F1 and Macro-averaged F1, respectively, compared to the traditional lexicon-based feature. Results show that multiple-dimension emotion representation with grammatical features can efficiently classify sentence emotion in a multi-label problem.
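
    As a minimal illustration of treating sentence emotion recognition as a multi-label text categorization task, the sketch below trains a binary-relevance baseline (one binary classifier per emotion) on a few toy sentences and reports micro- and macro-averaged F1. It deliberately simplifies the study above: binary relevance stands in for the RAKEL ensemble, TF-IDF stands in for word-emotion intensity features, and the sentences and labels are invented for the example rather than drawn from Ren_CECps.

```python
# Toy multi-label sentence emotion classifier: binary relevance (one classifier
# per emotion) with TF-IDF features, scored with micro/macro F1. A simplified
# stand-in for the RAKEL ensemble and word-emotion features described above.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.multiclass import OneVsRestClassifier

EMOTIONS = ["joy", "sorrow", "anger", "surprise"]  # a subset of an 8-dimension scheme

sentences = [
    "What a wonderful surprise party, I am so happy!",
    "I can't believe he lied to me again, this is infuriating.",
    "She cried all night after hearing the sad news.",
    "Wow, I did not expect to win, I'm thrilled!",
    "The endless delay made everyone upset and bitter.",
    "He misses his hometown and feels very lonely.",
]
labels = np.array([  # one row per sentence, one column per emotion
    [1, 0, 0, 1],
    [0, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
])

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(sentences)

clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, labels)
pred = clf.predict(X)  # evaluated on the training data for brevity

print("micro-averaged F1:", f1_score(labels, pred, average="micro"))
print("macro-averaged F1:", f1_score(labels, pred, average="macro"))
```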

  13. Impairment in the recognition of emotion across different media following traumatic brain injury.

    Science.gov (United States)

    Williams, Claire; Wood, Rodger Ll

    2010-02-01

    The current study examined emotion recognition following traumatic brain injury (TBI) and examined whether performance differed according to the affective valence and type of media presentation of the stimuli. A total of 64 patients with TBI and matched controls completed the Emotion Evaluation Test (EET) and Ekman 60 Faces Test (E-60-FT). Patients with TBI also completed measures of information processing and verbal ability. Results revealed that the TBI group were significantly impaired compared to controls when recognizing emotion on the EET and E-60-FT. A significant main effect of valence was found in both groups, with poor recognition of negative emotions. However, the difference between the recognition of positive and negative emotions was larger in the TBI group. The TBI group were also more accurate recognizing emotion displayed in audiovisual media (EET) than that displayed in still media (E-60-FT). No significant relationship was obtained between emotion recognition tasks and information-processing speed. A significant positive relationship was found between the E-60-FT and one measure of verbal ability. These findings support models of emotion that specify separate neurological pathways for certain emotions and different media and confirm that patients with TBI are vulnerable to experiencing emotion recognition difficulties.

  14. When nurse emotional intelligence matters: How transformational leadership influences intent to stay.

    Science.gov (United States)

    Wang, Lin; Tao, Hong; Bowers, Barbara J; Brown, Roger; Zhang, Yaqing

    2018-05-01

    The purpose of this study was to examine the role of staff nurse emotional intelligence in the relationship between transformational leadership and nurse intent to stay. Nurse intent to stay and transformational leadership are widely recognized as vital components of nurse retention. Staff nurse emotional intelligence, which has been shown to be improvable, has recently been recognized in the nursing literature as correlated with retention. Yet, the nature of the relationships among these three variables is not known. Cross-sectional data for 535 Chinese nurses were analysed using Structural Equation Modelling. Transformational leadership and staff nurse emotional intelligence were significant predictors of nurse intent to stay, accounting for 34.3% of the variance in nurse intent to stay. Staff nurse emotional intelligence partially mediates the relationship between transformational leadership and nurse intent to stay. The findings of the study emphasize the importance of transformational leadership in enhancing nurse emotional intelligence and provide a deeper understanding of the mediating role of emotional intelligence in the relationship between nurse managers' transformational leadership and nurses' intent to stay. Nurse leaders should develop training programmes to improve nursing manager transformational leadership and staff nurse emotional intelligence in the workplace. © 2018 John Wiley & Sons Ltd.

  15. Exploring cultural differences in the recognition of the self-conscious emotions

    NARCIS (Netherlands)

    Chung, J.M.H.; Robins, R.W.

    2015-01-01

    Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little

  16. The attribution approach to emotion and motivation: History, hypotheses, home runs, headaches/heartaches

    OpenAIRE

    Weiner, B

    2014-01-01

    © The Author(s) 2014. In this article the history of the attribution approach to emotion and motivation is reviewed. Early motivation theorists incorporated emotion within the pleasure/pain principle but they did not recognize specific emotions. This changed when Atkinson introduced his theory of achievement motivation, which argued that achievement strivings are determined by the anticipated emotions of pride and shame. Attribution theorists then suggested many other emotional reactions to s...

  17. Computer-Mediated Communication Preferences and Individual Differences in Neurocognitive Measures of Emotional Attention Capture, Reactivity and Regulation

    Science.gov (United States)

    Babkirk, Sarah; Luehring-Jones, Peter; Dennis, Tracy A.

    2016-01-01

    The use of computer-mediated communication (CMC) to engage socially has become increasingly prevalent, yet few studies examined individual differences that may shed light on implications of CMC for adjustment. The current study examined neurocognitive individual differences associated with preferences to use technology in relation to social-emotional outcomes. In Study 1 (N =91), a self-report measure, the Social Media Communication Questionnaire (SMCQ), was evaluated as an assessment of preferences for communicating positive and negative emotions on a scale ranging from purely via CMC to purely face-to-face. In Study 2, SMCQ preferences were examined in relation to event-related potentials (ERPs) associated with early emotional attention capture and reactivity (the frontal N1) and later sustained emotional processing and regulation [the late positive potential (LPP)]. Electroencephalography (EEG) was recorded while 22 participants passively viewed emotional and neutral pictures and completed an emotion regulation task with instructions to increase, decrease or maintain their emotional responses. A greater preference for CMC was associated with reduced size of and satisfaction with social support, greater early (N1) attention capture by emotional stimuli, and reduced LPP amplitudes to unpleasant stimuli in the increase emotion regulatory task. These findings are discussed in the context of possible emotion- and social-regulatory functions of CMC. PMID:26613269
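
    For readers unfamiliar with how ERP components such as the LPP are quantified, the sketch below shows the usual recipe in plain NumPy: average the single-trial epochs of a condition and take the mean amplitude of that average within a fixed post-stimulus window. The epochs here are simulated and the 400-800 ms window is an illustrative choice, not necessarily the parameters used in the study above.

```python
# How a late ERP component such as the LPP is commonly quantified: average the
# single-trial epochs of a condition, then take the mean amplitude of that
# average in a fixed post-stimulus window. Epochs are simulated for illustration.
import numpy as np

FS = 250                                   # sampling rate in Hz
T = np.arange(-0.2, 1.0, 1 / FS)           # epoch time axis: -200 ms to 1000 ms
rng = np.random.default_rng(7)


def simulate_epochs(n_trials, lpp_gain):
    """Trials x samples: noise plus a sustained positivity starting ~300 ms."""
    noise = 5.0 * rng.standard_normal((n_trials, T.size))             # microvolts
    lpp = lpp_gain * np.clip((T - 0.3) / 0.2, 0, 1) * np.exp(-(T - 0.3))
    return noise + lpp


def mean_amplitude(epochs, t_start=0.4, t_end=0.8):
    """Mean amplitude of the trial-averaged waveform within [t_start, t_end] seconds."""
    erp = epochs.mean(axis=0)                      # average across trials
    window = (T >= t_start) & (T <= t_end)
    return erp[window].mean()


unpleasant = simulate_epochs(40, lpp_gain=6.0)     # larger simulated LPP
neutral = simulate_epochs(40, lpp_gain=2.0)
print("LPP mean amplitude, unpleasant:", round(mean_amplitude(unpleasant), 2), "µV")
print("LPP mean amplitude, neutral:   ", round(mean_amplitude(neutral), 2), "µV")
```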

  18. Audio-Visual Integration Modifies Emotional Judgment in Music

    Directory of Open Access Journals (Sweden)

    Shen-Yuan Su

    2011-10-01

    Full Text Available The conventional view that perceived emotion in music is derived mainly from auditory signals has led to neglect of the contribution of visual image. In this study, we manipulated mode (major vs. minor) and examined the influence of a video image on emotional judgment in music. Melodies in either major or minor mode were controlled for tempo and rhythm and played to the participants. We found that Taiwanese participants, like Westerners, judged major melodies as expressing positive, and minor melodies negative, emotions. The major or minor melodies were then paired with video images of the singers, which were either emotionally congruent or incongruent with their modes. Results showed that participants perceived stronger positive or negative emotions with congruent audio-visual stimuli. Compared to listening to music alone, stronger emotions were perceived when an emotionally congruent video image was added and weaker emotions were perceived when an incongruent image was added. We therefore demonstrate that mode is important to perceive the emotional valence in music and that treating musical art as a purely auditory event might lose the enhanced emotional strength perceived in music, since going to a concert may lead to stronger perceived emotion than listening to the CD at home.

  19. Encoding conditions affect recognition of vocally expressed emotions across cultures

    Directory of Open Access Journals (Sweden)

    Rebecca eJürgens

    2013-03-01

    Full Text Available Although the expression of emotions in humans is considered to be largely universal, cultural effects contribute to both emotion expression and recognition. To disentangle the interplay between these factors, play-acted and authentic (non-instructed) vocal expressions of emotions were used, on the assumption that cultural effects may contribute differentially to the recognition of staged and spontaneous emotions. Speech tokens depicting four emotions (anger, sadness, joy, fear) were obtained from German radio archives and reenacted by professional actors, and presented to 120 participants from Germany, Romania, and Indonesia. Participants in all three countries were poor at distinguishing between play-acted and spontaneous emotional utterances (58.73% correct on average) with only marginal cultural differences. Nevertheless, authenticity influenced emotion recognition: across cultures, anger was recognized more accurately when play-acted (z = 15.06, p < .001) and sadness when authentic (z = 6.63, p < .001), replicating previous findings from German populations. German subjects revealed a slight advantage in recognizing emotions, indicating a moderate in-group advantage. There was no difference between Romanian and Indonesian subjects in the overall emotion recognition. Differential cultural effects became particularly apparent in terms of differential biases in emotion attribution. While all participants labeled play-acted expressions as anger more frequently than expected, German participants exhibited a further bias towards choosing anger for spontaneous stimuli. In contrast to the German sample, Romanian and Indonesian participants were biased towards choosing sadness. These results support the view that emotion recognition rests on a complex interaction of human universals and cultural specificities. Whether and in which way the observed biases are linked to cultural differences in self-construal remains an issue for further investigation.

  20. Is fear in your head? A comparison of instructed and real-life expressions of emotion in the face and body.

    Science.gov (United States)

    Abramson, Lior; Marom, Inbal; Petranker, Rotem; Aviezer, Hillel

    2017-04-01

    The majority of emotion perception studies utilize instructed and stereotypical expressions of faces or bodies. While such stimuli are highly standardized and well-recognized, their resemblance to real-life expressions of emotion remains unknown. Here we examined facial and body expressions of fear and anger during real-life situations and compared their recognition to that of instructed expressions of the same emotions. In order to examine the source of the affective signal, expressions of emotion were presented as faces alone, bodies alone, and naturally, as faces with bodies. The results demonstrated striking deviations between recognition of instructed and real-life stimuli, which differed as a function of the emotion expressed. In real-life fearful expressions of emotion, bodies were far better recognized than faces, a pattern not found with instructed expressions of emotion. Anger reactions were better recognized from the body than from the face in both real-life and instructed stimuli. However, the real-life stimuli were overall better recognized than their instructed counterparts. These results indicate that differences between instructed and real-life expressions of emotion are prevalent and raise caution against an overreliance of researchers on instructed affective stimuli. The findings also demonstrate that in real life, facial expression perception may rely heavily on information from the contextualizing body. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. From Physiological data to Emotional States: Conducting a User Study and Comparing Machine Learning Classifiers

    Directory of Open Access Journals (Sweden)

    Ali Mehmood KHAN

    2016-06-01

    Full Text Available Recognizing emotional states is becoming a major part of a user's context for wearable computing applications. The system should be able to acquire a user's emotional states by using physiological sensors. We want to develop a personal emotional states recognition system that is practical, reliable, and can be used for health-care related applications. We propose to use the eHealth platform, which is a ready-made, lightweight, small and easy-to-use device, for recognizing a few emotional states like ‘Sad’, ‘Dislike’, ‘Joy’, ‘Stress’, ‘Normal’, ‘No-Idea’, ‘Positive’ and ‘Negative’ using decision tree (J48) and k-Nearest Neighbors (IBK) classifiers. In this paper, we present an approach to build a system that exhibits this property and provides evidence based on data for 8 different emotional states collected from 24 different subjects. Our results indicate that the system has an accuracy rate of approximately 98%. In our work, we used four physiological sensors, i.e. ‘Blood Volume Pulse’ (BVP), ‘Electromyogram’ (EMG), ‘Galvanic Skin Response’ (GSR), and ‘Skin Temperature’, in order to recognize emotional states (i.e. Stress, Joy/Happy, Sad, Normal/Neutral, Dislike, No-idea, Positive and Negative).
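
    A compact sketch of the classification step described above, using scikit-learn stand-ins for the Weka algorithms named in the abstract (a CART decision tree in place of J48, k-nearest neighbors in place of IBK). The feature rows are synthetic summary statistics invented for the example; they are not data from the study, and the resulting accuracies will not match the paper's 98%.

```python
# Classification step sketched with scikit-learn stand-ins for the Weka
# classifiers named above (CART decision tree for J48, k-NN for IBK). The
# feature rows are synthetic stand-ins for BVP, EMG, GSR and skin-temperature
# summary statistics; they are not data from the study.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
STATES = ["Stress", "Joy", "Sad", "Normal"]        # a subset of the eight states

# Hypothetical per-state "physiological profile": [mean BVP, mean EMG, mean GSR, mean temp].
profiles = {
    "Stress": [0.8, 0.7, 0.9, 0.3],
    "Joy":    [0.6, 0.4, 0.5, 0.6],
    "Sad":    [0.3, 0.2, 0.3, 0.4],
    "Normal": [0.5, 0.3, 0.2, 0.5],
}
X = np.vstack([
    np.array(profiles[state]) + 0.05 * rng.standard_normal(4)
    for state in STATES for _ in range(40)
])
y = np.array([state for state in STATES for _ in range(40)])

for name, clf in [("decision tree (J48-like)", DecisionTreeClassifier(random_state=0)),
                  ("k-nearest neighbors (IBK-like)", KNeighborsClassifier(n_neighbors=5))]:
    accuracy = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean cross-validated accuracy = {accuracy:.2f}")
```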

  2. Heterogeneity of long-history migration predicts emotion recognition accuracy.

    Science.gov (United States)

    Wood, Adrienne; Rychlowska, Magdalena; Niedenthal, Paula M

    2016-06-01

    Recent work (Rychlowska et al., 2015) demonstrated the power of a relatively new cultural dimension, historical heterogeneity, in predicting cultural differences in the endorsement of emotion expression norms. Historical heterogeneity describes the number of source countries that have contributed to a country's present-day population over the last 500 years. People in cultures originating from a large number of source countries may have historically benefited from greater and clearer emotional expressivity, because they lacked a common language and well-established social norms. We therefore hypothesized that in addition to endorsing more expressive display rules, individuals from heterogeneous cultures will also produce facial expressions that are easier to recognize by people from other cultures. By reanalyzing cross-cultural emotion recognition data from 92 papers and 82 cultures, we show that emotion expressions of people from heterogeneous cultures are more easily recognized by observers from other cultures than are the expressions produced in homogeneous cultures. Heterogeneity influences expression recognition rates alongside the individualism-collectivism of the perceivers' culture, as more individualistic cultures were more accurate in emotion judgments than collectivistic cultures. This work reveals the present-day behavioral consequences of long-term historical migration patterns and demonstrates the predictive power of historical heterogeneity. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  3. [The role of emotional intelligence in addiction disorders].

    Science.gov (United States)

    Kun, Bernadette; Demetrovics, Zsolt

    2010-01-01

    The role of emotions in the background of addictions is a long-studied question. Clinical observations and comorbidity studies unambiguously indicate that psychoactive substance use and dependence are related to emotional problems as well. Emotional intelligence is a relatively new concept in the study of managing emotions, and only a few studies have so far examined its relationship with psychoactive substance use and dependence. The present study systematically reviews articles published between 1990 and October 1, 2010 that deal with the relationship between these two factors. Of the 54 studies identified, 37 met the criteria for analysis. Overall, the studies indicate that lower levels of emotional intelligence are associated with more intensive drinking, smoking and illicit substance use, and are also likely to correlate with internet addiction, bulimia, gambling and impulsive buying. According to these results, the emotional intelligence components of "recognizing emotions" and "regulation of emotions" in particular play important roles in substance use.

  4. The development of the Athens Emotional States Inventory (AESI): collection, validation and automatic processing of emotionally loaded sentences.

    Science.gov (United States)

    Chaspari, Theodora; Soldatos, Constantin; Maragos, Petros

    2015-01-01

    The aim of this work is the development of ecologically valid procedures for collecting reliable and unbiased emotional data for computer interfaces with social and affective intelligence targeting patients with mental disorders. To this end, the Athens Emotional States Inventory (AESI) presented here comprises the design, recording and validation of an audiovisual database for five emotional states: anger, fear, joy, sadness and neutral. The items of the AESI consist of sentences, each having content indicative of the corresponding emotion. Emotional content was assessed through a survey of 40 young participants with a questionnaire following the Latin square design. The emotional sentences that were correctly identified by 85% of the participants were recorded in a soundproof room with microphones and cameras. A preliminary validation of the AESI is performed through automatic emotion recognition experiments from speech. The resulting database contains 696 utterances recorded in the Greek language by 20 native speakers and has a total duration of approximately 28 min. Speech classification results yield accuracy of up to 75.15% for automatically recognizing the emotions in the AESI. These results indicate the usefulness of our approach for collecting emotional data with reliable content, balanced across classes and with reduced environmental variability.
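
    As a generic illustration of automatic emotion recognition from speech, the sketch below summarizes each utterance by the mean and standard deviation of its MFCCs and trains an SVM. The audio is synthesized tones standing in for emotional utterances, the per-emotion synthesis settings are invented, and the pipeline is a simplified example, not the system used to validate the AESI.

```python
# Generic speech emotion recognition sketch: mean/std of MFCCs per utterance,
# classified with an SVM. The "utterances" are synthesized tones with made-up
# per-emotion settings, not the AESI recordings.
import numpy as np
import librosa
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

SR = 16000
rng = np.random.default_rng(0)


def fake_utterance(f0, tremor):
    """One second of a crude voice-like tone (placeholder for real speech)."""
    t = np.arange(SR) / SR
    vibrato = 1.0 + tremor * np.sin(2 * np.pi * 5 * t)
    return np.sin(2 * np.pi * f0 * vibrato * t) + 0.05 * rng.standard_normal(SR)


def mfcc_features(signal):
    """Mean and standard deviation of 13 MFCCs as a fixed-length descriptor."""
    mfcc = librosa.feature.mfcc(y=signal.astype(np.float32), sr=SR, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])


# Hypothetical synthesis settings per emotional state (purely illustrative).
settings = {"anger": (220, 0.02), "fear": (200, 0.03), "joy": (260, 0.01),
            "sadness": (140, 0.005), "neutral": (180, 0.0)}
X, y = [], []
for label, (f0, tremor) in settings.items():
    for _ in range(10):
        X.append(mfcc_features(fake_utterance(f0 + rng.normal(0, 5), tremor)))
        y.append(label)
X, y = np.array(X), np.array(y)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("mean cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```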

  5. The level of emotional intelligence in undergraduate students of nursing

    Directory of Open Access Journals (Sweden)

    Majerníková Ľudmila

    2017-03-01

    Full Text Available Aim. The theory of emotional intelligence provides a framework to think about all of the non-technical skills you need in order to be a good nurse. It’s often described as the potential to feel, use, communicate, recognize, remember, describe, identify, learn from, manage, understand, and explain emotions. The aim of the study was to determine the level of total global Emotional Intelligence among undergraduate students of nursing and also to check the influence of factors (the year of study, type of completed high school education) on Emotional Intelligence.

  6. Indirect Self-Destructiveness and Emotional Intelligence.

    Science.gov (United States)

    Tsirigotis, Konstantinos

    2016-06-01

    While emotional intelligence may have a favourable influence on the life and psychological and social functioning of the individual, indirect self-destructiveness exerts a rather negative influence. The aim of this study has been to explore possible relations between indirect self-destructiveness and emotional intelligence. A population of 260 individuals (130 females and 130 males) aged 20-30 (mean age of 24.5) was studied using the Polish version of the chronic self-destructiveness scale and INTE, i.e., the Polish version of the assessing emotions scale. Indirect self-destructiveness has significant correlations with all variables of INTE (overall score, factor I, factor II), and these correlations are negative. The intensity of indirect self-destructiveness significantly differentiates the level of emotional intelligence and vice versa: the level of emotional intelligence significantly differentiates the intensity of indirect self-destructiveness. Indirect self-destructiveness has negative correlations with emotional intelligence as well as its components: the ability to recognize emotions and the ability to utilize emotions. The level of emotional intelligence differentiates the intensity of indirect self-destructiveness, and vice versa: the intensity of indirect self-destructiveness differentiates the level of emotional intelligence. It seems advisable to use emotional intelligence in prophylactic and therapeutic work with persons with various types of disorders, especially with the syndrome of indirect self-destructiveness.

  7. Emotion through locomotion: gender impact.

    Directory of Open Access Journals (Sweden)

    Samuel Krüger

    Full Text Available Body language reading is of significance for daily life social cognition and successful social interaction, and constitutes a core component of social competence. Yet it is unclear whether our ability for body language reading is gender specific. In the present work, female and male observers had to visually recognize emotions through point-light human locomotion performed by female and male actors with different emotional expressions. For subtle emotional expressions only, males surpass females in recognition accuracy and readiness to respond to happy walking portrayed by female actors, whereas females exhibit a tendency to be better in recognition of hostile angry locomotion expressed by male actors. In contrast to widespread beliefs about female superiority in social cognition, the findings suggest that gender effects in recognition of emotions from human locomotion are modulated by emotional content of actions and opposite actor gender. In a nutshell, the study makes a further step in elucidation of gender impact on body language reading and on neurodevelopmental and psychiatric deficits in visual social cognition.

  8. Practice nurses mental health provide space to patients to discuss unpleasant emotions.

    NARCIS (Netherlands)

    Griep, E.C.M.; Noordman, J.; Dulmen, A.M. van

    2016-01-01

    WHAT IS KNOWN ON THE SUBJECT? A core skill of practice nurses' mental health is to recognize and explore patients' unpleasant emotions. Patients rarely express their unpleasant emotions directly and spontaneously, but instead give indirect signs that something is worrying them.

  9. Emotion expression in human punishment behavior.

    Science.gov (United States)

    Xiao, Erte; Houser, Daniel

    2005-05-17

    Evolutionary theory reveals that punishment is effective in promoting cooperation and maintaining social norms. Although it is accepted that emotions are connected to punishment decisions, there remains substantial debate over why humans use costly punishment. Here we show experimentally that constraints on emotion expression can increase the use of costly punishment. We report data from ultimatum games, where a proposer offers a division of a sum of money and a responder decides whether to accept the split, or reject and leave both players with nothing. Compared with the treatment in which expressing emotions directly to proposers is prohibited, rejection of unfair offers is significantly less frequent when responders can convey their feelings to the proposer concurrently with their decisions. These data support the view that costly punishment might itself be used to express negative emotions and suggest that future studies will benefit by recognizing that human demand for emotion expression can have significant behavioral consequences in social environments, including families, courts, companies, and markets.

  10. Educating the Emotions from Gradgrind to Goleman

    Science.gov (United States)

    Dixon, Thomas

    2012-01-01

    Charles Dickens famously satirised the rationalism and mechanism of utilitarian educational ideas through the figure of Gradgrind in "Hard Times". Even in the nineteenth century there were very few people, in reality, who would have agreed that the education of children should be a matter of purely intellectual, rather than emotional,…

  11. Play it again, Sam: brain correlates of emotional music recognition.

    Science.gov (United States)

    Altenmüller, Eckart; Siggel, Susann; Mohammadi, Bahram; Samii, Amir; Münte, Thomas F

    2014-01-01

    Music can elicit strong emotions and can be remembered in connection with these emotions even decades later. Yet, the brain correlates of episodic memory for highly emotional music compared with less emotional music have not been examined. We therefore used fMRI to investigate brain structures activated by emotional processing of short excerpts of film music successfully retrieved from episodic long-term memory. Eighteen non-musicians volunteers were exposed to 60 structurally similar pieces of film music of 10 s length with high arousal ratings and either less positive or very positive valence ratings. Two similar sets of 30 pieces were created. Each of these was presented to half of the participants during the encoding session outside of the scanner, while all stimuli were used during the second recognition session inside the MRI-scanner. During fMRI each stimulation period (10 s) was followed by a 20 s resting period during which participants pressed either the "old" or the "new" button to indicate whether they had heard the piece before. Musical stimuli vs. silence activated the bilateral superior temporal gyrus, right insula, right middle frontal gyrus, bilateral medial frontal gyrus and the left anterior cerebellum. Old pieces led to activation in the left medial dorsal thalamus and left midbrain compared to new pieces. For recognized vs. not recognized old pieces a focused activation in the right inferior frontal gyrus and the left cerebellum was found. Positive pieces activated the left medial frontal gyrus, the left precuneus, the right superior frontal gyrus, the left posterior cingulate, the bilateral middle temporal gyrus, and the left thalamus compared to less positive pieces. Specific brain networks related to memory retrieval and emotional processing of symphonic film music were identified. The results imply that the valence of a music piece is important for memory performance and is recognized very fast.

  12. Emotion perception in music in high-functioning adolescents with Autism Spectrum Disorders.

    Science.gov (United States)

    Quintin, Eve-Marie; Bhatara, Anjali; Poissant, Hélène; Fombonne, Eric; Levitin, Daniel J

    2011-09-01

    Individuals with Autism Spectrum Disorders (ASD) succeed at a range of musical tasks. The ability to recognize musical emotion as belonging to one of four categories (happy, sad, scared or peaceful) was assessed in high-functioning adolescents with ASD (N = 26) and adolescents with typical development (TD, N = 26) with comparable performance IQ, auditory working memory, and musical training and experience. When verbal IQ was controlled for, there was no significant effect of diagnostic group. Adolescents with ASD rated the intensity of the emotions similarly to adolescents with TD and reported greater confidence in their responses when they had correctly (vs. incorrectly) recognized the emotions. These findings are reviewed within the context of the amygdala theory of autism.

  13. Emotion recognition based on customized smart bracelet with built-in accelerometer

    Directory of Open Access Journals (Sweden)

    Zhan Zhang

    2016-07-01

    Full Text Available Background: Recently, emotion recognition has become a hot topic in human-computer interaction. If computers could understand human emotions, they could interact better with their users. This paper proposes a novel method to recognize human emotions (neutral, happy, and angry) using a smart bracelet with built-in accelerometer. Methods: In this study, a total of 123 participants were instructed to wear a customized smart bracelet with built-in accelerometer that can track and record their movements. Firstly, participants walked for two minutes as normal, which served as walking behaviors in a neutral emotion condition. Participants then watched emotional film clips to elicit emotions (happy and angry). The time interval between watching two clips was more than four hours. After watching the film clips, they walked for one minute, which served as walking behaviors in a happy or angry emotion condition. We collected raw data from the bracelet and extracted a few features from the raw data. Based on these features, we built classification models for classifying three types of emotions (neutral, happy, and angry). Results and Discussion: For two-category classification, the classification accuracy can reach 91.3% (neutral vs. angry), 88.5% (neutral vs. happy), and 88.5% (happy vs. angry), respectively; while, for the differentiation among three types of emotions (neutral, happy, and angry), the accuracy can reach 81.2%. Conclusions: Using wearable devices, we found it is possible to recognize human emotions (neutral, happy, and angry) with fair accuracy. Results of this study may be useful for improving the performance of human-computer interaction.
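
    A sketch of the kind of pipeline the abstract describes: segment a tri-axial accelerometer signal into windows, compute simple per-window statistics, and train a classifier for neutral/happy/angry walking. The walking signal is simulated with made-up gait parameters per emotion, and a random forest is used as a convenient stand-in for the (unspecified) classification models built in the study.

```python
# Accelerometer-based emotion classification sketch: window the tri-axial
# signal, compute per-window statistics, and classify neutral/happy/angry
# walking. The signal and per-emotion gait parameters are simulated.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 50                        # assumed sampling rate (Hz)
WINDOW = 2 * FS                # 2-second windows
rng = np.random.default_rng(1)


def simulate_walk(step_amplitude, noise, seconds=60):
    """Crude 3-axis walking signal: a periodic step component plus noise."""
    t = np.arange(seconds * FS) / FS
    step = step_amplitude * np.sin(2 * np.pi * 2.0 * t)   # roughly 2 steps per second
    return np.stack([step + noise * rng.standard_normal(t.size) for _ in range(3)], axis=1)


def window_features(signal):
    """Per-window mean and std of each axis plus mean/std of the resultant magnitude."""
    features = []
    for start in range(0, signal.shape[0] - WINDOW + 1, WINDOW):
        w = signal[start:start + WINDOW]
        magnitude = np.linalg.norm(w, axis=1)
        features.append(np.concatenate([w.mean(axis=0), w.std(axis=0),
                                        [magnitude.mean(), magnitude.std()]]))
    return np.array(features)


# Hypothetical gait parameters per induced emotion (illustrative only).
params = {"neutral": (1.0, 0.20), "happy": (1.3, 0.25), "angry": (1.6, 0.40)}
X, y = [], []
for label, (amplitude, noise) in params.items():
    feats = window_features(simulate_walk(amplitude, noise))
    X.append(feats)
    y.extend([label] * len(feats))
X, y = np.vstack(X), np.array(y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("mean cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```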

  14. A Study On Relationship Between Emotional Intelligence And Self-Esteem Among Secondary School Children

    OpenAIRE

    Sujatha, B.

    2015-01-01

    The major objective of this article is to establish a relationship between emotional intelligence and self-esteem among secondary school students. Emotional intelligence (EI) is the ability to recognize one's own and other people's emotions, to discriminate between different feelings and label them appropriately, and to use emotional information to guide thinking and behavior (by Daniel Goleman). Self-esteem reflects a person's overall subjective emotional evaluation of his or her own worth. ...

  15. Practice nurses mental health provide space to patients to discuss unpleasant emotions

    NARCIS (Netherlands)

    Griep, E.C.; Noordman, J.; Dulmen, S. van

    2016-01-01

    WHAT IS KNOWN ON THE SUBJECT?: A core skill of practice nurses' mental health is to recognize and explore patients' unpleasant emotions. Patients rarely express their unpleasant emotions directly and spontaneously, but instead give indirect signs that something is worrying them. WHAT THIS PAPER ADDS

  16. A facial expression of pax: Assessing children's "recognition" of emotion from faces.

    Science.gov (United States)

    Nelson, Nicole L; Russell, James A

    2016-01-01

    In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks--choice from array and free labeling--to support the conclusion that children recognize basic emotions from facial expressions. Here five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition; a novel nonsense facial expression was included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination; they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Emotions are understood from biological motion across remote cultures.

    Science.gov (United States)

    Parkinson, Carolyn; Walker, Trent T; Memmi, Sarah; Wheatley, Thalia

    2017-04-01

    Patterns of bodily movement can be used to signal a wide variety of information, including emotional states. Are these signals reliant on culturally learned cues or are they intelligible across individuals lacking exposure to a common culture? To find out, we traveled to a remote Kreung village in Ratanakiri, Cambodia. First, we recorded Kreung portrayals of 5 emotions through bodily movement. These videos were later shown to American participants, who matched the videos with appropriate emotional labels with above chance accuracy (Study 1). The Kreung also viewed Western point-light displays of emotions. After each display, they were asked to either freely describe what was being expressed (Study 2) or choose from 5 predetermined response options (Study 3). Across these studies, Kreung participants recognized Western point-light displays of anger, fear, happiness, sadness, and pride with above chance accuracy. Kreung raters were not above chance in deciphering an American point-light display depicting love, suggesting that recognizing love may rely, at least in part, on culturally specific cues or modalities other than bodily movement. In addition, multidimensional scaling of the patterns of nonverbal behavior associated with each emotion in each culture suggested that similar patterns of nonverbal behavior are used to convey the same emotions across cultures. The considerable cross-cultural intelligibility observed across these studies suggests that the communication of emotion through movement is largely shaped by aspects of physiology and the environment shared by all humans, irrespective of differences in cultural context. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. The autistic child's appraisal of expressions of emotion.

    Science.gov (United States)

    Hobson, R P

    1986-05-01

    Groups of MA-matched autistic, normal and non-autistic retarded children were tested for their ability to choose drawn and photographed facial expressions of emotion to "go with" a person videotaped in gestures, vocalizations and contexts indicative of four emotional states. Although both autistic and control subjects were adept in choosing drawings of non-personal objects to correspond with videotaped cues, the autistic children were markedly impaired in selecting the appropriate faces for the videotaped expressions and contexts. Within the autistic group, the children's performance in this task of emotion recognition was related to MA. It is suggested that autistic children have difficulty in recognizing how different expressions of particular emotions are associated with each other, and that this might contribute to their failure to understand the emotional states of other people.

  19. An investigation of the effect of race-based social categorization on adults’ recognition of emotion

    Science.gov (United States)

    Reyes, B. Nicole; Segal, Shira C.

    2018-01-01

    Emotion recognition is important for social interaction and communication, yet previous research has identified a cross-cultural emotion recognition deficit: Recognition is less accurate for emotions expressed by individuals from a cultural group different than one’s own. The current study examined whether social categorization based on race, in the absence of cultural differences, influences emotion recognition in a diverse context. South Asian and White Canadians in the Greater Toronto Area completed an emotion recognition task that required them to identify the seven basic emotional expressions when posed by members of the same two groups, allowing us to tease apart the contributions of culture and social group membership. Contrary to our hypothesis, there was no mutual in-group advantage in emotion recognition: Participants were not more accurate at recognizing emotions posed by their respective racial in-groups. Both groups were more accurate at recognizing expressions when posed by South Asian faces, and White participants were more accurate overall compared to South Asian participants. These results suggest that in a diverse environment, categorization based on race alone does not lead to the creation of social out-groups in a way that negatively impacts emotion recognition. PMID:29474367

  20. An investigation of the effect of race-based social categorization on adults' recognition of emotion.

    Science.gov (United States)

    Reyes, B Nicole; Segal, Shira C; Moulson, Margaret C

    2018-01-01

    Emotion recognition is important for social interaction and communication, yet previous research has identified a cross-cultural emotion recognition deficit: Recognition is less accurate for emotions expressed by individuals from a cultural group different than one's own. The current study examined whether social categorization based on race, in the absence of cultural differences, influences emotion recognition in a diverse context. South Asian and White Canadians in the Greater Toronto Area completed an emotion recognition task that required them to identify the seven basic emotional expressions when posed by members of the same two groups, allowing us to tease apart the contributions of culture and social group membership. Contrary to our hypothesis, there was no mutual in-group advantage in emotion recognition: Participants were not more accurate at recognizing emotions posed by their respective racial in-groups. Both groups were more accurate at recognizing expressions when posed by South Asian faces, and White participants were more accurate overall compared to South Asian participants. These results suggest that in a diverse environment, categorization based on race alone does not lead to the creation of social out-groups in a way that negatively impacts emotion recognition.

  1. An investigation of the effect of race-based social categorization on adults' recognition of emotion.

    Directory of Open Access Journals (Sweden)

    B Nicole Reyes

    Full Text Available Emotion recognition is important for social interaction and communication, yet previous research has identified a cross-cultural emotion recognition deficit: Recognition is less accurate for emotions expressed by individuals from a cultural group different than one's own. The current study examined whether social categorization based on race, in the absence of cultural differences, influences emotion recognition in a diverse context. South Asian and White Canadians in the Greater Toronto Area completed an emotion recognition task that required them to identify the seven basic emotional expressions when posed by members of the same two groups, allowing us to tease apart the contributions of culture and social group membership. Contrary to our hypothesis, there was no mutual in-group advantage in emotion recognition: Participants were not more accurate at recognizing emotions posed by their respective racial in-groups. Both groups were more accurate at recognizing expressions when posed by South Asian faces, and White participants were more accurate overall compared to South Asian participants. These results suggest that in a diverse environment, categorization based on race alone does not lead to the creation of social out-groups in a way that negatively impacts emotion recognition.

  2. A Computational Model of Learners Achievement Emotions Using Control-Value Theory

    Science.gov (United States)

    Muñoz, Karla; Noguez, Julieta; Neri, Luis; Mc Kevitt, Paul; Lunney, Tom

    2016-01-01

    Game-based Learning (GBL) environments make instruction flexible and interactive. Positive experiences depend on personalization. Student modelling has focused on affect. Three methods are used: (1) recognizing the physiological effects of emotion, (2) reasoning about emotion from its origin and (3) an approach combining 1 and 2. These have proven…

  3. Social learning modulates the lateralization of emotional valence.

    Science.gov (United States)

    Shamay-Tsoory, Simone G; Lavidor, Michal; Aharon-Peretz, Judith

    2008-08-01

    Although neuropsychological studies of lateralization of emotion have emphasized valence (positive vs. negative) or type (basic vs. complex) dimensions, the interaction between the two dimensions has yet to be elucidated. The purpose of the current study was to test the hypothesis that recognition of basic emotions is processed preferentially by the right prefrontal cortex (PFC), whereas recognition of complex social emotions is processed preferentially by the left PFC. Experiment 1 assessed the ability of healthy controls and patients with right and left PFC lesions to recognize basic and complex emotions. Experiment 2 modeled the patient's data of Experiment 1 on healthy participants under lateralized displays of the emotional stimuli. Both experiments support the Type as well as the Valence Hypotheses. However, our findings indicate that the Valence Hypothesis holds for basic but less so for complex emotions. It is suggested that, since social learning overrules the basic preference of valence in the hemispheres, the processing of complex emotions in the hemispheres is less affected by valence.

  4. Biased emotional recognition in depression: perception of emotions in music by depressed patients.

    Science.gov (United States)

    Punkanen, Marko; Eerola, Tuomas; Erkkilä, Jaakko

    2011-04-01

    Depression is a highly prevalent mood disorder, that impairs a person's social skills and also their quality of life. Populations affected with depression also suffer from a higher mortality rate. Depression affects person's ability to recognize emotions. We designed a novel experiment to test the hypothesis that depressed patients show a judgment bias towards negative emotions. To investigate how depressed patients differ in their perception of emotions conveyed by musical examples, both healthy (n=30) and depressed (n=79) participants were presented with a set of 30 musical excerpts, representing one of five basic target emotions, and asked to rate each excerpt using five Likert scales that represented the amount of each one of those same emotions perceived in the example. Depressed patients showed moderate but consistent negative self-report biases both in the overall use of the scales and their particular application to certain target emotions, when compared to healthy controls. Also, the severity of the clinical state (depression, anxiety and alexithymia) had an effect on the self-report biases for both positive and negative emotion ratings, particularly depression and alexithymia. Only musical stimuli were used, and they were all clear examples of one of the basic emotions of happiness, sadness, fear, anger and tenderness. No neutral or ambiguous excerpts were included. Depressed patients' negative emotional bias was demonstrated using musical stimuli. This suggests that the evaluation of emotional qualities in music could become a means to discriminate between depressed and non-depressed subjects. The practical implications of the present study relate both to diagnostic uses of such perceptual evaluations, as well as a better understanding of the emotional regulation strategies of the patients. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Audio-based deep music emotion recognition

    Science.gov (United States)

    Liu, Tong; Han, Li; Ma, Liangkai; Guo, Dongwei

    2018-05-01

    With the rapid development of multimedia networking, more and more songs are issued through the Internet and stored in large digital music libraries. However, music information retrieval on these libraries can be difficult, and the recognition of musical emotion is especially challenging. In this paper, we report a strategy for recognizing the emotion contained in songs by classifying their spectrograms, which contain both time and frequency information, with a convolutional neural network (CNN). Experiments conducted on the 1000-song dataset indicate that the proposed model outperforms traditional machine learning methods.
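
    The record does not describe the network in detail, so the following is only a hedged sketch of a spectrogram-classifying CNN of the general kind described; the layer sizes, input dimensions, and number of emotion classes are assumptions:

      # Sketch: a small CNN that maps mel-spectrograms of song excerpts to emotion logits.
      import torch
      import torch.nn as nn

      class SpectrogramCNN(nn.Module):
          def __init__(self, n_classes: int = 4):      # number of emotion classes assumed
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                  nn.AdaptiveAvgPool2d(1),              # collapse time/frequency axes
              )
              self.classifier = nn.Linear(64, n_classes)

          def forward(self, spec):                      # spec: (batch, 1, n_mels, n_frames)
              return self.classifier(self.features(spec).flatten(1))

      # Example: a batch of 8 spectrograms with 128 mel bands and 256 time frames
      logits = SpectrogramCNN()(torch.randn(8, 1, 128, 256))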

  6. Emotional facial expressions in European-American, Japanese, and Chinese infants.

    Science.gov (United States)

    Camras, Linda A; Oster, Harriet; Campos, Joseph J; Bakeman, Roger

    2003-12-01

    Charles Darwin was among the first to recognize the important contribution that infant studies could make to our understanding of human emotional expression. Noting that infants come to exhibit many emotions, he also observed that at first their repertoire of expression is highly restricted. Today, considerable controversy exists regarding the question of whether infants experience and express discrete emotions. According to one position, discrete emotions emerge during infancy along with their prototypic facial expressions. These expressions closely resemble adult emotional expressions and are invariantly concordant with their corresponding emotions. In contrast, we propose that the relation between expression and emotion during infancy is more complex. Some infant emotions and emotional expressions may not be invariantly concordant. Furthermore, infant emotional expressions may be less differentiated than previously proposed. Together with past developmental studies, recent cross-cultural research supports this view and suggests that negative emotional expression in particular is only partly differentiated towards the end of the first year.

  7. Recognition of facial expressions and prosodic cues with graded emotional intensities in adults with Asperger syndrome.

    Science.gov (United States)

    Doi, Hirokazu; Fujisawa, Takashi X; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki

    2013-09-01

    This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group difference in facial expression recognition was prominent for stimuli with low or intermediate emotional intensities. In contrast to this, the individuals with Asperger syndrome exhibited lower recognition accuracy than typically-developed controls mainly for emotional prosody with high emotional intensity. In facial expression recognition, Asperger and control groups showed an inversion effect for all categories. The magnitude of this effect was less in the Asperger group for angry and sad expressions, presumably attributable to reduced recruitment of the configural mode of face processing. The individuals with Asperger syndrome outperformed the control participants in recognizing inverted sad expressions, indicating enhanced processing of local facial information representing sad emotion. These results suggest that the adults with Asperger syndrome rely on modality-specific strategies in emotion recognition from facial expression and prosodic information.

  8. Improvement of emotional healthcare system with stress detection from ECG signal.

    Science.gov (United States)

    Tivatansakul, S; Ohkura, M

    2015-01-01

    Our emotional healthcare system is designed to cope with users' negative emotions in daily life. To make the system more intelligent, we integrated emotion recognition from facial expressions so that it can provide appropriate services based on the user's current emotional state. However, the facial-expression recognizer sometimes confuses positive, neutral, and negative emotions, which makes the system offer a relaxation service even when users are not experiencing negative emotions. Therefore, to increase the effectiveness of the system in providing the relaxation service, we integrated stress detection from the ECG signal, which can address the confusion issue of facial-expression-based emotion recognition. Indeed, our results show that integrating stress detection increases the effectiveness and efficiency of the emotional healthcare system in providing its services.
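
    A hedged sketch of the decision logic implied above: requiring agreement between the facial-expression classifier and the ECG-based stress detector before triggering the relaxation service. The names, data structure, and AND-rule are assumptions for illustration, not the authors' implementation:

      # Sketch: fuse facial-emotion output with ECG stress detection before acting.
      from dataclasses import dataclass

      NEGATIVE_EMOTIONS = {"anger", "sadness", "fear", "disgust"}

      @dataclass
      class EmotionalState:
          facial_emotion: str      # label produced by the facial-expression recognizer
          ecg_stress: bool         # flag produced by the ECG-based stress detector

      def should_offer_relaxation(state: EmotionalState) -> bool:
          # Requiring both modalities to agree reduces false triggers caused by the
          # facial classifier confusing neutral/positive states with negative ones.
          return state.facial_emotion in NEGATIVE_EMOTIONS and state.ecg_stress

      print(should_offer_relaxation(EmotionalState("sadness", ecg_stress=True)))   # True
      print(should_offer_relaxation(EmotionalState("neutral", ecg_stress=False)))  # False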

  9. Evidence for unintentional emotional contagion beyond dyads.

    Directory of Open Access Journals (Sweden)

    Guillaume Dezecache

    Full Text Available Little is known about the spread of emotions beyond dyads. Yet, it is of importance for explaining the emergence of crowd behaviors. Here, we experimentally addressed whether emotional homogeneity within a crowd might result from a cascade of local emotional transmissions where the perception of another's emotional expression produces, in the observer's face and body, sufficient information to allow for the transmission of the emotion to a third party. We reproduced a minimal element of a crowd situation and recorded the facial electromyographic activity and the skin conductance response of an individual C observing the face of an individual B watching an individual A displaying either joy or fear full body expressions. Critically, individual B did not know that she was being watched. We show that emotions of joy and fear displayed by A were spontaneously transmitted to C through B, even when the emotional information available in B's faces could not be explicitly recognized. These findings demonstrate that one is tuned to react to others' emotional signals and to unintentionally produce subtle but sufficient emotional cues to induce emotional states in others. This phenomenon could be the mark of a spontaneous cooperative behavior whose function is to communicate survival-value information to conspecifics.

  10. Emotional memory for musical excerpts in young and older adults.

    OpenAIRE

    Irene Alonso; Delphine Dellacherie; Séverine Samson

    2015-01-01

    The emotions evoked by music can enhance recognition of excerpts. It has been suggested that memory is better for high than for low arousing music (Eschrich et al., 2005; Samson et al., 2009), but it remains unclear whether positively (Eschrich et al., 2008) or negatively valenced music (Aubé et al., 2013; Vieillard and Gilet, 2013) may be better recognized. Moreover, we still know very little about the influence of age on emotional memory for music. To address these issues, we tested emotion...

  11. Emotion Recognition From Singing Voices Using Contemporary Commercial Music and Classical Styles.

    Science.gov (United States)

    Hakanpää, Tua; Waaramaa, Teija; Laukkanen, Anne-Maria

    2018-02-22

    This study examines the recognition of emotion in contemporary commercial music (CCM) and classical styles of singing. This information may be useful in improving the training of interpretation in singing. This is an experimental comparative study. Thirteen singers (11 female, 2 male) with a minimum of 3 years' professional-level singing studies (in CCM or classical technique or both) participated. They sang at three pitches (females: a, e1, a1, males: one octave lower) expressing anger, sadness, joy, tenderness, and a neutral state. Twenty-nine listeners listened to 312 short (0.63- to 4.8-second) voice samples, 135 of which were sung using a classical singing technique and 165 of which were sung in a CCM style. The listeners were asked which emotion they heard. Activity and valence were derived from the chosen emotions. The percentage of correct recognitions out of all the answers in the listening test (N = 9048) was 30.2%. The recognition percentage for the CCM-style singing technique was higher (34.5%) than for the classical-style technique (24.5%). Valence and activation were better perceived than the emotions themselves, and activity was better recognized than valence. A higher pitch was more likely to be perceived as joy or anger, and a lower pitch as sorrow. Both valence and activation were better recognized in the female CCM samples than in the other samples. There are statistically significant differences in the recognition of emotions between classical and CCM styles of singing. Furthermore, in the singing voice, pitch affects the perception of emotions, and valence and activity are more easily recognized than emotions. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  12. Biologically inspired emotion recognition from speech

    Directory of Open Access Journals (Sweden)

    Buscicchio Cosimo

    2011-01-01

    Full Text Available Emotion recognition has become a fundamental task in human-computer interaction systems. In this article, we propose an emotion recognition approach based on biologically inspired methods. Specifically, emotion classification is performed using a long short-term memory (LSTM) recurrent neural network, which is able to recognize long-range dependencies between successive temporal patterns. We propose to represent the data using features derived from two different models: mel-frequency cepstral coefficients (MFCC) and the Lyon cochlear model. In the experimental phase, results obtained from the LSTM network with the two different feature sets are compared, showing that features derived from the Lyon cochlear model give better recognition results than those obtained with the traditional MFCC representation.
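
    A small sketch of the MFCC-plus-LSTM half of this setup, assuming MFCC inputs and a single-layer LSTM; the Lyon cochlear model features and the article's actual network configuration are not reproduced here:

      # Sketch: MFCC feature extraction followed by an LSTM classifier over the frame sequence.
      import librosa
      import torch
      import torch.nn as nn

      def mfcc_sequence(path: str, n_mfcc: int = 13) -> torch.Tensor:
          y, sr = librosa.load(path, sr=None)
          mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, n_frames)
          return torch.tensor(mfcc.T, dtype=torch.float32)         # (n_frames, n_mfcc)

      class SpeechEmotionLSTM(nn.Module):
          def __init__(self, n_mfcc: int = 13, hidden: int = 64, n_classes: int = 5):
              super().__init__()
              self.lstm = nn.LSTM(n_mfcc, hidden, batch_first=True)
              self.out = nn.Linear(hidden, n_classes)

          def forward(self, x):                 # x: (batch, n_frames, n_mfcc)
              _, (h, _) = self.lstm(x)          # use the final hidden state
              return self.out(h[-1])

      # Usage (path is a placeholder):
      # logits = SpeechEmotionLSTM()(mfcc_sequence("utterance.wav").unsqueeze(0))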

  13. The embodiment of emotion: language use during the feeling of social emotions predicts cortical somatosensory activity.

    Science.gov (United States)

    Saxbe, Darby E; Yang, Xiao-Fei; Borofsky, Larissa A; Immordino-Yang, Mary Helen

    2013-10-01

    Complex social emotions involve both abstract cognitions and bodily sensations, and individuals may differ on their relative reliance on these. We hypothesized that individuals' descriptions of their feelings during a semi-structured emotion induction interview would reveal two distinct psychological styles-a more abstract, cognitive style and a more body-based, affective style-and that these would be associated with somatosensory neural activity. We examined 28 participants' open-ended verbal responses to admiration- and compassion-provoking narratives in an interview and BOLD activity to the same narratives during subsequent functional magnetic resonance imaging scanning. Consistent with hypotheses, individuals' affective and cognitive word use were stable across emotion conditions, negatively correlated and unrelated to reported emotion strength in the scanner. Greater use of affective relative to cognitive words predicted more activation in SI, SII, middle anterior cingulate cortex and insula during emotion trials. The results suggest that individuals' verbal descriptions of their feelings reflect differential recruitment of neural regions supporting physical body awareness. Although somatosensation has long been recognized as an important component of emotion processing, these results offer 'proof of concept' that individual differences in open-ended speech reflect different processing styles at the neurobiological level. This study also demonstrates SI involvement during social emotional experience.

  14. Selective attention to emotional cues and emotion recognition in healthy subjects: the role of mineralocorticoid receptor stimulation.

    Science.gov (United States)

    Schultebraucks, Katharina; Deuter, Christian E; Duesenberg, Moritz; Schulze, Lars; Hellmann-Regen, Julian; Domke, Antonia; Lockenvitz, Lisa; Kuehl, Linn K; Otte, Christian; Wingenfeld, Katja

    2016-09-01

    Selective attention toward emotional cues and emotion recognition of facial expressions are important aspects of social cognition. Stress modulates social cognition through cortisol, which acts on glucocorticoid (GR) and mineralocorticoid receptors (MR) in the brain. We examined the role of MR activation on attentional bias toward emotional cues and on emotion recognition. We included 40 healthy young women and 40 healthy young men (mean age 23.9 ± 3.3), who received either 0.4 mg of the MR agonist fludrocortisone or placebo. A dot-probe paradigm was used to test for attentional biases toward emotional cues (happy and sad faces). Moreover, we used a facial emotion recognition task to investigate the ability to recognize emotional valence (anger and sadness) from facial expression in four graded categories of emotional intensity (20, 30, 40, and 80 %). In the emotional dot-probe task, we found a main effect of treatment and a treatment × valence interaction. Post hoc analyses revealed an attentional bias away from sad faces after placebo intake and a shift in selective attention toward sad faces after fludrocortisone intake, compared to placebo. We found no attentional bias toward happy faces after fludrocortisone or placebo intake. In the facial emotion recognition task, there was no main effect of treatment. MR stimulation seems to be important in modulating quick, automatic emotional processing, i.e., a shift in selective attention toward negative emotional cues. Our results confirm and extend previous findings of MR function. However, we did not find an effect of MR stimulation on emotion recognition.

  15. Recognition of Facial Expressions and Prosodic Cues with Graded Emotional Intensities in Adults with Asperger Syndrome

    Science.gov (United States)

    Doi, Hirokazu; Fujisawa, Takashi X.; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki

    2013-01-01

    This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group…

  16. Capturing the Family Context of Emotion Regulation: A Family Systems Model Comparison Approach

    Science.gov (United States)

    Fosco, Gregory M.; Grych, John H.

    2013-01-01

    Several dimensions of family functioning are recognized as formative influences on children's emotion regulation. Historically, they have been studied separately, limiting our ability to understand how they function within the family system. The present investigation tested models including family emotional climate, interparental conflict, and…

  17. Culture shapes 7-month-olds' perceptual strategies in discriminating facial expressions of emotion.

    Science.gov (United States)

    Geangu, Elena; Ichikawa, Hiroko; Lao, Junpeng; Kanazawa, So; Yamaguchi, Masami K; Caldara, Roberto; Turati, Chiara

    2016-07-25

    Emotional facial expressions are thought to have evolved because they play a crucial role in species' survival. From infancy, humans develop dedicated neural circuits [1] to exhibit and recognize a variety of facial expressions [2]. But there is increasing evidence that culture specifies when and how certain emotions can be expressed - social norms - and that the mature perceptual mechanisms used to transmit and decode the visual information from emotional signals differ between Western and Eastern adults [3-5]. Specifically, the mouth is more informative for transmitting emotional signals in Westerners and the eye region for Easterners [4], generating culture-specific fixation biases towards these features [5]. During development, it is recognized that cultural differences can be observed at the level of emotional reactivity and regulation [6], and to the culturally dominant modes of attention [7]. Nonetheless, to our knowledge no study has explored whether culture shapes the processing of facial emotional signals early in development. The data we report here show that, by 7 months, infants from both cultures visually discriminate facial expressions of emotion by relying on culturally distinct fixation strategies, resembling those used by the adults from the environment in which they develop [5]. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Frontal and temporal lobe contributions to emotional enhancement of memory in behavioral-variant frontotemporal dementia and Alzheimer's disease

    OpenAIRE

    Kumfor, Fiona; Irish, Muireann; Hodges, John R.; Piguet, Olivier

    2014-01-01

    Emotional events gain special priority in how they are remembered, with emotionally arousing events typically recalled more vividly and with greater confidence than non-emotional events. In dementia, memory and emotion processing are affected to varying degrees, however, whether emotional enhancement of memory for complex ecologically-valid events is differentially affected across dementia syndromes remains unclear, with previous studies examining effects of emotion on simple visual recogniti...

  19. VIRTUAL AVATAR FOR EMOTION RECOGNITION IN PATIENTS WITH SCHIZOPHRENIA: A PILOT STUDY

    Directory of Open Access Journals (Sweden)

    Samuel Marcos Pablos

    2016-08-01

    Full Text Available Persons who suffer from schizophrenia have difficulties in recognizing emotions in others' facial expressions, which affects their capacity for social interaction and hinders their social integration. Photographic images have traditionally been used to explore emotion recognition impairments in schizophrenia patients, but they lack the dynamism that is inherent to face-to-face social interactions. To overcome these limitations, the present work explores the use of an animated, virtual face. The avatar has the appearance of a highly realistic human face and is able to express different emotions dynamically, giving it advantages over photograph-based approaches. We present the results of a pilot study conducted to assess the validity of the interface as a tool for clinical psychiatrists. Twenty subjects with long-standing schizophrenia and 20 control subjects were invited to recognize a set of facial emotions shown by a virtual avatar and by images. The objective of the study is to explore the possibilities of using a realistic-looking avatar to assess emotion recognition deficits in patients with schizophrenia. Our results suggest that the proposed avatar may be a suitable tool for the diagnosis and treatment of deficits in the facial recognition of emotions.

  20. Impaired recognition of facial emotions from low-spatial frequencies in Asperger syndrome.

    Science.gov (United States)

    Kätsyri, Jari; Saalasti, Satu; Tiippana, Kaisa; von Wendt, Lennart; Sams, Mikko

    2008-01-01

    The theory of 'weak central coherence' [Happe, F., & Frith, U. (2006). The weak coherence account: Detail-focused cognitive style in autism spectrum disorders. Journal of Autism and Developmental Disorders, 36(1), 5-25] implies that persons with autism spectrum disorders (ASDs) have a perceptual bias for local but not for global stimulus features. The recognition of emotional facial expressions representing various different levels of detail has not been studied previously in ASDs. We analyzed the recognition of four basic emotional facial expressions (anger, disgust, fear and happiness) from low-spatial frequencies (overall global shapes without local features) in adults with an ASD. A group of 20 participants with Asperger syndrome (AS) was compared to a group of non-autistic age- and sex-matched controls. Emotion recognition was tested from static and dynamic facial expressions whose spatial frequency contents had been manipulated by low-pass filtering at two levels. The two groups recognized emotions similarly from non-filtered faces and from dynamic vs. static facial expressions. In contrast, the participants with AS were less accurate than controls in recognizing facial emotions from very low-spatial frequencies. The results suggest intact recognition of basic facial emotions and dynamic facial information, but impaired visual processing of global features in ASDs.

  1. The Effects of Anxiety on the Recognition of Multisensory Emotional Cues with Different Cultural Familiarity

    Directory of Open Access Journals (Sweden)

    Ai Koizumi

    2011-10-01

    Full Text Available Anxious individuals have been shown to interpret others' facial expressions negatively. However, whether this negative interpretation bias depends on the modality and familiarity of emotional cues remains largely unknown. We examined whether trait-anxiety affects recognition of multisensory emotional cues (ie, face and voice), which were expressed by actors from either the same or a different cultural background as the participants (ie, familiar in-group and unfamiliar out-group). The dynamic face and voice cues of the same actors were synchronized, and conveyed either congruent (eg, happy face and voice) or incongruent emotions (eg, happy face and angry voice). Participants were to indicate the perceived emotion in one of the cues, while ignoring the other. The results showed that when recognizing emotions of in-group actors, highly anxious individuals, compared with low anxious ones, were more likely to interpret others' emotions in a negative manner, putting more weight on the to-be-ignored angry cues. This interpretation bias was found regardless of the cue modality. However, when recognizing emotions of out-group actors, low and high anxious individuals showed no difference in the interpretation of emotions irrespective of modality. These results suggest that trait-anxiety affects recognition of emotional expressions in a modality-independent yet cultural-familiarity-dependent manner.

  2. Children with autism spectrum disorder are skilled at reading emotion body language.

    Science.gov (United States)

    Peterson, Candida C; Slaughter, Virginia; Brownell, Celia

    2015-11-01

    Autism is commonly believed to impair the ability to perceive emotions, yet empirical evidence is mixed. Because face processing may be difficult for those with autism spectrum disorder (ASD), we developed a novel test of recognizing emotion via static body postures (Body-Emotion test) and evaluated it with children aged 5 to 12 years in two studies. In Study 1, 34 children with ASD and 41 typically developing (TD) controls matched for age and verbal intelligence (VIQ [verbal IQ]) were tested on (a) our new Body-Emotion test, (b) a widely used test of emotion recognition using photos of eyes as stimuli (Baron-Cohen et al.'s "Reading Mind in the Eyes: Child" or RMEC [Journal of Developmental and Learning Disorders, 2001, Vol. 5, pp. 47-78]), (c) a well-validated theory of mind (ToM) battery, and (d) a teacher-rated empathy scale. In Study 2 (33 children with ASD and 31 TD controls), the RMEC test was simplified to the six basic human emotions. Results of both studies showed that children with ASD performed as well as their TD peers on the Body-Emotion test. Yet TD children outperformed the ASD group on ToM and on both the standard RMEC test and the simplified version. VIQ was not related to perceiving emotions via either body posture or eyes for either group. However, recognizing emotions from body posture was correlated with ToM, especially for children with ASD. Finally, reading emotions from body posture was easier than reading emotions from eyes for both groups. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Emotion rendering in auditory simulations of imagined walking styles

    DEFF Research Database (Denmark)

    Turchet, Luca; Rodá, Antonio

    2016-01-01

    This paper investigated how different emotional states of a walker can be rendered and recognized by means of footstep sound synthesis algorithms. In a first experiment, participants were asked to render, according to imagined walking scenarios, five emotions (aggressive, happy, neutral, sad, and tender) by manipulating the parameters of synthetic footstep sounds simulating various combinations of surface materials and shoe types. The results made it possible to identify, for the involved emotions and sound conditions, the mean values and ranges of variation of two parameters: sound level and temporal distance between consecutive steps. The results were in accordance with those reported in previous studies on real walking, suggesting that the expression of emotions in walking is independent of the real or imagined motor activity. In a second experiment participants were asked to identify the emotions...
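
    An illustrative sketch of how those two parameters can drive a footstep-sound renderer. The function and the emotion-specific values below are hypothetical placeholders, not the study's measured means:

      # Sketch: render a footstep sequence from overall level and inter-step interval.
      import numpy as np

      def render_footsteps(level_db: float, step_interval_s: float,
                           n_steps: int = 8, sr: int = 44100) -> np.ndarray:
          """Return a mono signal of n_steps short noise bursts ("footsteps")."""
          burst_len = int(0.05 * sr)
          burst = np.random.randn(burst_len) * np.hanning(burst_len)   # 50 ms burst
          gain = 10 ** (level_db / 20.0)                               # dB re full scale
          out = np.zeros(int(n_steps * step_interval_s * sr) + burst_len)
          for k in range(n_steps):
              start = int(k * step_interval_s * sr)
              out[start:start + burst_len] += gain * burst
          return out

      # Hypothetical parameter choices: a "sad" walker quieter and slower than an
      # "aggressive" one (values chosen for illustration only).
      sad = render_footsteps(level_db=-30.0, step_interval_s=0.9)
      aggressive = render_footsteps(level_db=-6.0, step_interval_s=0.45)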

  4. Recognition of facial and musical emotions in Parkinson's disease.

    Science.gov (United States)

    Saenz, A; Doé de Maindreville, A; Henry, A; de Labbey, S; Bakchine, S; Ehrlé, N

    2013-03-01

    Patients with amygdala lesions were found to be impaired in recognizing the fear emotion both from face and from music. In patients with Parkinson's disease (PD), impairment in recognition of emotions from facial expressions was reported for disgust, fear, sadness and anger, but no studies had yet investigated this population for the recognition of emotions from both face and music. The ability to recognize basic universal emotions (fear, happiness and sadness) from both face and music was investigated in 24 medicated patients with PD and 24 healthy controls. The patient group was tested for language (verbal fluency tasks), memory (digit and spatial span), executive functions (Similarities and Picture Completion subtests of the WAIS III, Brixton and Stroop tests), visual attention (Bells test), and completed self-assessment scales for anxiety and depression. Results showed that the PD group was significantly impaired for recognition of both fear and sadness emotions from facial expressions, whereas their performance in recognition of emotions from musical excerpts was not different from that of the control group. The scores of fear and sadness recognition from faces were neither correlated to scores in tests for executive and cognitive functions, nor to scores in self-assessment scales. We attributed the observed dissociation to the modality (visual vs. auditory) of presentation and to the ecological value of the musical stimuli that we used. We discuss the relevance of our findings for the care of patients with PD. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.

  5. Recognition of emotions using multimodal physiological signals and an ensemble deep learning model.

    Science.gov (United States)

    Yin, Zhong; Zhao, Mengyuan; Wang, Yongxiong; Yang, Jingdong; Zhang, Jianhua

    2017-03-01

    Using deep-learning methodologies to analyze multimodal physiological signals is becoming increasingly attractive for recognizing human emotions. However, conventional deep emotion classifiers may suffer from two drawbacks: the expertise required to determine the model structure, and the oversimplified combination of multimodal feature abstractions. In this study, a multiple-fusion-layer based ensemble classifier of stacked autoencoders (MESAE) is proposed for recognizing emotions, in which the deep structure is identified through a physiological-data-driven approach. Each SAE consists of three hidden layers that filter unwanted noise from the physiological features and derive stable feature representations. An additional deep model is used to form the SAE ensemble. The physiological features are split into several subsets according to different feature extraction approaches, with each subset separately encoded by an SAE. The derived SAE abstractions are combined according to physiological modality to create six sets of encodings, which are then fed to a three-layer, adjacent-graph-based network for feature fusion. The fused features are used to recognize binary arousal or valence states. The DEAP multimodal database was employed to validate the performance of the MESAE. Compared with the best existing emotion classifier, the mean classification rate and F-score improve by 5.26%. The superiority of the MESAE over state-of-the-art shallow and deep emotion classifiers has been demonstrated for different numbers of available physiological instances.
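
    A deliberately simplified sketch of the general idea only (per-subset autoencoders whose encodings are concatenated and fused by a small classifier); the layer sizes, subset dimensions, and fusion network below are assumptions and do not reproduce the published MESAE:

      # Sketch: one small autoencoder per physiological feature subset, fused for
      # binary arousal/valence classification.
      import torch
      import torch.nn as nn

      class SubsetAutoencoder(nn.Module):
          def __init__(self, n_in: int, n_code: int = 16):
              super().__init__()
              self.encoder = nn.Sequential(nn.Linear(n_in, 64), nn.ReLU(),
                                           nn.Linear(64, n_code), nn.ReLU())
              self.decoder = nn.Sequential(nn.Linear(n_code, 64), nn.ReLU(),
                                           nn.Linear(64, n_in))

          def forward(self, x):
              code = self.encoder(x)
              return code, self.decoder(code)   # code for fusion, reconstruction for pretraining

      class EnsembleEmotionClassifier(nn.Module):
          def __init__(self, subset_dims, n_classes: int = 2):
              super().__init__()
              self.autoencoders = nn.ModuleList([SubsetAutoencoder(d) for d in subset_dims])
              self.fusion = nn.Sequential(nn.Linear(16 * len(subset_dims), 32), nn.ReLU(),
                                          nn.Linear(32, n_classes))

          def forward(self, subsets):           # list of tensors, one per feature subset
              codes = [ae(x)[0] for ae, x in zip(self.autoencoders, subsets)]
              return self.fusion(torch.cat(codes, dim=1))

      # Example with three hypothetical feature subsets of 40, 12, and 8 features
      model = EnsembleEmotionClassifier([40, 12, 8])
      logits = model([torch.randn(4, 40), torch.randn(4, 12), torch.randn(4, 8)])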

  6. How Psychological Stress Affects Emotional Prosody

    Science.gov (United States)

    Paulmann, Silke; Furnes, Desire; Bøkenes, Anne Ming; Cozzolino, Philip J.

    2016-01-01

    We explored how experimentally induced psychological stress affects the production and recognition of vocal emotions. In Study 1a, we demonstrate that sentences spoken by stressed speakers are judged by naïve listeners as sounding more stressed than sentences uttered by non-stressed speakers. In Study 1b, negative emotions produced by stressed speakers are generally less well recognized than the same emotions produced by non-stressed speakers. Multiple mediation analyses suggest this poorer recognition of negative stimuli was due to a mismatch between the variation of volume voiced by speakers and the range of volume expected by listeners. Together, this suggests that the stress level of the speaker affects judgments made by the receiver. In Study 2, we demonstrate that participants who were induced with a feeling of stress before carrying out an emotional prosody recognition task performed worse than non-stressed participants. Overall, findings suggest detrimental effects of induced stress on interpersonal sensitivity. PMID:27802287

  7. How Psychological Stress Affects Emotional Prosody.

    Science.gov (United States)

    Paulmann, Silke; Furnes, Desire; Bøkenes, Anne Ming; Cozzolino, Philip J

    2016-01-01

    We explored how experimentally induced psychological stress affects the production and recognition of vocal emotions. In Study 1a, we demonstrate that sentences spoken by stressed speakers are judged by naïve listeners as sounding more stressed than sentences uttered by non-stressed speakers. In Study 1b, negative emotions produced by stressed speakers are generally less well recognized than the same emotions produced by non-stressed speakers. Multiple mediation analyses suggest this poorer recognition of negative stimuli was due to a mismatch between the variation of volume voiced by speakers and the range of volume expected by listeners. Together, this suggests that the stress level of the speaker affects judgments made by the receiver. In Study 2, we demonstrate that participants who were induced with a feeling of stress before carrying out an emotional prosody recognition task performed worse than non-stressed participants. Overall, findings suggest detrimental effects of induced stress on interpersonal sensitivity.

  8. Enhanced Positive Emotional Reactivity Undermines Empathy in Behavioral Variant Frontotemporal Dementia

    Directory of Open Access Journals (Sweden)

    Alice Y. Hua

    2018-06-01

    Full Text Available Behavioral variant frontotemporal dementia (bvFTD) is a neurodegenerative disease characterized by profound changes in emotions and empathy. Although most patients with bvFTD become less sensitive to negative emotional cues, some patients become more sensitive to positive emotional stimuli. We investigated whether dysregulated positive emotions in bvFTD undermine empathy by making it difficult for patients to share (emotional empathy), recognize (cognitive empathy), and respond (real-world empathy) to emotions in others. Fifty-one participants (26 patients with bvFTD and 25 healthy controls) viewed photographs of neutral, positive, negative, and self-conscious emotional faces and then identified the emotions displayed in the photographs. We used facial electromyography to measure automatic, sub-visible activity in two facial muscles during the task: Zygomaticus major (ZM), which is active during positive emotional reactions (i.e., smiling), and Corrugator supercilii (CS), which is active during negative emotional reactions (i.e., frowning). Participants rated their baseline positive and negative emotional experience before the task, and informants rated participants' real-world empathic behavior on the Interpersonal Reactivity Index. The majority of participants also underwent structural magnetic resonance imaging. A mixed effects model found a significant diagnosis X trial interaction: patients with bvFTD showed greater ZM reactivity to neutral, negative (disgust and surprise), self-conscious (proud), and positive (happy) faces than healthy controls. There was no main effect of diagnosis or diagnosis X trial interaction on CS reactivity. Compared to healthy controls, patients with bvFTD had impaired emotion recognition. Multiple regression analyses revealed that greater ZM reactivity predicted worse negative emotion recognition and worse real-world empathy. At baseline, positive emotional experience was higher in bvFTD than healthy controls and also

  9. Emotional Intelligence as a Predictor of Student Success in First-Year Master of Social Work Students

    Science.gov (United States)

    Horne, Dana Meredith

    2017-01-01

    Emotional intelligence has been defined as "the ability to recognize the meanings of emotions and their relationships, and to reason and problem-solve on the basis of them" (Mayer, Caruso, & Salovey, 1999, p. 267). Despite the relevance of emotional intelligence to social work education, limited research has focused on the assessment…

  10. Did depressive symptoms affect recognition of emotional prosody in Parkinson’s disease?

    Directory of Open Access Journals (Sweden)

    Adriana Vélez Feijó

    2008-06-01

    Full Text Available Adriana Vélez Feijó (Medical Sciences Post-Graduate Course), Carlos RM Rieder (Movement Disorders Clinic Coordinator, Hospital de Clínicas de Porto Alegre), Márcia LF Chaves (Internal Medicine Department, School of Medicine, Universidade Federal do Rio Grande do Sul), Porto Alegre, RS, Brazil. Objective: To evaluate the influence of depressive symptoms on the recognition of emotional prosody in Parkinson's disease (PD) patients, and to identify types of emotion in spoken sentences. Methods: Thirty-five PD patients and 65 normal participants were studied. Dementia was checked with the Mini Mental State Examination, Clinical Dementia Rating scale, and DSM-IV. Recognition of emotional prosody was tested by asking subjects to listen to 12 recorded statements with neutral affective content that were read with a strong affective expression. Subjects had to recognize the correct emotion using one of four descriptors (angry, sad, cheerful, and neutral). The Beck Depression Inventory (BDI) was employed to rate depressive symptoms, with a cutoff of 14. Results: Total ratings of emotions correctly recognized by participants below and above the BDI cutoff were similar among PD patients and normal individuals. PD patients who correctly identified neutral and anger inflections presented higher rates of depressive symptoms (p = 0.011 and 0.044, respectively). No significant differences were observed in the normal group. Conclusions: Depression may modify some modalities of emotional prosody perception in PD, by increasing the perception of non-pleasant emotions or lack of affection, such as anger or indifference. Keywords: emotional prosody, Parkinson's disease, depression, emotion

  11. Correlating Emotional Intelligence and Job Performance Among Jordanian Hospitals' Registered Nurses.

    Science.gov (United States)

    Al-Hamdan, Zaid; Oweidat, Islam Ali; Al-Faouri, Ibrahim; Codier, Estelle

    2017-01-01

    Emotional intelligence (EI) is an ability to recognize our and others' emotions, and manage emotions in ourselves and in relationships with other people. A large body of research evidence outside nursing shows that measured (EI) abilities correlated with employee performance, motivation, and job satisfaction; and preliminary nursing research evidence shows the correlation between EI ability and nurses' clinical performance. There is less research on the EI ability of Jordanian nurses, and the present study was undertaken to address this gap. A descriptive, cross-sectional, correlation comparative design (nonexperimental) was employed. Six Jordanian hospitals were included in the study. Two hundred fifty questionnaires were distributed to prospective participants. One hundred ninety-four questionnaires were returned, giving a response rate of 78%. EI was measured using the Genos Instrument. Clinical performance was measured using a self-report measure. Findings demonstrated significant positive relationships between all subscales of EI and job performance, ranging from r = .250, p = .000 to r = .193, p = .007. Regression analysis indicated working in medical-surgical wards, recognizing and expressing emotions scores (β = 0.186, p = .048), and controlling emotions (β = 0.255, p = .027) explained 19.1% of variance in nurses' job performance. The study findings confirm the correlation between nurse EI ability and clinical performance. © 2016 Wiley Periodicals, Inc.

  12. Effects of cue modality and emotional category on recognition of nonverbal emotional signals in schizophrenia.

    Science.gov (United States)

    Vogel, Bastian D; Brück, Carolin; Jacob, Heike; Eberle, Mark; Wildgruber, Dirk

    2016-07-07

    Impaired interpretation of nonverbal emotional cues in patients with schizophrenia has been reported in several studies and a clinical relevance of these deficits for social functioning has been assumed. However, it is unclear to what extent the impairments depend on specific emotions or specific channels of nonverbal communication. Here, the effect of cue modality and emotional categories on accuracy of emotion recognition was evaluated in 21 patients with schizophrenia and compared to a healthy control group (n = 21). To this end, dynamic stimuli comprising speakers of both genders in three different sensory modalities (auditory, visual and audiovisual) and five emotional categories (happy, alluring, neutral, angry and disgusted) were used. Patients with schizophrenia were found to be impaired in emotion recognition in comparison to the control group across all stimuli. Considering specific emotions more severe deficits were revealed in the recognition of alluring stimuli and less severe deficits in the recognition of disgusted stimuli as compared to all other emotions. Regarding cue modality the extent of the impairment in emotional recognition did not significantly differ between auditory and visual cues across all emotional categories. However, patients with schizophrenia showed significantly more severe disturbances for vocal as compared to facial cues when sexual interest is expressed (alluring stimuli), whereas more severe disturbances for facial as compared to vocal cues were observed when happiness or anger is expressed. Our results confirmed that perceptual impairments can be observed for vocal as well as facial cues conveying various social and emotional connotations. The observed differences in severity of impairments with most severe deficits for alluring expressions might be related to specific difficulties in recognizing the complex social emotional information of interpersonal intentions as compared to "basic" emotional states. Therefore

  13. Emotion perception after moderate-severe traumatic brain injury: The valence effect and the role of working memory, processing speed, and nonverbal reasoning.

    Science.gov (United States)

    Rosenberg, Hannah; Dethier, Marie; Kessels, Roy P C; Westbrook, R Frederick; McDonald, Skye

    2015-07-01

    Traumatic brain injury (TBI) impairs emotion perception. Perception of negative emotions (sadness, disgust, fear, and anger) is reportedly affected more than positive (happiness and surprise) ones. It has been argued that this reflects a specialized neural network underpinning negative emotions that is vulnerable to brain injury. However, studies typically do not equate for differential difficulty between emotions. We aimed to examine whether emotion recognition deficits in people with TBI were specific to negative emotions, while equating task difficulty, and to determine whether perception deficits might be accounted for by other cognitive processes. Twenty-seven people with TBI and 28 matched control participants identified 6 basic emotions at 2 levels of intensity: (a) the conventional 100% intensity and (b) "equated intensity", that is, an intensity that yielded comparable accuracy rates across emotions in controls. (a) At 100% intensity, the TBI group was impaired in recognizing anger, fear, and disgust but not happiness, surprise, or sadness and performed worse on negative than positive emotions. (b) At equated intensity, the TBI group was poorer than controls overall but not differentially poorer in recognizing negative emotions. Although processing speed and nonverbal reasoning were associated with emotion accuracy, injury severity by itself was a unique predictor. When task difficulty is taken into account, individuals with TBI show impairment in recognizing all facial emotions. There was no evidence for a specific impairment for negative emotions or any particular emotion. Impairment was accounted for by injury severity rather than being a secondary effect of reduced neuropsychological functioning. (c) 2015 APA, all rights reserved.

  14. Impaired Perception of Emotional Expression in Amyotrophic Lateral Sclerosis.

    Science.gov (United States)

    Oh, Seong Il; Oh, Ki Wook; Kim, Hee Jin; Park, Jin Seok; Kim, Seung Hyun

    2016-07-01

    The increasing recognition that deficits in social emotions occur in amyotrophic lateral sclerosis (ALS) is helping to explain the spectrum of neuropsychological dysfunctions, thus supporting the view of ALS as a multisystem disorder involving neuropsychological deficits as well as motor deficits. The aim of this study was to characterize the emotion perception abilities of Korean patients with ALS based on the recognition of facial expressions. Twenty-four patients with ALS and 24 age- and sex-matched healthy controls completed neuropsychological tests and facial emotion recognition tasks [ChaeLee Korean Facial Expressions of Emotions (ChaeLee-E)]. The ChaeLee-E test includes facial expressions for seven emotions: happiness, sadness, anger, disgust, fear, surprise, and neutral. The ability to perceive facial emotions was significantly worse among ALS patients than among healthy controls [65.2±18.0% vs. 77.1±6.6% (mean±SD), p=0.009]. Eight of the 24 patients (33%) scored below the 5th percentile score of controls for recognizing facial emotions. Emotion perception deficits occur in Korean ALS patients, particularly regarding facial expressions of emotion. These findings expand the spectrum of cognitive and behavioral dysfunction associated with ALS into emotion processing dysfunction.

  15. Evaluating music emotion recognition: Lessons from music genre recognition?

    OpenAIRE

    Sturm, Bob L.

    2013-01-01

    A fundamental problem with nearly all work in music genre recognition (MGR) is that evaluation lacks validity with respect to the principal goals of MGR. This problem also occurs in the evaluation of music emotion recognition (MER). Standard approaches to evaluation, though easy to implement, do not reliably differentiate between recognizing genre or emotion from music, or by virtue of confounding factors in signals (e.g., equalization). We demonstrate such problems for evaluating an MER syste...

  16. Facial emotion recognition in Parkinson's disease: A review and new hypotheses

    Science.gov (United States)

    Vérin, Marc; Sauleau, Paul; Grandjean, Didier

    2018-01-01

    Abstract Parkinson's disease is a neurodegenerative disorder classically characterized by motor symptoms. Among them, hypomimia affects facial expressiveness and social communication and has a highly negative impact on patients' and relatives' quality of life. Patients also frequently experience nonmotor symptoms, including emotional‐processing impairments, leading to difficulty in recognizing emotions from faces. Aside from its theoretical importance, understanding the disruption of facial emotion recognition in PD is crucial for improving quality of life for both patients and caregivers, as this impairment is associated with heightened interpersonal difficulties. However, studies assessing abilities in recognizing facial emotions in PD still report contradictory outcomes. The origins of this inconsistency are unclear, and several questions (regarding the role of dopamine replacement therapy or the possible consequences of hypomimia) remain unanswered. We therefore undertook a fresh review of relevant articles focusing on facial emotion recognition in PD to deepen current understanding of this nonmotor feature, exploring multiple significant potential confounding factors, both clinical and methodological, and discussing probable pathophysiological mechanisms. This led us to examine recent proposals about the role of basal ganglia‐based circuits in emotion and to consider the involvement of facial mimicry in this deficit from the perspective of embodied simulation theory. We believe our findings will inform clinical practice and increase fundamental knowledge, particularly in relation to potential embodied emotion impairment in PD. © 2018 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society. PMID:29473661

  17. The use of emotions in narratives in Williams syndrome.

    Science.gov (United States)

    Van Herwegen, Jo; Aznar, Ana; Tenenbaum, Harriet

    2014-01-01

    Although individuals with Williams syndrome are very sociable, they tend to have limited contact and friendships with peers. In typically developing children the use of positive emotions (e.g., happy) has been argued to be related to peer relationships and popularity. The current study investigated the use and development of emotion words in Williams syndrome using cross-sectional developmental trajectories and examined children's use of different types of emotion words. Nineteen children with Williams syndrome (WS) and 20 typically developing (TD) children matched for chronological age told a story from a wordless picture book. Participants with WS produced a similar number of emotion words compared to the control group and the use of emotion words did not change when plotted against chronological age or vocabulary abilities in either group. However, participants with WS produced more emotion words about sadness. Links between emotion production and friendships as well as future studies are discussed. After reading this article, readers will be able to: explain the development of positive and negative emotions in Williams syndrome and recognize that emotion production is atypical in this population. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Empathy, but not mimicry restriction, influences the recognition of change in emotional facial expressions.

    Science.gov (United States)

    Kosonogov, Vladimir; Titova, Alisa; Vorobyeva, Elena

    2015-01-01

    The current study addressed the hypothesis that empathy and the restriction of facial muscles of observers can influence recognition of emotional facial expressions. A sample of 74 participants recognized the subjective onset of emotional facial expressions (anger, disgust, fear, happiness, sadness, surprise, and neutral) in a series of morphed face photographs showing a gradual change (frame by frame) from one expression to another. The high-empathy (as measured by the Empathy Quotient) participants recognized emotional facial expressions in earlier photographs of the series than did low-empathy ones, but there was no difference in the exploration time. Restriction of the observers' facial muscles (with plasters and a stick held in the mouth) did not influence the responses. We discuss these findings in the context of the embodied simulation theory and previous data on empathy.

  19. Culture, attention, and emotion.

    Science.gov (United States)

    Grossmann, Igor; Ellsworth, Phoebe C; Hong, Ying-yi

    2012-02-01

    This research provides experimental evidence for cultural influence on one of the most basic elements of emotional processing: attention to positive versus negative stimuli. To this end, we focused on Russian culture, which is characterized by brooding and melancholy. In Study 1, Russians spent significantly more time looking at negative than positive pictures, whereas Americans did not show this tendency. In Study 2, Russian Latvians were randomly primed with symbols of each culture, after which we measured the speed of recognition for positive versus negative trait words. Biculturals were significantly faster in recognizing negative words (as compared with baseline) when primed with Russian versus Latvian cultural symbols. Greater identification with Russian culture facilitated this effect. We provide a theoretical discussion of mental processes underlying cultural differences in emotion research.

  20. Rehabilitation of pure alexia

    DEFF Research Database (Denmark)

    Starrfelt, Randi; Ólafsdóttir, Rannveig Rós; Arendt, Ida-Marie

    2013-01-01

    that pure alexia was an easy target for rehabilitation efforts. We review the literature on rehabilitation of pure alexia from 1990 to the present, and find that patients differ widely on several dimensions such as alexia severity and associated deficits. Many patients reported to have pure alexia......-designed and controlled studies of rehabilitation of pure alexia....

  1. Recognition of the Emotional Content of Music Depending on the Characteristics of the Musical Material and Experience of Students

    Directory of Open Access Journals (Sweden)

    Knyazeva T.S.,

    2015-02-01

    Full Text Available We studied the factors affecting the recognition of the emotional content of music. We tested hypotheses about the influence of the valence of the music, its ethnic style, and the listening experience on the success of recognition. The empirical study involved 26 Russian musicians (average age 25.7 years). For the study of musical perception we used a bipolar semantic differential. We found that the valence of the musical material affects the recognition of its emotional content, whereas ethnic style does not, and that senior students recognize the emotional content of music more effectively. The results point to the universal nature of the emotional musical ear, which recognizes music of different ethnic styles equally well, and support the notion that negative valence of emotional content carries higher significance in the process of musical perception. A study of the factors influencing the emotional understanding of music is important for the development of models of emotion recognition, for theoretical constructs of emotional intelligence, and for the theory and practice of music education.

  2. The role of emotion in clinical decision making: an integrative literature review

    OpenAIRE

    Kozlowski, Desirée; Hutchinson, Marie; Hurley, John; Rowley, Joanne; Sutherland, Joanna

    2017-01-01

    Background Traditionally, clinical decision making has been perceived as a purely rational and cognitive process. Recently, a number of authors have linked emotional intelligence (EI) to clinical decision making (CDM) and calls have been made for an increased focus on EI skills for clinicians. The objective of this integrative literature review was to identify and synthesise the empirical evidence for a role of emotion in CDM. Methods A systematic search of the bibliographic databases PubMed,...

  3. Abnormal Facial Emotion Recognition in Depression: Serial Testing in an Ultra-Rapid-Cycling Patient.

    Science.gov (United States)

    George, Mark S.; Huggins, Teresa; McDermut, Wilson; Parekh, Priti I.; Rubinow, David; Post, Robert M.

    1998-01-01

    Mood disorder subjects have a selective deficit in recognizing human facial emotion. Whether the facial emotion recognition errors persist during normal mood states (i.e., are state vs. trait dependent) was studied in one male bipolar II patient. Results of five sessions are presented and discussed. (Author/EMK)

  4. Examination of emotion-induced changes in eating: A latent profile analysis of the Emotional Appetite Questionnaire.

    Science.gov (United States)

    Bourdier, L; Morvan, Y; Kotbagi, G; Kern, L; Romo, L; Berthoz, S

    2018-04-01

    It is now recognized that emotions can influence food intake. While some people report eating less when distressed, others report either no change of eating or eating more in the same condition. The question of whether this interindividual variability also occurs in response to positive emotions has been overlooked in most studies on Emotional Eating (EE). Using the Emotional Appetite Questionnaire (EMAQ) and Latent Profile Analysis, this study aimed to examine the existence of latent emotion-induced changes in eating profiles, and explore how these profiles differ by testing their relations with 1) age and sex, 2) BMI and risk for eating disorders (ED) and 3) factors that are known to be associated with EE such as perceived positive/negative feelings, depression, anxiety, stress symptoms and impulsivity. Among 401 university students (245 females) who completed the EMAQ, 3 profiles emerged (P1:11.2%, P2:60.1%, P3:28.7%), with distinct patterns of eating behaviors in response to negative emotions and situations but few differences regarding positive ones. Negative emotional overeaters (P1) and negative emotional undereaters (P3) reported similar levels of emotional distress and positive feelings, and were at greater risk for ED. However, the people in the former profile i) reported decreasing their food intake in a positive context, ii) were predominantly female, iii) had higher BMI and iv) were more prone to report acting rashly when experiencing negative emotions. Our findings suggest that a person-centred analysis of the EMAQ scores offers a promising way to capture the inter-individual variability of emotionally-driven eating behaviors. These observations also add to the growing literature underscoring the importance of further investigating the role of different facets of impulsivity in triggering overeating and of developing more targeted interventions for EE. Copyright © 2017 Elsevier Ltd. All rights reserved.
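
    As a rough illustration of the person-centred approach described above, the sketch below approximates a latent profile analysis with a Gaussian mixture model and BIC-based selection of the number of profiles. The DataFrame emaq, its column names, and the candidate range of profiles are all assumptions; the study's actual estimation procedure and software may differ.

      # Hedged stand-in for latent profile analysis using Gaussian mixtures.
      import numpy as np
      import pandas as pd
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(1)
      # hypothetical subscale scores for 401 respondents (columns are assumptions)
      emaq = pd.DataFrame(rng.normal(size=(401, 4)),
                          columns=["neg_emotions", "neg_situations",
                                   "pos_emotions", "pos_situations"])

      # fit mixtures with 1-5 components and keep the one with the lowest BIC
      models = {k: GaussianMixture(n_components=k, random_state=0).fit(emaq)
                for k in range(1, 6)}
      best_k = min(models, key=lambda k: models[k].bic(emaq))
      profiles = models[best_k].predict(emaq)            # profile membership
      print(best_k, np.bincount(profiles) / len(profiles))  # profile proportions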

  5. Oxytocin improves emotion recognition for older males.

    Science.gov (United States)

    Campbell, Anna; Ruffman, Ted; Murray, Janice E; Glue, Paul

    2014-10-01

    Older adults (≥60 years) perform worse than young adults (18-30 years) when recognizing facial expressions of emotion. The hypothesized cause of these changes might be declines in neurotransmitters that could affect information processing within the brain. In the present study, we examined the neuropeptide oxytocin that functions to increase neurotransmission. Research suggests that oxytocin benefits the emotion recognition of less socially able individuals. Men tend to have lower levels of oxytocin and older men tend to have worse emotion recognition than older women; therefore, there is reason to think that older men will be particularly likely to benefit from oxytocin. We examined this idea using a double-blind design, testing 68 older and 68 young adults randomly allocated to receive oxytocin nasal spray (20 international units) or placebo. Forty-five minutes afterward they completed an emotion recognition task assessing labeling accuracy for angry, disgusted, fearful, happy, neutral, and sad faces. Older males receiving oxytocin showed improved emotion recognition relative to those taking placebo. No differences were found for older females or young adults. We hypothesize that oxytocin facilitates emotion recognition by improving neurotransmission in the group with the worst emotion recognition. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Post-earthquake Distress and Development of Emotional Expertise in Young Adults

    Directory of Open Access Journals (Sweden)

    Francesca Pistoia

    2018-05-01

    Full Text Available After a natural disaster like an earthquake, about 15% of the population experience a post-traumatic stress disorder (PTSD). However, even those without a diagnosis of PTSD can suffer from disorders of the affective sphere, including anxiety, depression and alteration of emotion recognition. The objective of this study was to investigate the neuropsychological and emotional profile of students living in the earthquake-affected areas of L’Aquila, Italy. A group of students living in L’Aquila at the time of the 2009 earthquake was recruited, and compared to a control group of students not living in any earthquake-affected areas. Participants were assessed by means of the Beck Depression Inventory (BDI) scale, the State-Trait Anxiety Inventory (STAI), the Insomnia Severity Index (ISI), the Intolerance of Uncertainty Scale Short Form, the Uncertainty Response Scale (URS), the Anxiety Sensitivity Index 3 (ASI-3), and the Eysenck Personality Questionnaire-Revised Short Form (EPQ-RS). Participants also took part in two behavioral experiments aimed at evaluating their ability to recognize facial expressions (by means of the Ekman and Friesen Pictures of Facial Affect) and to evaluate emotionally evocative scenes (by means of the International Affective Picture System (IAPS)). Results showed that students living in the earthquake-affected areas had a general increase of anxiety and anticipation of threats. Moreover, students living in the earthquake-affected areas showed a significantly higher overall accuracy in recognizing facial expressions as compared to controls. No significant differences between the two groups were detected in the evaluation of emotionally evocative scenes. The novel result lies in the greater accuracy of earthquake victims in recognizing facial expressions, despite the lack of differences from controls in evaluating affective evocative scenes. The trauma exposure may have increased vigilance for threats in earthquake victims, leading

  7. Basic Emotions in the Nencki Affective Word List (NAWL BE): New Method of Classifying Emotional Stimuli.

    Science.gov (United States)

    Wierzba, Małgorzata; Riegel, Monika; Wypych, Marek; Jednoróg, Katarzyna; Turnau, Paweł; Grabowska, Anna; Marchewka, Artur

    2015-01-01

    The Nencki Affective Word List (NAWL) has recently been introduced as a standardized database of Polish words suitable for studying various aspects of language and emotions. Though the NAWL was originally based on the most commonly used dimensional approach, it is not the only way of studying emotions. Another framework is based on discrete emotional categories. Since the two perspectives are recognized as complementary, the aim of the present study was to supplement the NAWL database by the addition of categories corresponding to basic emotions. Thus, 2902 Polish words from the NAWL were presented to 265 subjects, who were instructed to rate them according to the intensity of each of the five basic emotions: happiness, anger, sadness, fear and disgust. The general characteristics of the present word database, as well as the relationships between the studied variables are shown to be consistent with typical patterns found in previous studies using similar databases for different languages. Here we present the Basic Emotions in the Nencki Affective Word List (NAWL BE) as a database of verbal material suitable for highly controlled experimental research. To make the NAWL more convenient to use, we introduce a comprehensive method of classifying stimuli to basic emotion categories. We discuss the advantages of our method in comparison to other methods of classification. Additionally, we provide an interactive online tool (http://exp.lobi.nencki.gov.pl/nawl-analysis) to help researchers browse and interactively generate classes of stimuli to meet their specific requirements.
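
    A simplified, hypothetical version of such a classification rule is sketched below: each word is assigned to the basic emotion with the clearly highest mean intensity rating, or left unclassified otherwise. The example words, ratings, and thresholds are made up for illustration and do not reproduce the authors' actual method.

      # Hedged sketch of a basic-emotion classification rule (not the NAWL BE procedure).
      import pandas as pd

      # hypothetical mean intensity ratings (1-7 scale) for three example words
      ratings = pd.DataFrame(
          {"word": ["pogrzeb", "prezent", "stolik"],
           "happiness": [1.2, 6.1, 1.8], "anger": [1.5, 1.1, 1.2],
           "sadness": [6.3, 1.2, 1.3], "fear": [3.1, 1.0, 1.1],
           "disgust": [1.4, 1.0, 1.0]}).set_index("word")

      def classify(row, min_intensity=3.5, min_margin=1.0):
          """Assign the dominant basic emotion, or 'unclassified' if none stands out."""
          top, runner_up = row.sort_values(ascending=False).iloc[:2]
          if top >= min_intensity and (top - runner_up) >= min_margin:
              return row.idxmax()
          return "unclassified"

      print(ratings.apply(classify, axis=1))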

  8. Risk and Emotion Among Healthy Volunteers in Clinical Trials

    NARCIS (Netherlands)

    Cottingham, M.D.; Fisher, J.A.

    2016-01-01

    Theorized as objective or constructed, risk is recognized as unequally distributed across social hierarchies. Yet the process by which social forces shape risk and risk emotions remains unknown. The pharmaceutical industry depends on healthy individuals to voluntarily test early-stage,

  9. Social-Emotional Development: Quirky or Time to Worry?

    Science.gov (United States)

    Kimball, Valerie

    2016-10-01

    Primary care pediatricians spend a significant amount of time discussing and answering questions from parents regarding the behavior of their children. Although a majority of young children present with developmentally appropriate behavior, it is important that primary care pediatricians recognize concerning conduct that may be suggestive of a behavior or emotional disorder. It is important that social-emotional development is closely monitored in conjunction with physical and cognitive growth and development at each well visit. [Pediatr Ann. 2016;45(10):e337-e339.]. Copyright 2016, SLACK Incorporated.

  10. Are neutral faces of children really emotionally neutral?

    OpenAIRE

    小松, 佐穂子; 箱田, 裕司; Komatsu, Sahoko; Hakoda, Yuji

    2012-01-01

    In this study, we investigated whether people recognize emotions from neutral faces of children (11 to 13 years old). We took facial images of 53 male and 54 female Japanese children who had been asked to keep a neutral facial expression. Then, we conducted an experiment in which 43 participants (19 to 34 years old) rated the strength of four emotions (happiness, surprise, sadness, and anger) for the facial images, using a 7- point scale. We found that (a) they rated both male and female face...

  11. Impaired Integration of Emotional Faces and Affective Body Context in a Rare Case of Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Bentin, Shlomo

    2011-01-01

    In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG's agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally-congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally-incongruent bodies. The results are interpreted in the context of a general integration deficit in developmental visual agnosia, suggesting that impaired integration may extend from the level of the face to the level of the full person. PMID:21482423

  12. Facial Emotion Recognition in Children with High Functioning Autism and Children with Social Phobia

    Science.gov (United States)

    Wong, Nina; Beidel, Deborah C.; Sarver, Dustin E.; Sims, Valerie

    2012-01-01

    Recognizing facial affect is essential for effective social functioning. This study examines emotion recognition abilities in children aged 7-13 years with High Functioning Autism (HFA = 19), Social Phobia (SP = 17), or typical development (TD = 21). Findings indicate that all children identified certain emotions more quickly (e.g., happy [less…

  13. The list-composition effect in memory for emotional and neutral pictures: Differential contribution of ventral and dorsal attention networks to successful encoding.

    Science.gov (United States)

    Barnacle, Gemma E; Montaldi, Daniela; Talmi, Deborah; Sommer, Tobias

    2016-09-01

    The Emotional enhancement of memory (EEM) is observed in immediate free-recall memory tests when emotional and neutral stimuli are encoded and tested together ("mixed lists"), but surprisingly, not when they are encoded and tested separately ("pure lists"). Here our aim was to investigate whether the effect of list-composition (mixed versus pure lists) on the EEM is due to differential allocation of attention. We scanned participants with fMRI during encoding of semantically-related emotional (negative valence only) and neutral pictures. Analysis of memory performance data replicated previous work, demonstrating an interaction between list composition and emotional valence. In mixed lists, neural subsequent memory effects in the dorsal attention network were greater for neutral stimulus encoding, while neural subsequent memory effects for emotional stimuli were found in a region associated with the ventral attention network. These results imply that when life experiences include both emotional and neutral elements, memory for the latter is more highly correlated with neural activity representing goal-directed attention processing at encoding. Copyright © 2016. Published by Elsevier Ltd.

  14. Emotional, behavioral, and developmental features indicative of neglect or emotional abuse in preschool children: a systematic review.

    Science.gov (United States)

    Naughton, Aideen Mary; Maguire, Sabine Ann; Mann, Mala Kanthi; Lumb, Rebecca Caroline; Tempest, Vanessa; Gracias, Shirley; Kemp, Alison Mary

    2013-08-01

    Early intervention for neglect or emotional abuse in preschoolers may mitigate lifelong consequences, yet practitioners lack confidence in recognizing these children. To define the emotional, behavioral, and developmental features of neglect or emotional abuse in preschoolers. A literature search of 18 databases, 6 websites, and supplementary searching performed from January 1, 1960, to February 1, 2011, identified 22 669 abstracts. Standardized critical appraisal of 164 articles was conducted by 2 independent, trained reviewers. Inclusion criteria were children aged 0 to 6 years with confirmed neglect or emotional abuse who had emotional, behavioral, and developmental features recorded or for whom the carer-child interaction was documented. Twenty-eight case-control (matched for socioeconomic, educational level, and ethnicity), 1 cross-sectional, and 13 cohort studies were included. Key features in the child included the following: aggression (11 studies) exhibited as angry, disruptive behavior, conduct problems, oppositional behavior, and low ego control; withdrawal or passivity (12 studies), including negative self-esteem, anxious or avoidant behavior, poor emotional knowledge, and difficulties in interpreting emotional expressions in others; developmental delay (17 studies), particularly delayed language, cognitive function, and overall development quotient; poor peer interaction (5 studies), showing poor social interactions, unlikely to act to relieve distress in others; and transition (6 studies) from ambivalent to avoidant insecure attachment pattern and from passive to increasingly aggressive behavior and negative self-representation. Emotional knowledge, cognitive function, and language deteriorate without intervention. Poor sensitivity, hostility, criticism, or disinterest characterize maternal-child interactions. Preschool children who have been neglected or emotionally abused exhibit a range of serious emotional and behavioral difficulties and adverse

  15. The production and perception of emotionally expressive walking sounds: similarities between musical performance and everyday motor activity.

    Directory of Open Access Journals (Sweden)

    Bruno L Giordano

    Full Text Available Several studies have investigated the encoding and perception of emotional expressivity in music performance. A relevant question concerns how the ability to communicate emotions in music performance is acquired. In accordance with recent theories on the embodiment of emotion, we suggest here that both the expression and recognition of emotion in music might at least in part rely on knowledge about the sounds of expressive body movements. We test this hypothesis by drawing parallels between musical expression of emotions and expression of emotions in sounds associated with a non-musical motor activity: walking. In a combined production-perception design, two experiments were conducted, and expressive acoustical features were compared across modalities. An initial performance experiment tested for similar feature use in walking sounds and music performance, and revealed that strong similarities exist. Features related to sound intensity, tempo and tempo regularity were identified as being used similarly in both domains. Participants in a subsequent perception experiment were able to recognize both non-emotional and emotional properties of the sound-generating walkers. An analysis of the acoustical correlates of behavioral data revealed that variations in sound intensity, tempo, and tempo regularity were likely used to recognize expressed emotions. Taken together, these results lend support to the motor origin hypothesis for the musical expression of emotions.

  16. Emotional memory for musical excerpts in young and older adults

    Science.gov (United States)

    Alonso, Irene; Dellacherie, Delphine; Samson, Séverine

    2015-01-01

    The emotions evoked by music can enhance recognition of excerpts. It has been suggested that memory is better for high than for low arousing music (Eschrich et al., 2005; Samson et al., 2009), but it remains unclear whether positively (Eschrich et al., 2008) or negatively valenced music (Aubé et al., 2013; Vieillard and Gilet, 2013) may be better recognized. Moreover, we still know very little about the influence of age on emotional memory for music. To address these issues, we tested emotional memory for music in young and older adults using musical excerpts varying in terms of arousal and valence. Participants completed immediate and 24 h delayed recognition tests. We predicted highly arousing excerpts to be better recognized by both groups in immediate recognition. We hypothesized that arousal may compensate consolidation deficits in aging, thus showing more prominent benefit of high over low arousing stimuli in older than younger adults on delayed recognition. We also hypothesized worst retention of negative excerpts for the older group, resulting in a recognition benefit for positive over negative excerpts specific to older adults. Our results suggest that although older adults had worse recognition than young adults overall, effects of emotion on memory do not seem to be modified by aging. Results on immediate recognition suggest that recognition of low arousing excerpts can be affected by valence, with better memory for positive relative to negative low arousing music. However, 24 h delayed recognition results demonstrate effects of emotion on memory consolidation regardless of age, with a recognition benefit for high arousal and for negatively valenced music. The present study highlights the role of emotion on memory consolidation. Findings are examined in light of the literature on emotional memory for music and for other stimuli. We finally discuss the implication of the present results for potential music interventions in aging and dementia. PMID

  17. Emotional memory for musical excerpts in young and older adults.

    Science.gov (United States)

    Alonso, Irene; Dellacherie, Delphine; Samson, Séverine

    2015-01-01

    The emotions evoked by music can enhance recognition of excerpts. It has been suggested that memory is better for high than for low arousing music (Eschrich et al., 2005; Samson et al., 2009), but it remains unclear whether positively (Eschrich et al., 2008) or negatively valenced music (Aubé et al., 2013; Vieillard and Gilet, 2013) may be better recognized. Moreover, we still know very little about the influence of age on emotional memory for music. To address these issues, we tested emotional memory for music in young and older adults using musical excerpts varying in terms of arousal and valence. Participants completed immediate and 24 h delayed recognition tests. We predicted highly arousing excerpts to be better recognized by both groups in immediate recognition. We hypothesized that arousal may compensate consolidation deficits in aging, thus showing more prominent benefit of high over low arousing stimuli in older than younger adults on delayed recognition. We also hypothesized worst retention of negative excerpts for the older group, resulting in a recognition benefit for positive over negative excerpts specific to older adults. Our results suggest that although older adults had worse recognition than young adults overall, effects of emotion on memory do not seem to be modified by aging. Results on immediate recognition suggest that recognition of low arousing excerpts can be affected by valence, with better memory for positive relative to negative low arousing music. However, 24 h delayed recognition results demonstrate effects of emotion on memory consolidation regardless of age, with a recognition benefit for high arousal and for negatively valenced music. The present study highlights the role of emotion on memory consolidation. Findings are examined in light of the literature on emotional memory for music and for other stimuli. We finally discuss the implication of the present results for potential music interventions in aging and dementia.

  18. Emotional memory for musical excerpts in young and older adults.

    Directory of Open Access Journals (Sweden)

    Irene eAlonso

    2015-03-01

    Full Text Available The emotions evoked by music can enhance recognition of excerpts. It has been suggested that memory is better for high than for low arousing music (Eschrich et al., 2005; Samson et al., 2009), but it remains unclear whether positively (Eschrich et al., 2008) or negatively valenced music (Aubé et al., 2013; Vieillard and Gilet, 2013) may be better recognized. Moreover, we still know very little about the influence of age on emotional memory for music. To address these issues, we tested emotional memory for music in young and older adults using musical excerpts varying in terms of arousal and valence. Participants completed immediate and 24 h delayed recognition tests. We predicted highly arousing excerpts to be better recognized by both groups in immediate recognition. We hypothesized that arousal may compensate consolidation deficits in aging, thus showing more prominent benefit of high over low arousing stimuli in older than younger adults on delayed recognition. We also hypothesized worst retention of negative excerpts for the older group, resulting in a recognition benefit for positive over negative excerpts specific to older adults. Our results suggest that although older adults had worse recognition than young adults overall, effects of emotion on memory do not seem to be modified by aging. Results on immediate recognition suggest that recognition of low arousing excerpts can be affected by valence, with better memory for positive relative to negative low arousing music. However, 24 h delayed recognition results demonstrate effects of emotion on memory consolidation regardless of age, with a recognition benefit for high arousal and for negatively valenced music. The present study highlights the role of emotion on memory consolidation. Findings are examined in light of the literature on emotional memory for music and for other stimuli. We finally discuss the implication of the present results for potential music interventions in aging and

  19. Children's expressions of negative emotions and adults' responses during routine cardiac consultations.

    Science.gov (United States)

    Vatne, Torun M; Ruland, Cornelia M; Ørnes, Knut; Finset, Arnstein

    2012-03-01

    One function of expressing emotion is to receive support. The aim of this study was to assess how children with heart disease express negative emotions during routine consultations, and examine the interaction between children's expressions and adults' responses. Seventy children, aged 7-13 years, completed measures of anxiety and were videotaped during cardiology visits. Adult-child interactions were analyzed using the Verona Definitions of Emotional Sequences. Children expressed negative emotion, mainly in subtle ways; however, adults rarely recognized and responded to these expressions. The frequency of children's expressions and adults' responses were related to the child's age, level of anxiety, and verbal participation. Children do not openly express negative emotions frequently during routine cardiac consultations; they are more likely to provide subtle cues of negative emotion. When expression of negative emotions does occur, adults may consider using the opportunity to explore the child's emotional experiences.

  20. Emotion Attribution to a Non-Humanoid Robot in Different Social Situations

    Science.gov (United States)

    Lakatos, Gabriella; Gácsi, Márta; Konok, Veronika; Brúder, Ildikó; Bereczky, Boróka; Korondi, Péter; Miklósi, Ádám

    2014-01-01

    In the last few years there was an increasing interest in building companion robots that interact in a socially acceptable way with humans. In order to interact in a meaningful way a robot has to convey intentionality and emotions of some sort in order to increase believability. We suggest that human-robot interaction should be considered as a specific form of inter-specific interaction and that human–animal interaction can provide a useful biological model for designing social robots. Dogs can provide a promising biological model since during the domestication process dogs were able to adapt to the human environment and to participate in complex social interactions. In this observational study we propose to design emotionally expressive behaviour of robots using the behaviour of dogs as inspiration and to test these dog-inspired robots with humans in inter-specific context. In two experiments (wizard-of-oz scenarios) we examined humans' ability to recognize two basic and a secondary emotion expressed by a robot. In Experiment 1 we provided our companion robot with two kinds of emotional behaviour (“happiness” and “fear”), and studied whether people attribute the appropriate emotion to the robot, and interact with it accordingly. In Experiment 2 we investigated whether participants tend to attribute guilty behaviour to a robot in a relevant context by examining whether relying on the robot's greeting behaviour human participants can detect if the robot transgressed a predetermined rule. Results of Experiment 1 showed that people readily attribute emotions to a social robot and interact with it in accordance with the expressed emotional behaviour. Results of Experiment 2 showed that people are able to recognize if the robot transgressed on the basis of its greeting behaviour. In summary, our findings showed that dog-inspired behaviour is a suitable medium for making people attribute emotional states to a non-humanoid robot. PMID:25551218

  1. Emotion attribution to a non-humanoid robot in different social situations.

    Directory of Open Access Journals (Sweden)

    Gabriella Lakatos

    Full Text Available In the last few years there was an increasing interest in building companion robots that interact in a socially acceptable way with humans. In order to interact in a meaningful way a robot has to convey intentionality and emotions of some sort in order to increase believability. We suggest that human-robot interaction should be considered as a specific form of inter-specific interaction and that human-animal interaction can provide a useful biological model for designing social robots. Dogs can provide a promising biological model since during the domestication process dogs were able to adapt to the human environment and to participate in complex social interactions. In this observational study we propose to design emotionally expressive behaviour of robots using the behaviour of dogs as inspiration and to test these dog-inspired robots with humans in inter-specific context. In two experiments (wizard-of-oz scenarios) we examined humans' ability to recognize two basic and a secondary emotion expressed by a robot. In Experiment 1 we provided our companion robot with two kinds of emotional behaviour ("happiness" and "fear"), and studied whether people attribute the appropriate emotion to the robot, and interact with it accordingly. In Experiment 2 we investigated whether participants tend to attribute guilty behaviour to a robot in a relevant context by examining whether relying on the robot's greeting behaviour human participants can detect if the robot transgressed a predetermined rule. Results of Experiment 1 showed that people readily attribute emotions to a social robot and interact with it in accordance with the expressed emotional behaviour. Results of Experiment 2 showed that people are able to recognize if the robot transgressed on the basis of its greeting behaviour. In summary, our findings showed that dog-inspired behaviour is a suitable medium for making people attribute emotional states to a non-humanoid robot.

  2. When familiarity breeds accuracy: cultural exposure and facial emotion recognition.

    Science.gov (United States)

    Elfenbein, Hillary Anger; Ambady, Nalini

    2003-08-01

    Two studies provide evidence for the role of cultural familiarity in recognizing facial expressions of emotion. For Chinese located in China and the United States, Chinese Americans, and non-Asian Americans, accuracy and speed in judging Chinese and American emotions was greater with greater participant exposure to the group posing the expressions. Likewise, Tibetans residing in China and Africans residing in the United States were faster and more accurate when judging emotions expressed by host versus nonhost society members. These effects extended across generations of Chinese Americans, seemingly independent of ethnic or biological ties. Results suggest that the universal affect system governing emotional expression may be characterized by subtle differences in style across cultures, which become more familiar with greater cultural contact.

  3. Impaired emotion recognition is linked to alexithymia in heroin addicts

    Directory of Open Access Journals (Sweden)

    Giuseppe Craparo

    2016-04-01

    Full Text Available Several investigations document altered emotion processing in opiate addiction. Nevertheless, the origin of this phenomenon remains unclear. Here we examined the role of alexithymia in the ability (i.e., number of errors [accuracy] and reaction times [RTs]) of thirty-one heroin addicts and thirty-one healthy controls to detect several affective expressions. Results show generally lower accuracy and higher RTs in the recognition of facial expressions of emotions for patients, compared to controls. The hierarchical multivariate regression analysis shows that alexithymia might be responsible for the between-group difference with respect to the RTs in emotion detection. Overall, we provide new insights into the clinical interpretation of affective deficits in heroin addicts, suggesting a role of alexithymia in their ability to recognize emotions.

  4. Transcutaneous vagus nerve stimulation (tVNS) enhances recognition of emotions in faces but not bodies.

    Science.gov (United States)

    Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S

    2018-02-01

    The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Opportunities for emotional intelligence in the context of nursing

    Directory of Open Access Journals (Sweden)

    Lubica Ilievová

    2013-04-01

    Full Text Available Introduction: Emotional intelligence is the ability to recognize and control one's own emotions as well as the emotions of other people. There are two orientations in studying emotional intelligence, which differ in whether they treat it as an ability or as a personality trait. The use of emotional intelligence is currently understood as a fundamental requirement of nursing care provision to patients. Methods: In a research study conducted with a group of nursing students (n = 86), we examined emotional intelligence both as an ability and as a trait. We used the SIT-EMO (Situational Test of Emotional Understanding) scales to assess emotional intelligence as an ability, and the SEIS (Schutte Emotional Intelligence Scale) to measure emotional intelligence as a trait. In the context of nursing, we assessed emotional self-efficacy in relation to geriatric patients (ESE-GP). The TEIQue-SF (Trait Emotional Intelligence Questionnaire – short form) method was used to set up our own questionnaire. Results: We measured the extent of emotional intelligence and analyzed it from the viewpoint of its conception as a trait, as an ability, and as emotional self-efficacy in relation to geriatric patients. We found lower levels in the social awareness, emotional management and stress management dimensions of the nursing students. Conclusion: Emotional intelligence as an ability of nursing students can be enhanced through psychological and social training. Emotional intelligence has an impact on social and communication skills, which are a precondition of effective nursing care.

  6. Recognizing the Centrality of Emotion in Diversity Courses: Commentary on "Gender in the Management Education Classroom"

    Science.gov (United States)

    Spelman, Duncan

    2010-01-01

    This commentary adds to the analysis and recommendations presented in "Gender in the Management Education Classroom" concerning a very challenging incident focused on powerful gender/diversity dynamics. It discusses the centrality of emotion in students' experiences of diversity discussions and calls for teachers to explicitly help students…

  7. Emotional Barriers to Successful Reemployment: Implications for Counselors.

    Science.gov (United States)

    Guindon, Mary H.; Smith, Barrett

    2002-01-01

    Common responses to job loss include stress reactions, depression and anxiety, and lowered self-esteem. This article describes these common reactions to job loss and unemployment, explains how to recognize their symptoms, and discusses ways counselors can address these emotional barriers to finding meaningful employment. (Contains 50 references.)…

  8. Predictable Chaos: A Review of the Effects of Emotions on Attention, Memory and Decision Making

    Science.gov (United States)

    LeBlanc, Vicki R.; McConnell, Meghan M.; Monteiro, Sandra D.

    2015-01-01

    Healthcare practice and education are highly emotional endeavors. While this is recognized by educators and researchers seeking to develop interventions aimed at improving wellness in health professionals and at providing them with skills to deal with emotional interpersonal situations, the field of health professions education has largely ignored…

  9. Associations between facial emotion recognition and young adolescents' behaviors in bullying.

    Directory of Open Access Journals (Sweden)

    Tiziana Pozzoli

    Full Text Available This study investigated whether the different behaviors young adolescents can enact during bullying episodes were associated with their ability to recognize morphed facial expressions of the six basic emotions, expressed at high and low intensity. The sample included 117 middle-school students (45.3% girls; mean age = 12.4 years) who filled in a peer nomination questionnaire and individually performed a computerized emotion recognition task. Bayesian generalized mixed-effects models showed a complex picture, in which type and intensity of emotions, students' behavior and gender interacted in explaining recognition accuracy. Results were discussed with a particular focus on negative emotions, suggesting a "neutral" nature of emotion recognition ability, which does not necessarily lead to moral behavior but can also be used for pursuing immoral goals.

  10. Associations between facial emotion recognition and young adolescents’ behaviors in bullying

    Science.gov (United States)

    Gini, Gianluca; Altoè, Gianmarco

    2017-01-01

    This study investigated whether the different behaviors young adolescents can enact during bullying episodes were associated with their ability to recognize morphed facial expressions of the six basic emotions, expressed at high and low intensity. The sample included 117 middle-school students (45.3% girls; mean age = 12.4 years) who filled in a peer nomination questionnaire and individually performed a computerized emotion recognition task. Bayesian generalized mixed-effects models showed a complex picture, in which type and intensity of emotions, students’ behavior and gender interacted in explaining recognition accuracy. Results were discussed with a particular focus on negative emotions, suggesting a “neutral” nature of emotion recognition ability, which does not necessarily lead to moral behavior but can also be used for pursuing immoral goals. PMID:29131871
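
    The trial-level analysis described in these two records can be approximated, in a much simplified form, by an ordinary logistic regression with interaction terms. The sketch below uses a hypothetical trials DataFrame with assumed column names, and it omits the Bayesian estimation and the random effects of the original mixed-effects models.

      # Hedged, simplified stand-in for a Bayesian generalized mixed-effects analysis.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      n = 1000
      # hypothetical trial-level data (all column names and levels are assumptions)
      trials = pd.DataFrame({
          "correct": rng.integers(0, 2, n),
          "emotion": rng.choice(["anger", "fear", "happiness"], n),
          "intensity": rng.choice(["low", "high"], n),
          "role": rng.choice(["bully", "defender", "victim"], n),
          "gender": rng.choice(["girl", "boy"], n),
      })

      # Logistic regression with interactions; the original work additionally modeled
      # random effects per participant and stimulus within a Bayesian framework.
      model = smf.glm("correct ~ emotion * intensity * role + gender",
                      data=trials, family=sm.families.Binomial()).fit()
      print(model.params.round(3))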

  11. Emotions in relation to healthcare encounters affecting self-esteem.

    Science.gov (United States)

    Räty, Lena; Gustafsson, Barbro

    2006-02-01

    This study identifies emotions in patients with epilepsy as a result of confirming and disconfirming healthcare experiences. A discussion of emotions as a motive for patients' goal-directed actions was a further aim of this study. The critical incident method was used for data collection. Emotions occurring in confirming and disconfirming healthcare encounters were analyzed using the Belief-Desire Theory of Emotions and were categorized as basic, complex, or self-evaluating. Confirming encounters aroused emotions like hope, a feeling of security, joy, relief, and pride, while disconfirming encounters aroused emotions like despair, fear, unrest, resignation, shame, and guilt. The emotions identified in the healthcare encounters were recognized as motives for action. An emotion such as a feeling of security aroused a desire in the patients to strengthen their positive self and motivated them to have a constructive and sympathetic attitude toward the healthcare experience. An emotion such as anger caused patients to strive to maintain their self-respect either by avoiding difficult situations and ignoring the problem (patients with a low self-esteem) or by trying to re-create a positive self-image (patients with a high self-esteem). Healthcare encounters between patient and caregiver considerably affect the patient's emotional status and thereby his or her well-being. The importance of establishing healthcare encounters that evoke positive emotions that strengthen patients' resources must be addressed in future nursing care.

  12. Does Facial Amimia Impact the Recognition of Facial Emotions? An EMG Study in Parkinson’s Disease

    Science.gov (United States)

    Argaud, Soizic; Delplanque, Sylvain; Houvenaghel, Jean-François; Auffret, Manon; Duprez, Joan; Vérin, Marc; Grandjean, Didier; Sauleau, Paul

    2016-01-01

    According to embodied simulation theory, understanding other people’s emotions is fostered by facial mimicry. However, studies assessing the effect of facial mimicry on the recognition of emotion are still controversial. In Parkinson’s disease (PD), one of the most distinctive clinical features is facial amimia, a reduction in facial expressiveness, but patients also show emotional disturbances. The present study used the pathological model of PD to examine the role of facial mimicry on emotion recognition by investigating EMG responses in PD patients during a facial emotion recognition task (anger, joy, neutral). Our results evidenced a significant decrease in facial mimicry for joy in PD, essentially linked to the absence of reaction of the zygomaticus major and the orbicularis oculi muscles in response to happy avatars, whereas facial mimicry for expressions of anger was relatively preserved. We also confirmed that PD patients were less accurate in recognizing positive and neutral facial expressions and highlighted a beneficial effect of facial mimicry on the recognition of emotion. We thus provide additional arguments for embodied simulation theory suggesting that facial mimicry is a potential lever for therapeutic actions in PD even if it seems not to be necessarily required in recognizing emotion as such. PMID:27467393

  13. Lesion Mapping the Four-Factor Structure of Emotional Intelligence

    Science.gov (United States)

    Operskalski, Joachim T.; Paul, Erick J.; Colom, Roberto; Barbey, Aron K.; Grafman, Jordan

    2015-01-01

    Emotional intelligence (EI) refers to an individual’s ability to process and respond to emotions, including recognizing the expression of emotions in others, using emotions to enhance thought and decision making, and regulating emotions to drive effective behaviors. Despite their importance for goal-directed social behavior, little is known about the neural mechanisms underlying specific facets of EI. Here, we report findings from a study investigating the neural bases of these specific components for EI in a sample of 130 combat veterans with penetrating traumatic brain injury. We examined the neural mechanisms underlying experiential (perceiving and using emotional information) and strategic (understanding and managing emotions) facets of EI. Factor scores were submitted to voxel-based lesion symptom mapping to elucidate their neural substrates. The results indicate that two facets of EI (perceiving and managing emotions) engage common and distinctive neural systems, with shared dependence on the social knowledge network, and selective engagement of the orbitofrontal and parietal cortex for strategic aspects of emotional information processing. The observed pattern of findings suggests that sub-facets of experiential and strategic EI can be characterized as separable but related processes that depend upon a core network of brain structures within frontal, temporal and parietal cortex. PMID:26858627
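
    Voxel-based lesion-symptom mapping of the kind mentioned in this record can be sketched as a mass-univariate comparison of behavioral scores between lesioned and non-lesioned subjects at every voxel. The binary lesion matrix, the EI factor scores, the minimum-overlap rule, and the Bonferroni threshold below are all illustrative assumptions, not the study's actual pipeline.

      # Hedged sketch of voxel-based lesion-symptom mapping on simulated inputs.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      n_subj, n_vox = 130, 5000
      lesions = rng.integers(0, 2, size=(n_subj, n_vox))   # 1 = voxel lesioned
      ei_score = rng.normal(size=n_subj)                    # one EI factor score per subject

      t_map = np.zeros(n_vox)
      p_map = np.ones(n_vox)
      for v in range(n_vox):
          mask = lesions[:, v] == 1
          if 5 <= mask.sum() <= n_subj - 5:                 # require minimum lesion overlap
              t_map[v], p_map[v] = stats.ttest_ind(ei_score[mask], ei_score[~mask])

      # Simple Bonferroni threshold; real analyses use permutation or FDR correction.
      significant = np.flatnonzero(p_map < 0.05 / n_vox)
      print(len(significant), "voxels survive correction")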

  14. Medical students' reflections on emotions concerning breaking bad news.

    Science.gov (United States)

    Toivonen, Asta Kristiina; Lindblom-Ylänne, Sari; Louhiala, Pekka; Pyörälä, Eeva

    2017-10-01

    To gain a deeper understanding of fourth-year medical students' reflections on emotions in the context of breaking bad news (BBN). During the years 2010-2012, students reflected on their emotions concerning BBN in a learning assignment at the end of the communication skills course. The students were asked to write a description of how they felt about a BBN case. The reflections were analysed using qualitative content analysis. 351 students agreed to participate in the study. We recognized ten categories in students' reflections, namely empathy, insecurity, anxiety, sadness, ambivalence, guilt, hope, frustration, gratefulness and emotional detachment. Most students expressed empathy, but there was a clear tension between feeling empathy and retaining professional distance by emotional detachment. Students experience strong and perplexing emotions during their studies, especially in challenging situations. A deeper understanding of students' emotions is valuable for supporting students' professional development and coping in their work in the future. Medical students need opportunities to reflect on emotional experiences during their education to find strategies for coping with them. Emotions should be actively discussed in studies where the issues of BBN are addressed. Teachers need education in attending to emotional issues constructively. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Brain Structural Correlates of Emotion Recognition in Psychopaths.

    Directory of Open Access Journals (Sweden)

    Vanessa Pera-Guardiola

    Full Text Available Individuals with psychopathy present deficits in the recognition of facial emotional expressions. However, the nature and extent of these alterations are not fully understood. Furthermore, available data on the functional neural correlates of emotional face recognition deficits in adult psychopaths have provided mixed results. In this context, emotional face morphing tasks may be suitable for clarifying mild and emotion-specific impairments in psychopaths. Likewise, studies exploring corresponding anatomical correlates may be useful for disentangling available neurofunctional evidence based on the alleged neurodevelopmental roots of psychopathic traits. We used Voxel-Based Morphometry and a morphed emotional face expression recognition task to evaluate the relationship between regional gray matter (GM) volumes and facial emotion recognition deficits in male psychopaths. In comparison to male healthy controls, psychopaths showed deficits in the recognition of sad, happy and fear emotional expressions. In subsequent brain imaging analyses psychopaths with better recognition of facial emotional expressions showed higher volume in the prefrontal cortex (orbitofrontal, inferior frontal and dorsomedial prefrontal cortices), somatosensory cortex, anterior insula, cingulate cortex and the posterior lobe of the cerebellum. Amygdala and temporal lobe volumes contributed to better emotional face recognition in controls only. These findings provide evidence suggesting that variability in brain morphometry plays a role in accounting for psychopaths' impaired ability to recognize emotional face expressions, and may have implications for comprehensively characterizing the empathy and social cognition dysfunctions typically observed in this population of subjects.

  16. One approach to design of speech emotion database

    Science.gov (United States)

    Uhrin, Dominik; Chmelikova, Zdenka; Tovarek, Jaromir; Partila, Pavol; Voznak, Miroslav

    2016-05-01

    This article describes a system for evaluating the credibility of recordings with emotional character. The sound recordings form a Czech-language database for training and testing speech emotion recognition systems, which are designed to detect human emotions in the voice. Information about a speaker's emotional state is useful to security forces and to emergency call services. People in action (soldiers, police officers and firefighters) are often exposed to stress, and information about their emotional state, carried in the voice, helps the dispatcher adapt control commands during an intervention. Call agents of an emergency call service must recognize the mental state of the caller to adjust the mood of the conversation; in this case, evaluation of the psychological state is the key factor for a successful intervention. A quality database of sound recordings is essential for creating such systems. Quality databases already exist, such as the Berlin Database of Emotional Speech or Humaine, but they were created by actors in an audio studio, which means that the recordings contain simulated emotions rather than real ones. Our research aims at creating a database of Czech emotional recordings of real human speech. Collecting sound samples for the database is only one of the tasks; another, no less important, is to evaluate the significance of the recordings from the perspective of emotional states. The design of a methodology for evaluating the credibility of emotional recordings is described in this article. The results describe the advantages and applicability of the developed method.
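
    One plausible way to score the credibility of an emotional recording, assuming listener judgments are collected for each clip, is the proportion of listeners whose perceived emotion matches the intended one. The function below is a hypothetical sketch of such a measure, not the methodology actually developed in the paper.

      # Hedged sketch: agreement-based credibility score for one recording.
      from collections import Counter

      def credibility(intended: str, listener_labels: list[str]) -> float:
          """Fraction of listeners who heard the intended emotion."""
          counts = Counter(listener_labels)
          return counts[intended] / len(listener_labels)

      labels = ["anger", "anger", "fear", "anger", "anger", "neutral"]
      print(credibility("anger", labels))   # ~0.67; keep the clip if above a chosen threshold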

  17. [Impact of facial emotional recognition alterations in Dementia of the Alzheimer type].

    Science.gov (United States)

    Rubinstein, Wanda; Cossini, Florencia; Politis, Daniel

    2016-07-01

    Facial recognition of basic emotions is independent of other deficits in dementia of the Alzheimer type. Among these deficits, there is disagreement about which emotions are more difficult to recognize. Our aim was to study the presence of alterations in the process of facial recognition of basic emotions, and to investigate whether there were differences in the recognition of each type of emotion in Alzheimer's disease. With three tests of recognition of basic facial emotions we evaluated 29 patients who had been diagnosed with dementia of the Alzheimer type and 18 control subjects. Significant differences were obtained in the tests of recognition of basic facial emotions, and between each emotion. Since the amygdala, one of the brain structures responsible for emotional reaction, is affected in the early stages of this disease, our findings are relevant to understanding how this alteration of the process of emotional recognition contributes to the difficulties these patients have in interpersonal relations and to their behavioral disorders.

  18. Basic Emotions in the Nencki Affective Word List (NAWL BE): New Method of Classifying Emotional Stimuli.

    Directory of Open Access Journals (Sweden)

    Małgorzata Wierzba

    Full Text Available The Nencki Affective Word List (NAWL) has recently been introduced as a standardized database of Polish words suitable for studying various aspects of language and emotions. Though the NAWL was originally based on the most commonly used dimensional approach, it is not the only way of studying emotions. Another framework is based on discrete emotional categories. Since the two perspectives are recognized as complementary, the aim of the present study was to supplement the NAWL database by the addition of categories corresponding to basic emotions. Thus, 2902 Polish words from the NAWL were presented to 265 subjects, who were instructed to rate them according to the intensity of each of the five basic emotions: happiness, anger, sadness, fear and disgust. The general characteristics of the present word database, as well as the relationships between the studied variables are shown to be consistent with typical patterns found in previous studies using similar databases for different languages. Here we present the Basic Emotions in the Nencki Affective Word List (NAWL BE) as a database of verbal material suitable for highly controlled experimental research. To make the NAWL more convenient to use, we introduce a comprehensive method of classifying stimuli to basic emotion categories. We discuss the advantages of our method in comparison to other methods of classification. Additionally, we provide an interactive online tool (http://exp.lobi.nencki.gov.pl/nawl-analysis) to help researchers browse and interactively generate classes of stimuli to meet their specific requirements.
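
    For illustration only, the snippet below shows one simple way such a classification could work: assign a word to the basic emotion with the highest mean intensity rating, provided that rating is high enough and clearly dominates the runner-up. The rating scale, thresholds and example words are invented; the actual NAWL BE classification method is the one described in the paper and implemented in its online tool.

```python
# Illustrative sketch only: a simple dominant-emotion rule for assigning verbal
# stimuli to basic emotion categories from mean intensity ratings. Thresholds
# and example ratings are made up; the NAWL BE procedure is more elaborate.
from dataclasses import dataclass

@dataclass
class Word:
    text: str
    ratings: dict  # mean intensity rating per basic emotion (assumed 1-7 scale)

def classify(word: Word, min_intensity: float = 3.5, min_margin: float = 1.0) -> str:
    """Return the dominant basic emotion, or 'unclassified' if none dominates."""
    ranked = sorted(word.ratings.items(), key=lambda kv: kv[1], reverse=True)
    (top_emotion, top_value), (_, runner_up) = ranked[0], ranked[1]
    if top_value >= min_intensity and top_value - runner_up >= min_margin:
        return top_emotion
    return "unclassified"

words = [
    Word("festival", {"happiness": 5.8, "anger": 1.2, "sadness": 1.1, "fear": 1.3, "disgust": 1.0}),
    Word("betrayal", {"happiness": 1.1, "anger": 4.9, "sadness": 4.6, "fear": 2.8, "disgust": 3.9}),
]
for w in words:
    print(w.text, "->", classify(w))
```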

  19. Color Spectrum Properties of Pure and Non-Pure Latex in Discriminating Rubber Clone Series

    International Nuclear Information System (INIS)

    Noor Aishah Khairuzzaman; Hadzli Hashim; Nina Korlina Madzhi; Noor Ezan Abdullah; Faridatul Aima Ismail; Ahmad Faiz Sampian; Azhana Fatnin Che Will

    2015-01-01

    A study of the color spectrum properties of pure and non-pure latex for discriminating rubber clone series is presented in this paper. Five types of clones from the same series were used as samples: RRIM2002, RRIM2007, RRIM2008, RRIM2014, and RRIM3001. The main objective is to identify the significant color spectrum components (RGB) of pure and non-pure latex that can discriminate the rubber clone series. The significant color spectrum properties of pure and non-pure latex were determined using a spectrometer and the Statistical Package for the Social Sciences (SPSS). Visible light (VIS) from the spectrometer was used to illuminate the surface of each latex sample. Further numerical analysis of the color spectrum properties was then conducted with SPSS. In conclusion, the blue color spectrum of non-pure latex is able to discriminate all rubber clone series, whereas for pure latex only certain color spectra can differentiate several clone series. (author)

  20. Emotion and language: Valence and arousal affect word recognition

    Science.gov (United States)

    Brysbaert, Marc; Warriner, Amy Beth

    2014-01-01

    Emotion influences most aspects of cognition and behavior, but emotional factors are conspicuously absent from current models of word recognition. The influence of emotion on word recognition has mostly been reported in prior studies on the automatic vigilance for negative stimuli, but the precise nature of this relationship is unclear. Various models of automatic vigilance have claimed that the effect of valence on response times is categorical, an inverted-U, or interactive with arousal. The present study used a sample of 12,658 words, and included many lexical and semantic control factors, to determine the precise nature of the effects of arousal and valence on word recognition. Converging empirical patterns observed in word-level and trial-level data from lexical decision and naming indicate that valence and arousal exert independent monotonic effects: Negative words are recognized more slowly than positive words, and arousing words are recognized more slowly than calming words. Valence explained about 2% of the variance in word recognition latencies, whereas the effect of arousal was smaller. Valence and arousal do not interact, but both interact with word frequency, such that valence and arousal exert larger effects among low-frequency words than among high-frequency words. These results necessitate a new model of affective word processing whereby the degree of negativity monotonically and independently predicts the speed of responding. This research also demonstrates that incorporating emotional factors, especially valence, improves the performance of models of word recognition. PMID:24490848
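
    As a purely illustrative companion to the analysis described above, the following sketch fits an item-level regression of response latencies on valence, arousal and log word frequency, with both affective variables allowed to interact with frequency. The data are simulated with made-up coefficients; this is not the authors' dataset or analysis code.

```python
# Synthetic illustration of the kind of item-level analysis described above:
# regress lexical decision latencies on valence, arousal and word frequency,
# letting the affective effects shrink as frequency increases.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "valence": rng.uniform(1, 9, n),   # 1 = very negative, 9 = very positive
    "arousal": rng.uniform(1, 9, n),   # 1 = calming, 9 = arousing
    "log_freq": rng.normal(3, 1, n),   # log word frequency
})
# Simulated latencies: slower for negative, arousing, low-frequency words,
# with smaller affective effects for high-frequency words (invented values).
df["rt"] = (700 - 6 * df.valence + 3 * df.arousal - 30 * df.log_freq
            + 1.5 * df.valence * df.log_freq - 0.8 * df.arousal * df.log_freq
            + rng.normal(0, 40, n))

model = smf.ols("rt ~ (valence + arousal) * log_freq", data=df).fit()
print(model.summary().tables[1])
```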

  1. Teaching Children with Autism Spectrum Disorder to Recognize and Express Emotion: A Review of the Literature

    Science.gov (United States)

    Daou, Nidal; Hady, Ryma T.; Poulson, Claire L.

    2016-01-01

    The developmental literature has focused extensively on deficits in the expression and recognition of emotion in people with autism, and has reported on the use of interactive tools to address the problems of affect. The behavioral literature has offered interventions to teach children with autism to engage in appropriate affective displays, and…

  2. Sex differences in emotion recognition: Evidence for a small overall female superiority on facial disgust.

    Science.gov (United States)

    Connolly, Hannah L; Lefevre, Carmen E; Young, Andrew W; Lewis, Gary J

    2018-05-21

    Although it is widely believed that females outperform males in the ability to recognize other people's emotions, this conclusion is not well supported by the extant literature. The current study sought to provide a strong test of the female superiority hypothesis by investigating sex differences in emotion recognition for five basic emotions using stimuli well-calibrated for individual differences assessment, across two expressive domains (face and body), and in a large sample (N = 1,022: Study 1). We also assessed the stability and generalizability of our findings with two independent replication samples (N = 303: Study 2, N = 634: Study 3). In Study 1, we observed that females were superior to males in recognizing facial disgust and sadness. In contrast, males were superior to females in recognizing bodily happiness. The female superiority for recognition of facial disgust was replicated in Studies 2 and 3, and this observation also extended to an independent stimulus set in Study 2. No other sex differences were stable across studies. These findings provide evidence for the presence of sex differences in emotion recognition ability, but show that these differences are modest in magnitude and appear to be limited to facial disgust. We discuss whether this sex difference may reflect human evolutionary imperatives concerning reproductive fitness and child care. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  3. Emotional memory for musical excerpts in young and older adults

    OpenAIRE

    Alonso, Irene; Dellacherie, Delphine; Samson, Séverine

    2015-01-01

    The emotions evoked by music can enhance recognition of excerpts. It has been suggested that memory is better for high than for low arousing music (Eschrich et al., 2005; Samson et al., 2009), but it remains unclear whether positively (Eschrich et al., 2008) or negatively valenced music (Aubé et al., 2013; Vieillard and Gilet, 2013) may be better recognized. Moreover, we still know very little about the influence of age on emotional memory for music. To address these i...

  4. Differential judgement of static facial expressions of emotions in three cultures.

    Science.gov (United States)

    Huang, Y; Tang, S; Helmeste, D; Shioiri, T; Someya, T

    2001-10-01

    Judging facial expressions of emotions has important clinical value in the assessment of psychiatric patients. Judging facial emotional expressions in foreign patients, however, is not always easy. Controversy has existed in previous reports on cultural differences in identifying static facial expressions of emotions. While it has been argued that emotional expressions on the face are universally recognized, the experimental data obtained have not been entirely supportive. Using the data reported in the literature, our previous pilot study showed that Japanese viewers interpreted many emotional expressions differently from USA viewers of the same emotions. In order to explore such discrepancies further, we conducted the same experiments on Chinese subjects residing in Beijing. The data showed that, similar to the Japanese viewers, Chinese viewers also judged many static facial emotional expressions differently from USA viewers. The combined results of the Chinese and the Japanese experiments suggest a major cross-cultural difference between American and Asian viewers in identifying some static facial emotional expressions, particularly when the posed emotion has negative connotations. The results have important implications for cross-cultural communication when facial emotional expressions are presented as static images.

  5. Gender differences in identifying emotions from auditory and visual stimuli.

    Science.gov (United States)

    Waaramaa, Teija

    2017-12-01

    The present study focused on gender differences in emotion identification from auditory and visual stimuli produced by two male and two female actors. Differences in emotion identification from nonsense samples, language samples and prolonged vowels were investigated. The study also examined whether auditory stimuli can convey the emotional content of speech without visual stimuli, and whether visual stimuli can convey the emotional content of speech without auditory stimuli. The aim was to gain better knowledge of vocal attributes and a more holistic understanding of the nonverbal communication of emotion. Females tended to be more accurate in emotion identification than males. Voice quality parameters played a role in emotion identification in both genders. The emotional content of the samples was conveyed best by nonsense sentences, better than by prolonged vowels or by a shared native language between speakers and participants. Thus, vocal non-verbal communication tends to affect the interpretation of emotion even in the absence of language. The emotional stimuli were recognized better from visual than from auditory stimuli by both genders. Visual information about speech may not be connected to the language; instead, it may be based on the human ability to understand the kinetic movements of speech production more readily than the characteristics of the acoustic cues.

  6. The voice conveys emotion in ten globalized cultures and one remote village in Bhutan.

    Science.gov (United States)

    Cordaro, Daniel T; Keltner, Dacher; Tshering, Sumjay; Wangchuk, Dorji; Flynn, Lisa M

    2016-02-01

    With data from 10 different globalized cultures and 1 remote, isolated village in Bhutan, we examined universals and cultural variations in the recognition of 16 nonverbal emotional vocalizations. College students in 10 nations (Study 1) and villagers in remote Bhutan (Study 2) were asked to match emotional vocalizations to 1-sentence stories of the same valence. Guided by previous conceptualizations of recognition accuracy, across both studies, 7 of the 16 vocal burst stimuli were found to have strong or very strong recognition in all 11 cultures, 6 vocal bursts were found to have moderate recognition, and 4 were not universally recognized. All vocal burst stimuli varied significantly in terms of the degree to which they were recognized across the 11 cultures. Our discussion focuses on the implications of these results for current debates concerning the emotion conveyed in the voice. (c) 2016 APA, all rights reserved).

  7. Teachers' Emotions and Classroom Effectiveness: Implications from Recent Research

    Science.gov (United States)

    Sutton, Rosemary E.

    2005-01-01

    Cognition, motivation, and emotions are recognized by psychologists as the three fundamental classes of mental operations, yet most research in educational psychology has focused on the first two classes. Educational psychology textbooks for preservice teachers contain chapters on learning, problem solving, assessment, and motivation, but not on…

  8. The effect of emotion on keystroke: an experimental study using facial feedback hypothesis.

    Science.gov (United States)

    Tsui, Wei-Hsuan; Lee, Poming; Hsiao, Tzu-Chien

    2013-01-01

    Automatic emotion recognition technology is an important part of building intelligent systems that prevent computers from acting inappropriately. A novel approach to recognizing a user's emotional state from keystroke typing patterns on a standard keyboard was developed in recent years. However, the phenomenon itself has received very limited investigation in the previous literature. Hence, in our study, we conducted a controlled experiment to collect subjects' keystroke data in different emotional states induced by facial feedback. We examined the differences in the keystroke data between positive and negative emotional states. The results show significant differences in typing patterns under positive and negative emotions for all subjects. Our study provides evidence for the feasibility of developing keystroke-based emotion recognition.
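
    The record does not list the feature set used, but keystroke-based emotion recognition studies typically work from timing features such as key hold (dwell) times and between-key (flight) times. The sketch below is a hypothetical illustration of extracting such features from timestamped key events recorded under different induced emotions; the event format and example values are assumptions, not the authors' data.

```python
# Hypothetical sketch of simple keystroke-timing features that such studies
# compare across induced emotional states: dwell times, flight times and
# overall typing rate. The (key, press, release) event format is assumed.
from statistics import mean

def keystroke_features(events):
    """events: list of (key, press_time_s, release_time_s), ordered by press time."""
    dwell = [release - press for _, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return {
        "mean_dwell": mean(dwell),
        "mean_flight": mean(flight) if flight else 0.0,
        "typing_rate": len(events) / (events[-1][2] - events[0][1]),
    }

positive_session = [("h", 0.00, 0.09), ("i", 0.15, 0.23), ("!", 0.30, 0.38)]
negative_session = [("h", 0.00, 0.14), ("i", 0.35, 0.50), (".", 0.80, 0.97)]

print("positive:", keystroke_features(positive_session))
print("negative:", keystroke_features(negative_session))
```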

  9. Emotional false memories in children with learning disabilities.

    Science.gov (United States)

    Mirandola, Chiara; Losito, Nunzia; Ghetti, Simona; Cornoldi, Cesare

    2014-02-01

    Research has shown that children with learning disabilities (LD) are less prone to evince associative illusions of memory as a result of impairments in their ability to engage in semantic processing. However, it is unclear whether this observation is true for scripted life events, especially if they include emotional content, or across a broad spectrum of learning disabilities. The present study addressed these issues by assessing recognition memory for script-like information in children with nonverbal learning disability (NLD), children with dyslexia, and typically developing children (N=51). Participants viewed photographs about 8 common events (e.g., family dinner), and embedded in each episode was either a negative or a neutral consequence of an unseen action. Children's memory was then tested on a yes/no recognition task that included old and new photographs. Results showed that the three groups performed similarly in recognizing target photographs, but exhibited differences in memory errors. Compared to other groups, children with NLD were more likely to falsely recognize photographs that depicted an unseen cause of an emotional seen event and associated more "Remember" responses to these errors. Children with dyslexia were equally likely to falsely recognize both unseen causes of seen photographs and photographs generally consistent with the script, whereas the other participant groups were more likely to falsely recognize unseen causes rather than script-consistent distractors. Results are interpreted in terms of mechanisms underlying false memories' formation in different clinical populations of children with LD. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Identifying emotional intelligence in professional nursing practice.

    Science.gov (United States)

    Kooker, Barbara Molina; Shoultz, Jan; Codier, Estelle E

    2007-01-01

    The National Center for Health Workforce Analysis projects that the shortage of registered nurses in the United States will double by 2010 and will nearly quadruple to 20% by 2015 (Bureau of Health Professionals Health Resources and Services Administration. [2002]. Projected supply, demand, and shortages of registered nurses, 2000-2020 [On-line]. Available: http:bhpr.hrsa.gov/healthworkforce/reports/rnprojects/report.htm). The purpose of this study was to use the conceptual framework of emotional intelligence to analyze nurses' stories about their practice to identify factors that could be related to improved nurse retention and patient/client outcomes. The stories reflected evidence of the competencies and domains of emotional intelligence and were related to nurse retention and improved outcomes. Nurses recognized their own strengths and limitations, displayed empathy and recognized client needs, nurtured relationships, used personal influence, and acted as change agents. Nurses were frustrated when organizational barriers conflicted with their knowledge/intuition about nursing practice, their communications were disregarded, or their attempts to create a shared vision and teamwork were ignored. Elements of professional nursing practice, such as autonomy, nurse satisfaction, respect, and the professional practice environment, were identified in the excerpts of the stories. The shortage of practicing nurses continues to be a national issue. The use of emotional intelligence concepts may provide fresh insights into ways to keep nurses engaged in practice and to improve nurse retention and patient/client outcomes.

  11. Play it again Sam: Brain Correlates of Emotional Music Recognition

    Directory of Open Access Journals (Sweden)

    Eckart Altenmüller

    2014-02-01

    Full Text Available Background: Music can elicit strong emotions and can be remembered in connection with these emotions even decades later. Yet, the brain correlates of episodic memory for highly emotional music compared with less emotional music have not been examined. We therefore used fMRI to investigate brain structures activated by emotional processing of short excerpts of film music successfully retrieved from episodic long-term memory. Methods: 18 non-musician volunteers were exposed to 60 structurally similar pieces of film music of 10-second length with high arousal ratings and either less positive or very positive valence ratings. Two similar sets of 30 pieces were created. Each of these was presented to half of the participants during the encoding session outside of the scanner, while all stimuli were used during the second recognition session inside the MRI scanner. During fMRI, each stimulation period (10 sec) was followed by a 20 sec resting period during which participants pressed either the "old" or the "new" button to indicate whether they had heard the piece before. Results: Musical stimuli vs. silence activated the bilateral superior temporal gyrus, right insula, right middle frontal gyrus, bilateral medial frontal gyrus and the left anterior cerebellum. Old pieces led to activation in the left medial dorsal thalamus and left midbrain compared to new pieces. For recognized vs. not recognized old pieces a focused activation in the right inferior frontal gyrus and the left cerebellum was found. Positive pieces activated the left medial frontal gyrus, the left precuneus, the right superior frontal gyrus, the left posterior cingulate, the bilateral middle temporal gyrus, and the left thalamus compared to less positive pieces. Conclusion: Specific brain networks related to memory retrieval and emotional processing of symphonic film music were identified. The results imply that the valence of a music piece is important for memory performance.

  12. The Mediating Role of Principals' Transformational Leadership Behaviors in Promoting Teachers' Emotional Wellness at Work: A Study in Israeli Primary Schools

    Science.gov (United States)

    Berkovich, Izhak; Eyal, Ori

    2017-01-01

    The present study aims to examine whether principals' emotional intelligence (specifically, their ability to recognize emotions in others) makes them more effective transformational leaders, measured by the reframing of teachers' emotions. The study uses multisource data from principals and their teachers in 69 randomly sampled primary schools.…

  13. Capacity for Empathy and Emotional Contagion in Those With Psychopathic Personalities

    Directory of Open Access Journals (Sweden)

    Cherie Luckhurst

    2017-10-01

    Full Text Available People with psychopathic traits are sometimes adept at recognizing the emotions of others and using this knowledge in anti-social ways. However, data from incarcerated psychopaths suggest that they are incapable of true empathy. In this paper, we describe three studies that link psychopathic personality to emotional contagion and empathy, and we offer suggestions for reconciling the seemingly conflicting data. Because most studies of psychopathic personality assess incarcerated respondents, whose data may not be generalizable to non-criminals, participants in the present studies were recruited from the general population. The research confirms that empathy and emotional contagion are positively correlated and that each is negatively correlated with psychopathy, as expected. Unique to these studies is the finding that, when instructed, those with psychopathic traits can easily “catch” the emotions of others via the steps of the emotional contagion pathway, thus implying a capacity for empathy. However, without instruction, those with psychopathic traits did not automatically catch others’ emotions.

  14. Sixth form pure mathematics

    CERN Document Server

    Plumpton, C

    1968-01-01

    Sixth Form Pure Mathematics, Volume 1, Second Edition, is the first of a series of volumes on Pure Mathematics and Theoretical Mechanics for Sixth Form students whose aim is entrance into British and Commonwealth Universities or Technical Colleges. A knowledge of Pure Mathematics up to G.C.E. O-level is assumed and the subject is developed by a concentric treatment in which each new topic is used to illustrate ideas already treated. The major topics of Algebra, Calculus, Coordinate Geometry, and Trigonometry are developed together. This volume covers most of the Pure Mathematics required for t

  15. Caring More and Knowing More Reduces Age-Related Differences in Emotion Perception

    Science.gov (United States)

    Stanley, Jennifer Tehan; Isaacowitz, Derek M.

    2015-01-01

    Traditional emotion perception tasks show that older adults are less accurate than young adults at recognizing facial expressions of emotion. Recently, we proposed that socioemotional factors might explain why older adults seem impaired in lab tasks but less so in everyday life (Isaacowitz & Stanley, 2011). Thus, in the present research we empirically tested whether socioemotional factors such as motivation and familiarity can alter this pattern of age effects. In one task, accountability instructions eliminated age differences in the traditional emotion perception task. Using a novel emotion perception paradigm featuring spontaneous dynamic facial expressions of a familiar romantic partner versus a same-age stranger, we found that age differences in emotion perception accuracy were attenuated in the familiar partner condition, relative to the stranger condition. Taken together, the results suggest that both overall accuracy as well as specific patterns of age effects differ appreciably between traditional emotion perception tasks and emotion perception within a socioemotional context. PMID:26030775

  16. Caring more and knowing more reduces age-related differences in emotion perception.

    Science.gov (United States)

    Stanley, Jennifer Tehan; Isaacowitz, Derek M

    2015-06-01

    Traditional emotion perception tasks show that older adults are less accurate than are young adults at recognizing facial expressions of emotion. Recently, we proposed that socioemotional factors might explain why older adults seem impaired in lab tasks but less so in everyday life (Isaacowitz & Stanley, 2011). Thus, in the present research we empirically tested whether socioemotional factors such as motivation and familiarity can alter this pattern of age effects. In 1 task, accountability instructions eliminated age differences in the traditional emotion perception task. Using a novel emotion perception paradigm featuring spontaneous dynamic facial expressions of a familiar romantic partner versus a same-age stranger, we found that age differences in emotion perception accuracy were attenuated in the familiar partner condition, relative to the stranger condition. Taken together, the results suggest that both overall accuracy as well as specific patterns of age effects differ appreciably between traditional emotion perception tasks and emotion perception within a socioemotional context. (c) 2015 APA, all rights reserved.

  17. Cultural differences in gaze and emotion recognition: Americans contrast more than Chinese.

    Science.gov (United States)

    Stanley, Jennifer Tehan; Zhang, Xin; Fung, Helene H; Isaacowitz, Derek M

    2013-02-01

    We investigated the influence of contextual expressions on emotion recognition accuracy and gaze patterns among American and Chinese participants. We expected Chinese participants would be more influenced by, and attend more to, contextual information than Americans. Consistent with our hypothesis, Americans were more accurate than Chinese participants at recognizing emotions embedded in the context of other emotional expressions. Eye-tracking data suggest that, for some emotions, Americans attended more to the target faces, and they made more gaze transitions to the target face than Chinese. For all emotions except anger and disgust, Americans appeared to use more of a contrasting strategy where each face was individually contrasted with the target face, compared with Chinese who used less of a contrasting strategy. Both cultures were influenced by contextual information, although the benefit of contextual information depended upon the perceptual dissimilarity of the contextual emotions to the target emotion and the gaze pattern employed during the recognition task. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  18. Child's recognition of emotions in robot's face and body

    NARCIS (Netherlands)

    Cohen, I.; Looije, R.; Neerincx, M.A.

    2011-01-01

    Social robots can comfort and support children who have to cope with chronic diseases. In previous studies, a "facial robot", the iCat, proved to show well-recognized emotional expressions that are important in social interactions. The question is whether a mobile robot without a face, the Nao, can

  19. A Model of the Perception of Facial Expressions of Emotion by Humans: Research Overview and Perspectives.

    Science.gov (United States)

    Martinez, Aleix; Du, Shichuan

    2012-05-01

    In cognitive science and neuroscience, there have been two leading models describing how humans perceive and classify facial expressions of emotion: the continuous and the categorical model. The continuous model defines each facial expression of emotion as a feature vector in a face space. This model explains, for example, how expressions of emotion can be seen at different intensities. In contrast, the categorical model consists of C classifiers, each tuned to a specific emotion category. This model explains, among other findings, why the images in a morphing sequence between a happy and a surprise face are perceived as either happy or surprise but not something in between. While the continuous model has a more difficult time justifying this latter finding, the categorical model is not as good when it comes to explaining how expressions are recognized at different intensities or modes. Most importantly, both models have problems explaining how one can recognize combinations of emotion categories such as happily surprised versus angrily surprised versus surprise. To resolve these issues, in the past several years, we have worked on a revised model that justifies the results reported in the cognitive science and neuroscience literature. This model consists of C distinct continuous spaces. Multiple (compound) emotion categories can be recognized by linearly combining these C face spaces. The dimensions of these spaces are shown to be mostly configural. According to this model, the major task for the classification of facial expressions of emotion is precise, detailed detection of facial landmarks rather than recognition. We provide an overview of the literature justifying the model, show how the resulting model can be employed to build algorithms for the recognition of facial expression of emotion, and propose research directions for machine learning and computer vision researchers to keep pushing the state of the art in these areas. We also discuss how the model can
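
    As a toy illustration of the revised model sketched above (C distinct continuous spaces whose outputs can be linearly combined to read out compound categories), the snippet below scores a vector of configural landmark features against per-emotion prototype directions and forms compound categories as weighted sums of the basic-category scores. The features, prototypes and weights are invented for illustration and are not the authors' implementation.

```python
# Toy numerical sketch: per-category continuous scores from configural
# features, plus compound categories as linear combinations of those scores.
# All feature values, prototypes and weights below are made up.
import numpy as np

# Configural features from detected landmarks (e.g. mouth-corner lift,
# brow raise, eye opening), normalised to [0, 1].
face = np.array([0.9, 0.8, 0.7])

# One prototype direction per basic-emotion space.
prototypes = {
    "happy":    np.array([1.0, 0.1, 0.2]),
    "surprise": np.array([0.2, 1.0, 0.9]),
    "angry":    np.array([0.1, 0.8, 0.1]),
}

def category_scores(x):
    # Projection of the face onto each category's continuous space.
    return {k: float(x @ v / np.linalg.norm(v)) for k, v in prototypes.items()}

scores = category_scores(face)

# Compound categories read out as linear combinations of basic-category scores.
compounds = {
    "happily surprised": 0.5 * scores["happy"] + 0.5 * scores["surprise"],
    "angrily surprised": 0.5 * scores["angry"] + 0.5 * scores["surprise"],
}
print(scores)
print(compounds)
```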

  20. Clinicians' recognition and management of emotions during difficult healthcare conversations.

    Science.gov (United States)

    Martin, Elliott B; Mazzola, Natalia M; Brandano, Jessica; Luff, Donna; Zurakowski, David; Meyer, Elaine C

    2015-10-01

    To examine the most commonly reported emotions encountered among healthcare practitioners when holding difficult conversations, including frequency and impact on care delivery. Interprofessional learners from a range of experience levels and specialties completed self-report questionnaires prior to simulation-based communication workshops. Clinicians were asked to describe up to three emotions they experienced when having difficult healthcare conversations; subsequent questions used Likert-scales to measure frequency of each emotion, and whether care was affected. 152 participants completed questionnaires, including physicians, nurses, and psychosocial professionals. Most commonly reported emotions were anxiety, sadness, empathy, frustration, and insecurity. There were significant differences in how clinicians perceived these different emotions affecting care. Empathy and anxiety were emotions perceived to influence care more than sadness, frustration, and insecurity. Most clinicians, regardless of clinical experience and discipline, find their emotional state influences the quality of their care delivery. Most clinicians rate themselves as somewhat to quite capable of recognizing and managing their emotions, acknowledging significant room to grow. Further education designed to increase clinicians' recognition of, reflection on, and management of emotion would likely prove helpful in improving their ability to navigate difficult healthcare conversations. Interventions aimed at anxiety management are particularly needed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  1. Mere social categorization modulates identification of facial expressions of emotion.

    Science.gov (United States)

    Young, Steven G; Hugenberg, Kurt

    2010-12-01

    The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on this literature and extends this work. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion-identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  2. A new method for face detection in colour images for emotional bio-robots

    Institute of Scientific and Technical Information of China (English)

    Hapeshi, Kevin

    2010-01-01

    Emotional bio-robots have become a hot research topic in the last two decades. Though some progress has been made in the research, design and development of various emotional bio-robots, few of them can be used in practical applications. The study of emotional bio-robots demands multi-disciplinary co-operation, involving computer science, artificial intelligence, 3D computation, engineering system modelling, analysis and simulation, bionics engineering, automatic control, image processing and pattern recognition. Among these, face detection belongs to image processing and pattern recognition. An emotional robot must have the ability to recognize various objects; in particular, it is very important for a bio-robot to be able to recognize human faces in an image. In this paper, a face detection method is proposed for identifying any human faces in colour images using a human skin model and an eye detection method. First, the method detects skin regions in the input colour image after normalizing its luminance. Then, all face candidates are identified using an eye detection method. Compared with existing algorithms, this method relies only on the colour and geometrical data of the human face rather than on training datasets. Experimental results show that the method is effective and fast, and that it can be applied to the development of an emotional bio-robot with further improvements in speed and accuracy.
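
    The exact skin model and eye detector are not given in this record, so the sketch below stands in with a common YCrCb skin-colour range and OpenCV's Haar eye cascade to show the overall pipeline the abstract describes: normalize luminance, threshold skin-like regions, and keep only those candidate regions in which eyes are found. The thresholds and minimum region size are assumptions.

```python
# Rough sketch of a skin-colour-plus-eyes face-candidate detector in the spirit
# of the method described above; not the authors' implementation.
import cv2
import numpy as np

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_face_candidates(bgr_image):
    # Normalise luminance, then threshold skin-like pixels in YCrCb space.
    y, cr, cb = cv2.split(cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb))
    y = cv2.equalizeHist(y)
    skin_mask = cv2.inRange(cv2.merge([y, cr, cb]), (0, 133, 77), (255, 173, 127))
    skin_mask = cv2.morphologyEx(skin_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Keep skin-coloured regions that also contain at least one detected eye.
    faces = []
    contours, _ = cv2.findContours(skin_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x
    for contour in contours:
        x, y0, w, h = cv2.boundingRect(contour)
        if w * h < 1500:  # discard small skin blobs (assumed size cut-off)
            continue
        region = cv2.cvtColor(bgr_image[y0:y0 + h, x:x + w], cv2.COLOR_BGR2GRAY)
        eyes = eye_cascade.detectMultiScale(region, scaleFactor=1.1, minNeighbors=5)
        if len(eyes) >= 1:
            faces.append((x, y0, w, h))
    return faces

# Example usage (assumes a file "frame.jpg" exists):
# print(detect_face_candidates(cv2.imread("frame.jpg")))
```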

  3. Emotional language processing in autism spectrum disorders: a systematic review

    Science.gov (United States)

    Lartseva, Alina; Dijkstra, Ton; Buitelaar, Jan K.

    2015-01-01

    In his first description of Autism Spectrum Disorders (ASD), Kanner emphasized emotional impairments by characterizing children with ASD as indifferent to other people, self-absorbed, emotionally cold, distanced, and retracted. Thereafter, emotional impairments became regarded as part of the social impairments of ASD, and research mostly focused on understanding how individuals with ASD recognize visual expressions of emotions from faces and body postures. However, it still remains unclear how emotions are processed outside of the visual domain. This systematic review aims to fill this gap by focusing on impairments of emotional language processing in ASD. We systematically searched PubMed for papers published between 1990 and 2013 using standardized search terms. Studies show that people with ASD are able to correctly classify emotional language stimuli as emotionally positive or negative. However, processing of emotional language stimuli in ASD is associated with atypical patterns of attention and memory performance, as well as abnormal physiological and neural activity. Particularly, younger children with ASD have difficulties in acquiring and developing emotional concepts, and avoid using these in discourse. These emotional language impairments were not consistently associated with age, IQ, or level of development of language skills. We discuss how emotional language impairments fit with existing cognitive theories of ASD, such as central coherence, executive dysfunction, and weak Theory of Mind. We conclude that emotional impairments in ASD may be broader than just a mere consequence of social impairments, and should receive more attention in future research. PMID:25610383

  4. Emotional language processing in autism spectrum disorders: a systematic review.

    Science.gov (United States)

    Lartseva, Alina; Dijkstra, Ton; Buitelaar, Jan K

    2014-01-01

    In his first description of Autism Spectrum Disorders (ASD), Kanner emphasized emotional impairments by characterizing children with ASD as indifferent to other people, self-absorbed, emotionally cold, distanced, and retracted. Thereafter, emotional impairments became regarded as part of the social impairments of ASD, and research mostly focused on understanding how individuals with ASD recognize visual expressions of emotions from faces and body postures. However, it still remains unclear how emotions are processed outside of the visual domain. This systematic review aims to fill this gap by focusing on impairments of emotional language processing in ASD. We systematically searched PubMed for papers published between 1990 and 2013 using standardized search terms. Studies show that people with ASD are able to correctly classify emotional language stimuli as emotionally positive or negative. However, processing of emotional language stimuli in ASD is associated with atypical patterns of attention and memory performance, as well as abnormal physiological and neural activity. Particularly, younger children with ASD have difficulties in acquiring and developing emotional concepts, and avoid using these in discourse. These emotional language impairments were not consistently associated with age, IQ, or level of development of language skills. We discuss how emotional language impairments fit with existing cognitive theories of ASD, such as central coherence, executive dysfunction, and weak Theory of Mind. We conclude that emotional impairments in ASD may be broader than just a mere consequence of social impairments, and should receive more attention in future research.

  5. Emotional language processing in Autism Spectrum Disorders: A systematic review

    Directory of Open Access Journals (Sweden)

    Alina Lartseva

    2015-01-01

    Full Text Available In his first description of Autism Spectrum Disorders (ASD), Kanner emphasized emotional impairments by characterizing children with ASD as indifferent to other people, self-absorbed, emotionally cold, distanced, and retracted. Thereafter, emotional impairments became regarded as part of the social impairments of ASD, and research mostly focused on understanding how individuals with ASD recognize visual expressions of emotions from faces and body postures. However, it still remains unclear how emotions are processed outside of the visual domain. This systematic review aims to fill this gap by focusing on impairments of emotional language processing in ASD. We systematically searched PubMed for papers published between 1990 and 2013 using standardized search terms. Studies show that people with ASD are able to correctly classify emotional language stimuli as emotionally positive or negative. However, processing of emotional language stimuli in ASD is associated with atypical patterns of attention and memory performance, as well as abnormal physiological and neural activity. Particularly, younger children with ASD have difficulties in acquiring and developing emotional concepts, and avoid using these in discourse. These emotional language impairments were not consistently associated with age, IQ, or level of development of language skills. We discuss how emotional language impairments fit with existing cognitive theories of ASD, such as central coherence, executive dysfunction, and weak Theory of Mind. We conclude that emotional impairments in ASD may be broader than just a mere consequence of social impairments, and should receive more attention in future research.

  6. Non-critical pure spinor superstrings

    International Nuclear Information System (INIS)

    Adam, Ido; Grassi, Pietro Antonio; Mazzucato, Luca; Oz, Yaron; Yankielowicz, Shimon

    2007-01-01

    We construct non-critical pure spinor superstrings in two, four and six dimensions. We find explicitly the map between the RNS variables and the pure spinor ones in the linear dilaton background. The RNS variables map onto a patch of the pure spinor space and the holomorphic top form on the pure spinor space is an essential ingredient of the mapping. A basic feature of the map is the requirement of doubling the superspace, which we analyze in detail. We study the structure of the non-critical pure spinor space, which is different from the ten-dimensional one, and its quantum anomalies. We compute the pure spinor lowest lying BRST cohomology and find an agreement with the RNS spectra. The analysis is generalized to curved backgrounds and we construct as an example the non-critical pure spinor type IIA superstring on AdS4 with RR 4-form flux.

  7. Emotion Recognition as a Real Strength in Williams Syndrome: Evidence From a Dynamic Non-verbal Task

    Directory of Open Access Journals (Sweden)

    Laure Ibernon

    2018-04-01

    Full Text Available The hypersocial profile characterizing individuals with Williams syndrome (WS), and particularly their attraction to human faces and their desire to form relationships with other people, could favor the development of their emotion recognition capacities. This study seeks to better understand the development of emotion recognition capacities in WS. The ability to recognize six emotions was assessed in 15 participants with WS. Their performance was compared to that of 15 participants with Down syndrome (DS) and 15 typically developing (TD) children of the same non-verbal developmental age, as assessed with Raven's Colored Progressive Matrices (RCPM; Raven et al., 1998). The analysis of the three groups' results revealed that the participants with WS performed better than the participants with DS and also than the TD children. Individuals with WS performed at a similar level to TD participants in terms of recognizing different types of emotions. The study of development trajectories confirmed that the participants with WS presented the same development profile as the TD participants. These results seem to indicate that the recognition of emotional facial expressions constitutes a real strength in people with WS.

  8. A small-world network model of facial emotion recognition.

    Science.gov (United States)

    Takehara, Takuma; Ochiai, Fumio; Suzuki, Naoto

    2016-01-01

    Various models have been proposed to increase understanding of the cognitive basis of facial emotions. Despite those efforts, interactions between facial emotions have received minimal attention. If collective behaviours relating to each facial emotion in the comprehensive cognitive system could be assumed, specific facial emotion relationship patterns might emerge. In this study, we demonstrate that the frameworks of complex networks can effectively capture those patterns. We generate 81 facial emotion images (6 prototypes and 75 morphs) and then ask participants to rate degrees of similarity in 3240 facial emotion pairs in a paired comparison task. A facial emotion network constructed on the basis of similarity clearly forms a small-world network, which features an extremely short average network distance and close connectivity. Further, even if two facial emotions have opposing valences, they are connected within only two steps. In addition, we show that intermediary morphs are crucial for maintaining full network integration, whereas prototypes are not at all important. These results suggest the existence of collective behaviours in the cognitive systems of facial emotions and also describe why people can efficiently recognize facial emotions in terms of information transmission and propagation. For comparison, we construct three simulated networks--one based on the categorical model, one based on the dimensional model, and one random network. The results reveal that small-world connectivity in facial emotion networks is apparently different from those networks, suggesting that a small-world network is the most suitable model for capturing the cognitive basis of facial emotions.
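
    A minimal sketch of the network construction described above is given below, assuming a symmetric stimulus-by-stimulus similarity matrix: stimuli are connected whenever their similarity exceeds a threshold, and the resulting graph's average shortest path length and clustering coefficient are compared against a random graph of equal size and density. The similarity values here are random placeholders rather than participants' ratings, and the threshold is arbitrary.

```python
# Sketch of a similarity-thresholded stimulus network and its small-world
# indices, compared with a size- and density-matched random graph.
import networkx as nx
import numpy as np

rng = np.random.default_rng(1)
n_stimuli = 81                                  # e.g. 6 prototypes + 75 morphs
sim = rng.uniform(0, 1, (n_stimuli, n_stimuli))
sim = (sim + sim.T) / 2                         # make the matrix symmetric

threshold = 0.7                                 # arbitrary cut-off for an edge
G = nx.Graph()
G.add_nodes_from(range(n_stimuli))
for i in range(n_stimuli):
    for j in range(i + 1, n_stimuli):
        if sim[i, j] >= threshold:
            G.add_edge(i, j)

def small_world_stats(graph):
    # Average path length is only defined on a connected graph; fall back to
    # the largest connected component if necessary.
    if not nx.is_connected(graph):
        graph = graph.subgraph(max(nx.connected_components(graph), key=len))
    return nx.average_shortest_path_length(graph), nx.average_clustering(graph)

random_ref = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=1)
print("similarity graph (path length, clustering):", small_world_stats(G))
print("random reference (path length, clustering):", small_world_stats(random_ref))
```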

  9. Facial emotion recognition in patients with focal and diffuse axonal injury.

    Science.gov (United States)

    Yassin, Walid; Callahan, Brandy L; Ubukata, Shiho; Sugihara, Genichi; Murai, Toshiya; Ueda, Keita

    2017-01-01

    Facial emotion recognition impairment has been well documented in patients with traumatic brain injury. Studies exploring the neural substrates involved in such deficits have implicated specific grey matter structures (e.g. orbitofrontal regions), as well as diffuse white matter damage. Our study aims to clarify whether different types of injuries (i.e. focal vs. diffuse) will lead to different types of impairments on facial emotion recognition tasks, as no study has directly compared these patients. The present study examined performance and response patterns on a facial emotion recognition task in 14 participants with diffuse axonal injury (DAI), 14 with focal injury (FI) and 22 healthy controls. We found that, overall, participants with FI and DAI performed more poorly than controls on the facial emotion recognition task. Further, we observed comparable emotion recognition performance in participants with FI and DAI, despite differences in the nature and distribution of their lesions. However, the rating response pattern between the patient groups was different. This is the first study to show that pure DAI, without gross focal lesions, can independently lead to facial emotion recognition deficits and that rating patterns differ depending on the type and location of trauma.

  10. Emotional Component in Teaching and Learning

    Science.gov (United States)

    Ponnambalam, Michael

    2018-02-01

    The laws of physics are often seen as objective truth, pure and simple. Hence, they tend to appear cerebral and cold. However, their presentation is necessarily subjective and may vary from being boring to being exciting. A detailed analysis of physics education reform efforts over the last three decades finds that interactive instruction results in greater learning gains than the traditional lecture format. In interactive engagement, the emotional component plays a far greater role than acknowledged by many. As an experienced physics teacher [(i) Four decades of teaching and research in four continents (teaching all courses to undergraduate physics majors and algebra-based physics to high school seniors as well as college freshmen), (ii) 11 years of volunteer work in Physics Popularization in six countries to many thousands of students in elementary, middle, and high schools as well as colleges and universities, and (iii) eight years as a Master Teacher and mentor], I feel that the emotional component in teaching and learning physics has been neglected. This paper presents the role of the emotional component in transforming ordinary teaching and learning of physics into an enjoyable and exciting experience for students as well as teachers.

  11. Emotion differentiation and intensity during acute tobacco abstinence: A comparison of heavy and light smokers.

    Science.gov (United States)

    Sheets, Erin S; Bujarski, Spencer; Leventhal, Adam M; Ray, Lara A

    2015-08-01

    The ability to recognize and label discrete emotions, termed emotion differentiation, is particularly pertinent to overall emotion regulation abilities. Patterns of deficient emotion differentiation have been associated with mood and anxiety disorders but have yet to be examined in relation to nicotine dependence. This study employed ecological momentary assessment to examine smokers' subjective experience of discrete emotions during 24-h of forced tobacco abstinence. Thirty daily smokers rated their emotions up to 23 times over the 24-hour period, and smoking abstinence was biologically verified. From these data, we computed individual difference measures of emotion differentiation, overall emotion intensity, and emotional variability. As hypothesized, heavy smokers reported poorer negative emotion differentiation than light smokers (d=0.55), along with more intense negative emotion (d=0.97) and greater negative emotion variability (d=0.97). No differences were observed in positive emotion differentiation. Across the sample, poorer negative emotion differentiation was associated with greater endorsement of psychological motives to smoke, including negative and positive reinforcement motives, while positive emotion differentiation was not. Copyright © 2015 Elsevier Ltd. All rights reserved.
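
    One common way to operationalize emotion differentiation in momentary-assessment data is the average pairwise correlation among same-valence emotion ratings across prompts (stronger co-variation implies poorer differentiation). The sketch below illustrates that idea, together with simple intensity and variability summaries, on simulated ratings; it is not necessarily the exact index used by Sheets et al.

```python
# Illustrative computation of a negative-emotion differentiation index from
# simulated momentary ratings. The study may have used a different index
# (e.g., an intraclass correlation); values here are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_prompts = 23                     # momentary assessments over 24 h
negative_items = ["angry", "sad", "anxious", "ashamed"]

# Simulated 1-5 ratings for one smoker across prompts (rows) and items (columns).
ratings = np.clip(np.round(rng.normal(2.5, 1.0, (n_prompts, len(negative_items)))), 1, 5)

def differentiation_index(x):
    """Mean pairwise correlation among emotion items across prompts."""
    corr = np.corrcoef(x, rowvar=False)
    upper = corr[np.triu_indices_from(corr, k=1)]
    return float(np.nanmean(upper))

print("negative emotion differentiation (lower = better):",
      differentiation_index(ratings))
print("mean negative emotion intensity:", float(ratings.mean()))
print("negative emotion variability (SD of prompt means):",
      float(ratings.mean(axis=1).std()))
```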

  12. Disrupted neural processing of emotional faces in psychopathy.

    Science.gov (United States)

    Contreras-Rodríguez, Oren; Pujol, Jesus; Batalla, Iolanda; Harrison, Ben J; Bosque, Javier; Ibern-Regàs, Immaculada; Hernández-Ribas, Rosa; Soriano-Mas, Carles; Deus, Joan; López-Solà, Marina; Pifarré, Josep; Menchón, José M; Cardoner, Narcís

    2014-04-01

    Psychopaths show a reduced ability to recognize emotional facial expressions, which may disturb the development of interpersonal relationships and successful social adaptation. Behavioral hypotheses point toward an association between emotion recognition deficits in psychopathy and amygdala dysfunction. Our prediction was that amygdala dysfunction would combine deficient activation with disturbances in functional connectivity with cortical regions of the face-processing network. Twenty-two psychopaths and 22 control subjects were assessed, and functional magnetic resonance maps were generated to identify both brain activation and task-induced functional connectivity using psychophysiological interaction analysis during an emotional face-matching task. Results showed significant amygdala activation in control subjects only, but differences between study groups did not reach statistical significance. In contrast, psychopaths showed significantly increased activation in visual and prefrontal areas, with this latter activation being associated with psychopaths' affective-interpersonal disturbances. Psychophysiological interaction analyses revealed a reciprocal reduction in functional connectivity between the left amygdala and visual and prefrontal cortices. Our results suggest that emotional stimulation may evoke a relevant cortical response in psychopaths, but that a disruption in the processing of emotional faces exists involving the reciprocal functional interaction between the amygdala and neocortex, consistent with the notion of a failure to integrate emotion into cognition in psychopathic individuals.

  13. Integrating Cognition and Emotion: Yirat Shamayim and the Taxonomies

    Science.gov (United States)

    Scheindlin, Laurence

    2008-01-01

    Following Bennett Solomon's suggestion of the "integrating individual"--one who possesses the skill and interest to incorporate new knowledge into a larger and unified life-picture--this article explores how recognizing the coupling of the affective and cognitive can influence Jewish education. Emotions help construct our daily perceptions and our…

  14. Emotional geographies of sociospatial exclusion of homeless people in urban Copenhagen

    DEFF Research Database (Denmark)

    Fahnøe, Kristian Relsted

    participant observation of encounters between social workers and homeless people was the primary method. Additionally, interviews were conducted on site with homeless people. During the observed encounters and the interviews, the homeless people's accounts highlighted how emotional experiences were an integral … of avoidance and withdrawal. The analysis links these emotions to the symbolic and material aspects of the spaces. By doing this, the paper aims to show how the lives of the homeless are shaped by a form of socio-spatial exclusion that works through emotions rather than just direct regulation and policing of spaces. … Thus, the paper contends that these emotional dynamics need to be recognized in order to advance our understanding of the lives of the homeless. Such emotional dynamics also need to be taken into account in policy-making processes that aim to assist homeless people, as well as in social work practices…

  15. General and specific responsiveness of the amygdala during explicit emotion recognition in females and males

    Directory of Open Access Journals (Sweden)

    Windischberger Christian

    2009-08-01

    Full Text Available Background: The ability to recognize emotions in facial expressions relies on an extensive neural network with the amygdala as the key node, as has typically been demonstrated for the processing of fearful stimuli. A sufficient characterization of the factors influencing and modulating amygdala function, however, has not yet been reached. Due to lacking or diverging results on its involvement in recognizing all or only certain negative emotions, the influence of gender or ethnicity is still under debate. This high-resolution fMRI study addresses the effects of some of the relevant parameters, such as emotional valence, gender and poser ethnicity, on amygdala activation during facial emotion recognition in 50 Caucasian subjects. Stimuli were color photographs of emotional Caucasian and African American faces. Results: Bilateral amygdala activation was obtained for all emotional expressions (anger, disgust, fear, happy, and sad) and neutral faces across all subjects. However, only in males was a significant correlation of amygdala activation and behavioral response to fearful stimuli observed, indicating higher amygdala responses with better fear recognition and thus pointing to subtle gender differences. No significant influence of poser ethnicity on amygdala activation occurred, but analysis of recognition accuracy revealed a significant, emotion-dependent impact of poser ethnicity. Conclusion: Applying high-resolution fMRI while subjects were performing an explicit emotion recognition task revealed bilateral amygdala activation to all emotions presented and to neutral expressions. This mechanism seems to operate similarly in healthy females and males and for both in-group and out-group ethnicities. Our results support the assumption that an intact amygdala response is fundamental to the processing of these salient stimuli due to its relevance-detecting function.

  16. Emotional Freedom Technique (EFT) Scope and Practice Areas

    Directory of Open Access Journals (Sweden)

    Pinar IRMAK VURAL

    2018-06-01

    Full Text Available Emotional Freedom Technique (EFT) is a form of energy psychotherapy consisting of cognitive and somatic components that are used to improve negative personal emotions and related emotional and physical disorders. When a person is stressed, stress hormones are secreted in the brain and the amygdala and other responsive cerebral regions are activated. If the stress cannot be coped with effectively, the physical and psychological consequences may become chronic over time. There are essential steps to follow in EFT: first, the person creates a setup sentence to send a message to the emotional body (the subconscious), and then twelve meridian energy end points (acupressure points) are tapped on. There are different protocols for different application purposes. EFT can be applied to psychological and physical problems that are very common in children and adults, and no adverse effects have been reported in randomized controlled trials. In this review, PubMed, Google Scholar and related literature sources were examined, and it was found that EFT has been studied in a variety of areas. Emotions can be recognized, accepted and transformed with EFT.

  17. Prosody recognition and audiovisual emotion matching in schizophrenia: the contribution of cognition and psychopathology.

    Science.gov (United States)

    Castagna, Filomena; Montemagni, Cristiana; Maria Milani, Anna; Rocca, Giuseppe; Rocca, Paola; Casacchia, Massimo; Bogetto, Filippo

    2013-02-28

    This study aimed to evaluate the ability to decode emotion in the auditory and audiovisual modality in a group of patients with schizophrenia, and to explore the role of cognition and psychopathology in affecting these emotion recognition abilities. Ninety-four outpatients in a stable phase and 51 healthy subjects were recruited. Patients were assessed through a psychiatric evaluation and a wide neuropsychological battery. All subjects completed the comprehensive affect testing system (CATS), a group of computerized tests designed to evaluate emotion perception abilities. With respect to the controls, patients were not impaired in the CATS tasks involving discrimination of nonemotional prosody, naming of emotional stimuli expressed by voice and judging the emotional content of a sentence, whereas they showed a specific impairment in decoding emotion in a conflicting auditory condition and in the multichannel modality. Prosody impairment was affected by executive functions, attention and negative symptoms, while deficit in multisensory emotion recognition was affected by executive functions and negative symptoms. These emotion recognition deficits, rather than being associated purely with emotion perception disturbances in schizophrenia, are affected by core symptoms of the illness. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  18. Preserved re-experience of discrete emotions: Amnesia and executive function.

    Science.gov (United States)

    Stanciu, Marian Andrei; Rafal, Robert D; Turnbull, Oliver H

    2018-02-07

    Amnesic patients can re-experience emotions elicited by forgotten events, suggesting that brain systems for episodic and emotional memory are independent. However, the range of such emotional memories remains under-investigated (most studies employing just positive-negative emotion dyads), and executive function may also play a role in the re-experience of emotions. This is the first investigation of the intensity of the emotional re-experience of a range of discrete emotions (anger, fear, sadness, and happiness) for a group of amnesic patients. Twenty Korsakoff syndrome (KS) patients and 20 neurologically normal controls listened to four novel emotional vignettes selectively eliciting the four basic emotions. Emotional experience was measured using pen-and-paper Visual Analogue Mood Scales and episodic memory using verbal recollections. After 30 min, the recollection of stories was severely impaired for the patient group, but the emotional re-experience was no different from that of controls. Notably, there was no relationship between episodic recall and the intensity of the four emotions, such that even profoundly amnesic patients reported moderate levels of the target emotion. Exploratory analyses revealed negative correlations between the intensity of basic emotions and executive functions (e.g., cognitive flexibility and response inhibition) for controls but not patients. The results suggest that discrete emotions can be re-experienced independently of episodic memory, and that the re-experience of certain discrete emotions appears to be dampened by executive control. KS patients with absent or mild cognitive symptoms should benefit from emotion-regulation interventions aimed at reducing the recognized affective burden associated with their episodic memory deficit. © 2018 The British Psychological Society.

  19. Emotions Are Rising: The Growing Field of Affect Neuropsychology.

    Science.gov (United States)

    McDonald, Skye

    2017-10-01

    Thirty years ago, the neuropsychology of emotion started to emerge as a mainstream topic. Careful examination of individual patients showed that emotion, like memory, language, and so on, could be differentially affected by brain disorders, especially in the right hemisphere. Since then, there has been accelerating interest in uncovering the neural architecture of emotion, and the major steps in this process of discovery over the past 3 decades are detailed in this review. In the 1990s, magnetic resonance imaging (MRI) scans provided precise delineation of lesions in the amygdala, medial prefrontal cortex, insula and somatosensory cortex as underpinning emotion disorders. At the same time, functional MRI revealed activation that was bilateral and also lateralized according to task demands. In the 2000s, converging evidence suggested at least two routes to emotional responses: subcortical, automatic and autonomic responses and slower, cortical responses mediating cognitive processing. The discovery of mirror neurons in the 1990s reinvigorated older views that simulation was the means to recognize emotions and empathize with others. More recently, psychophysiological research, revisiting older Russian paradigms, has contributed new insights into how autonomic and other physiological indices contribute to decision making (the somatic marker theory), emotional simulation, and social cognition. Finally, this review considers the extent to which these seismic changes in understanding emotional processes in clinical disorders have been reflected in neuropsychological practice. (JINS, 2017, 23, 719-731).

  20. Emotion's influence on memory for spatial and temporal context.

    Science.gov (United States)

    Schmidt, Katherine; Patnaik, Pooja; Kensinger, Elizabeth A

    2011-02-01

    Individuals report remembering emotional items vividly. It is debated whether this report reflects enhanced memory accuracy or a bias to believe emotional memories are vivid. We hypothesized emotion would enhance memory accuracy, improving memory for contextual details. The hallmark of episodic memory is that items are remembered in a spatial and temporal context, so we examined whether an item's valence (positive, negative) or arousal (high, low) would influence its ability to be remembered with those contextual details. Across two experiments, high-arousal items were remembered with spatial and temporal context more often than low-arousal items. Item valence did not influence memory for those details, although positive high-arousal items were recognized or recalled more often than negative items. These data suggest that emotion does not just bias participants to believe they have a vivid memory; rather, the arousal elicited by an event can benefit memory for some types of contextual details. © 2010 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business

  1. Gender and the capacity to identify facial emotional expressions

    Directory of Open Access Journals (Sweden)

    Carolina Baptista Menezes

    Full Text Available Recognizing emotional expressions rests on a fundamental sociocognitive mechanism of human nature. This study compared 114 women and 104 men on the identification of basic emotions in a recognition task that used faces culturally adapted and validated for the Brazilian context. It also investigated whether gender differences in emotion recognition vary with exposure time. Women were generally better at detecting facial expressions, and an interaction suggested that this female advantage was particularly evident for anger, disgust, and surprise; results did not change with age or exposure time. Regardless of sex, total accuracy improved as presentation time increased, although only fear and anger differed significantly between presentation times. Hence, in addition to supporting the evolutionary hypothesis of a female advantage in detecting facial expressions of emotion, these findings show that recognition of facial expressions also depends on the time available to identify an expression correctly.

  2. Empathy in adolescence: Relations with emotion awareness and social roles.

    Science.gov (United States)

    Rieffe, Carolien; Camodeca, Marina

    2016-09-01

    In this study, we aimed to gain a better understanding of the individual differences contributing to feelings of empathy in adolescents. We therefore examined the extent to which emotion awareness (e.g., recognizing and appreciating one's own emotions and those of others) and a tendency for certain social roles (e.g., helping or teasing peers when someone is being bullied) are related to adolescents' levels of empathy. The sample comprised 182 adolescents aged between 11 and 16. Empathy and emotion awareness were assessed using self-report measures. Peer reports were used to indicate adolescents' different social roles: bullying, defending the victim, and outsider behaviour. Outcomes demonstrated that evaluating one's own emotions and those of others, and receiving more defending nominations, were associated with both affective and cognitive empathy, whereas aspects of emotion awareness that are linked with internalizing symptoms were related to empathic distress, suggesting maladaptive emotion appraisal. Furthermore, outsider behaviour was associated with empathic distress, emphasizing a self-focused orientation. In contrast, more bullying was negatively associated with cognitive empathy. Overall, these outcomes demonstrate that, besides social roles, emotion awareness is an important factor for adaptive empathic reactions, whereas emotion dysregulation might cause distress when witnessing the negative feelings of others. © 2016 The British Psychological Society.

  3. A STUDY ON THE LINKAGE BETWEEN EMOTIONAL INTELLIGENCE AND JOB PERFORMANCE OF SOFTWARE PROFESSIONALS IN TAMILNADU

    OpenAIRE

    B. Rajkumar

    2018-01-01

    Emotional intelligence (EI), a recent construct that predicts various performance and leadership traits, helps companies deploy a quality workforce. EI has emerged as a theme of widespread interest in psychological research in recent years and affects everyone's day-to-day life. EI is the ability to recognize our own potential as well as to manage situations as they arise. In the workplace, emotions are mainly based on two perspectives, namely sociological and psych...

  4. Cultural relativity in perceiving emotion from vocalizations.

    Science.gov (United States)

    Gendron, Maria; Roberson, Debi; van der Vyver, Jacoba Marieta; Barrett, Lisa Feldman

    2014-04-01

    A central question in the study of human behavior is whether certain emotions, such as anger, fear, and sadness, are recognized in nonverbal cues across cultures. We predicted and found that in a concept-free experimental task, participants from an isolated cultural context (the Himba ethnic group from northwestern Namibia) did not freely label Western vocalizations with expected emotion terms. Responses indicate that Himba participants perceived more basic affective properties of valence (positivity or negativity) and to some extent arousal (high or low activation). In a second, concept-embedded task, we manipulated whether the target and foil on a given trial matched in both valence and arousal, neither valence nor arousal, valence only, or arousal only. Himba participants achieved above-chance accuracy only when foils differed from targets in valence only. Our results indicate that the voice can reliably convey affective meaning across cultures, but that perceptions of emotion from the voice are culturally variable.

  5. Pure Erythroleukemia (Variant Acute Myeloid Leukemia-vAML-M6) with Deletion of Chromosome 20, Mainly Presenting as Late Erythroblasts, a Unique Case Report with Review of Literature.

    Science.gov (United States)

    Rasool, Javid; Geelani, Sajad; Khursheed; Yasir; Lone, Mohd Suhail; Shaban, Mohd

    2014-03-01

    Acute erythroleukemia is characterized by a predominant immature erythroid population and accounts for approximately 2-5 % of all cases of acute leukemia. Two subtypes are recognized based on the presence or absence of a significant myeloid component: erythroleukemia and pure erythroid leukemia. Erythroleukemia is predominantly a disease of adults, while pure erythroid leukemia can be seen at any age, including childhood. We report a case of pure erythroleukemia presenting mainly as late erythroblasts, which was diagnosed on bone marrow examination and cytochemistry and confirmed on immunophenotyping. This is possibly the only case so far demonstrating deletion of the long arm of chromosome 20 in pure erythroleukemia.

  6. Perception of emotional facial expressions in individuals with high Autism-spectrum Quotient (AQ

    Directory of Open Access Journals (Sweden)

    Ervin Poljac

    2012-10-01

    Full Text Available Autism is characterized by difficulties in social interaction, communication, restrictive and repetitive behaviours and specific impairments in emotional processing. The present study employed the Autism Spectrum Quotient (Baron-Cohen et al. 2006) to quantify autistic traits in a group of 260 healthy individuals and to investigate whether this measure is related to the perception of facial emotional expressions. The emotional processing of twelve participants who scored significantly higher than average on the AQ was compared with that of twelve participants with significantly lower AQ scores. Perception of emotional expressions was estimated with the Facial Recognition Task (Montagne et al. 2007). There were significant differences between the two groups with regard to accuracy and sensitivity of the perception of emotional facial expressions. Specifically, the group with high AQ scores was less accurate and needed higher emotional content to recognize the emotions of anger, disgust, happiness and sadness. This result implies a selective impairment that might be helpful in understanding the psychopathology of autism spectrum disorders.

  7. The role of emotion in clinical decision making: an integrative literature review.

    Science.gov (United States)

    Kozlowski, Desirée; Hutchinson, Marie; Hurley, John; Rowley, Joanne; Sutherland, Joanna

    2017-12-15

    Traditionally, clinical decision making has been perceived as a purely rational and cognitive process. Recently, a number of authors have linked emotional intelligence (EI) to clinical decision making (CDM) and calls have been made for an increased focus on EI skills for clinicians. The objective of this integrative literature review was to identify and synthesise the empirical evidence for a role of emotion in CDM. A systematic search of the bibliographic databases PubMed, PsychINFO, and CINAHL (EBSCO) was conducted to identify empirical studies of clinician populations. Search terms were focused to identify studies reporting clinician emotion OR clinician emotional intelligence OR emotional competence AND clinical decision making OR clinical reasoning. Twenty-three papers were retained for synthesis. These represented empirical work from qualitative, quantitative, and mixed-methods approaches and comprised work with a focus on experienced emotion and on skills associated with emotional intelligence. The studies examined nurses (10), physicians (7), occupational therapists (1), physiotherapists (1), mixed clinician samples (3), and unspecified infectious disease experts (1). We identified two main themes in the context of clinical decision making: the subjective experience of emotion; and, the application of emotion and cognition in CDM. Sub-themes under the subjective experience of emotion were: emotional response to contextual pressures; emotional responses to others; and, intentional exclusion of emotion from CDM. Under the application of emotion and cognition in CDM, sub-themes were: compassionate emotional labour - responsiveness to patient emotion within CDM; interdisciplinary tension regarding the significance and meaning of emotion in CDM; and, emotion and moral judgement. Clinicians' experienced emotions can and do affect clinical decision making, although acknowledgement of that is far from universal. Importantly, this occurs in the absence of a

  8. Building emotional intelligence: a strategy for emerging nurse leaders to reduce workplace bullying.

    Science.gov (United States)

    Bennett, Karen; Sawatzky, Jo-Ann V

    2013-01-01

    Bullying is one of the most concerning forms of aggression in health care organizations. Conceptualized as an emotion-based response, bullying is often triggered by today's workplace challenges. Unfortunately, workplace bullying is an escalating problem in nursing. Bullying contributes to unhealthy and toxic environments, which in turn contribute to ineffective patient care, increased stress, and decreased job satisfaction among health care providers. These equate to a poor workforce environment, which in turn increases hospital costs when nurses choose to leave. Nurse managers are in positions of power to recognize and address negative workplace behaviors, such as bullying. However, emerging leaders in particular may not be equipped with the tools to deal with bullying and consequently may choose to overlook it. Substantive evidence from other disciplines supports the contention that individuals with greater emotional intelligence are better equipped to recognize early signs of negative behavior, such as bullying. Therefore, fostering emotional intelligence in emerging nurse leaders may lead to less bullying and more positive workplace environments for nurses in the future.

  9. Social appraisal influences recognition of emotions.

    Science.gov (United States)

    Mumenthaler, Christian; Sander, David

    2012-06-01

    The notion of social appraisal emphasizes the importance of a social dimension in appraisal theories of emotion by proposing that the way an individual appraises an event is influenced by the way other individuals appraise and feel about the same event. This study directly tested this proposal by asking participants to recognize dynamic facial expressions of emotion (fear, happiness, or anger in Experiment 1; fear, happiness, anger, or neutral in Experiment 2) in a target face presented at the center of a screen while a contextual face, which appeared simultaneously in the periphery of the screen, expressed an emotion (fear, happiness, anger) or not (neutral) and either looked at the target face or not. We manipulated gaze direction to be able to distinguish between a mere contextual effect (gaze away from both the target face and the participant) and a specific social appraisal effect (gaze toward the target face). Results of both experiments provided evidence for a social appraisal effect in emotion recognition, which differed from the mere effect of contextual information: Whereas facial expressions were identical in both conditions, the direction of the gaze of the contextual face influenced emotion recognition. Social appraisal facilitated the recognition of anger, happiness, and fear when the contextual face expressed the same emotion. This facilitation was stronger than the mere contextual effect. Social appraisal also allowed better recognition of fear when the contextual face expressed anger and better recognition of anger when the contextual face expressed fear. 2012 APA, all rights reserved

  10. Recognition of Facial Expressions of Different Emotional Intensities in Patients with Frontotemporal Lobar Degeneration

    Directory of Open Access Journals (Sweden)

    Roy P. C. Kessels

    2007-01-01

    Full Text Available Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD. Also, FTLD patients show impairments in emotion processing. Specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more difficult to recognize than positive ones, which thus may have been a confounding factor in previous studies. Also, ceiling effects are often present on emotion recognition tasks using full-blown emotional facial expressions. In the present study with FTLD patients, we examined the perception of sadness, anger, fear, happiness, surprise and disgust at different emotional intensities on morphed facial expressions to take task difficulty into account. Results showed that our FTLD patients were specifically impaired at the recognition of the emotion anger. Also, the patients performed worse than the controls on recognition of surprise, but performed at control levels on disgust, happiness, sadness and fear. These findings corroborate and extend previous results showing deficits in emotion perception in FTLD.

  11. Putting emotions in routes: the influence of emotionally laden landmarks on spatial memory.

    Science.gov (United States)

    Ruotolo, F; Claessen, M H G; van der Ham, I J M

    2018-04-16

    The aim of this study was to assess how people memorize spatial information of emotionally laden landmarks along a route and if the emotional value of the landmarks affects the way metric and configurational properties of the route itself are represented. Three groups of participants were asked to watch a movie of a virtual walk along a route. The route could contain positive, negative, or neutral landmarks. Afterwards, participants were asked to: (a) recognize the landmarks; (b) imagine to walk distances between landmarks; (c) indicate the position of the landmarks along the route; (d) judge the length of the route; (e) draw the route. Results showed that participants who watched the route with positive landmarks were more accurate in locating the landmarks along the route and drawing the route. On the other hand, participants in the negative condition judged the route as longer than participants in the other two conditions and were less accurate in mentally reproducing distances between landmarks. The data will be interpreted in the light of the "feelings-as-information theory" by Schwarz (2010) and the most recent evidence about the effect of emotions on spatial memory. In brief, the evidence collected in this study supports the idea that spatial cognition emerges from the interaction between an organism and contextual characteristics.

  12. Scale of Academic Emotion in Science Education: Development and Validation

    Science.gov (United States)

    Chiang, Wen-Wei; Liu, Chia-Ju

    2014-01-01

    Contemporary research into science education has generally been conducted from the perspective of "conceptual change" in learning. This study sought to extend previous work by recognizing that human rationality can be influenced by the emotions generated by the learning environment and specific actions related to learning. Methods used…

  13. EMOTIONAL INTELLIGENCE: THE PERSPECTIVE OF DANIEL GOLEMAN AND ITS RELEVANCE IN ISLAMIC EDUCATION

    Directory of Open Access Journals (Sweden)

    Ivan Riyadi

    2016-01-01

    Full Text Available This article is based on the consideration that emotional intelligence remains indispensable in shaping the behavior of students, with a specific focus on the subjects of Islamic education. It relates the emotional intelligence of high school students to Islamic education, examining how Islamic religious education policies have been implemented in high school and how the emotional intelligence of high school students relates to the teachings of Islam. The article follows a library research approach: data were collected through a literature study that involved reviewing and citing theories and concepts from books, journals, magazines and other sources. The findings can be applied to educating emotionally intelligent children, with the ability to recognize and manage their own emotions, use emotions productively, empathize, and build social relationships.

  14. Emotional voice processing: investigating the role of genetic variation in the serotonin transporter across development.

    Directory of Open Access Journals (Sweden)

    Tobias Grossmann

    Full Text Available The ability to respond effectively to emotional information carried in the human voice plays a pivotal role in social interactions. We examined how genetic factors, especially the serotonin transporter genetic variation (5-HTTLPR), affect the neurodynamics of emotional voice processing in infants and adults by measuring event-related brain potentials (ERPs). The results revealed that infants distinguish between emotions during an early perceptual processing stage, whereas adults recognize and evaluate the meaning of emotions during later semantic processing stages. While infants do discriminate between emotions, only in adults was genetic variation associated with neurophysiological differences in how positive and negative emotions are processed in the brain. This suggests that genetic association with neurocognitive functions emerges during development, emphasizing the role that variation in serotonin plays in the maturation of brain systems involved in emotion recognition.

  15. Predictable chaos: a review of the effects of emotions on attention, memory and decision making.

    Science.gov (United States)

    LeBlanc, Vicki R; McConnell, Meghan M; Monteiro, Sandra D

    2015-03-01

    Healthcare practice and education are highly emotional endeavors. While this is recognized by educators and researchers seeking to develop interventions aimed at improving wellness in health professionals and at providing them with skills to deal with emotional interpersonal situations, the field of health professions education has largely ignored the role that emotions play on cognitive processes. The purpose of this review is to provide an introduction to the broader field of emotions, with the goal of better understanding the integral relationship between emotions and cognitive processes. Individuals, at any given time, are in an emotional state. This emotional state influences how they perceive the world around them, what they recall from it, as well as the decisions they make. Rather than treating emotions as undesirable forces that wreak havoc on the rational being, the field of health professions education could be enriched by a greater understanding of how these emotions can shape cognitive processes in increasingly predictable ways.

  16. Arousal Rather than Basic Emotions Influence Long-Term Recognition Memory in Humans.

    Science.gov (United States)

    Marchewka, Artur; Wypych, Marek; Moslehi, Abnoos; Riegel, Monika; Michałowski, Jarosław M; Jednoróg, Katarzyna

    2016-01-01

    Emotion can influence various cognitive processes, however its impact on memory has been traditionally studied over relatively short retention periods and in line with dimensional models of affect. The present study aimed to investigate emotional effects on long-term recognition memory according to a combined framework of affective dimensions and basic emotions. Images selected from the Nencki Affective Picture System were rated on the scale of affective dimensions and basic emotions. After 6 months, subjects took part in a surprise recognition test during an fMRI session. The more negative the pictures the better they were remembered, but also the more false recognitions they provoked. Similar effects were found for the arousal dimension. Recognition success was greater for pictures with lower intensity of happiness and with higher intensity of surprise, sadness, fear, and disgust. Consecutive fMRI analyses showed a significant activation for remembered (recognized) vs. forgotten (not recognized) images in anterior cingulate and bilateral anterior insula as well as in bilateral caudate nuclei and right thalamus. Further, arousal was found to be the only subjective rating significantly modulating brain activation. Higher subjective arousal evoked higher activation associated with memory recognition in the right caudate and the left cingulate gyrus. Notably, no significant modulation was observed for other subjective ratings, including basic emotion intensities. These results emphasize the crucial role of arousal for long-term recognition memory and support the hypothesis that the memorized material, over time, becomes stored in a distributed cortical network including the core salience network and basal ganglia.

  17. Arousal rather than basic emotions influence long-term recognition memory in humans.

    Directory of Open Access Journals (Sweden)

    Artur Marchewka

    2016-10-01

    Full Text Available Emotion can influence various cognitive processes, however its impact on memory has been traditionally studied over relatively short retention periods and in line with dimensional models of affect. The present study aimed to investigate emotional effects on long-term recognition memory according to a combined framework of affective dimensions and basic emotions. Images selected from the Nencki Affective Picture System were rated on the scale of affective dimensions and basic emotions. After six months, subjects took part in a surprise recognition test during an fMRI session. The more negative the pictures the better they were remembered, but also the more false recognitions they provoked. Similar effects were found for the arousal dimension. Recognition success was greater for pictures with lower intensity of happiness and with higher intensity of surprise, sadness, fear, and disgust. Consecutive fMRI analyses showed a significant activation for remembered (recognized) vs. forgotten (not recognized) images in anterior cingulate and bilateral anterior insula as well as in bilateral caudate nuclei and right thalamus. Further, arousal was found to be the only subjective rating significantly modulating brain activation. Higher subjective arousal evoked higher activation associated with memory recognition in the right caudate and the left cingulate gyrus. Notably, no significant modulation was observed for other subjective ratings, including basic emotion intensities. These results emphasize the crucial role of arousal for long-term recognition memory and support the hypothesis that the memorized material, over time, becomes stored in a distributed cortical network including the core salience network and basal ganglia.

  18. Evaluating the Emotional State of a User Using a Webcam

    Directory of Open Access Journals (Sweden)

    Martin Magdin

    2016-09-01

    Full Text Available In online learning it is more difficult for teachers to see how individual students behave. Students' emotional states such as self-esteem, motivation and commitment, which are believed to be determinants of student performance, cannot be ignored, since affective states, like learning styles, are known to greatly influence learning. The ability of a computer to evaluate the emotional state of its user is receiving growing attention; by evaluating the emotional state, the aim is to overcome the barrier between the human and the non-emotional machine. Real-time emotion recognition in e-learning using webcams has been an active research area over the last decade. Enhancing learning through webcams and microphones offers relevant feedback based on the learner's facial expressions and verbalizations. Most current software does not work in real time: it scans the face and evaluates its features progressively. The software designed here uses neural networks in real time, which makes it applicable to various areas of everyday life and thus able to actively influence their quality. The face emotion recognition software was validated by having several experts annotate the material, and these expert findings were contrasted with the software results. The overall accuracy of the software, comparing the requested emotions with the recognized emotions, is 78%. Online evaluation of emotions is an appropriate technology for enhancing the quality and efficacy of e-learning by taking the learner's emotional states into account.
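
    The abstract does not disclose the network architecture or training data, so the following is only a minimal sketch, under stated assumptions, of the general pipeline it describes: capture webcam frames, detect a face in real time, and hand the face crop to an emotion classifier. It uses OpenCV's bundled Haar cascade for face detection; the classify_emotion stub is a hypothetical placeholder for the trained neural network, not the authors' software.

        import cv2

        # Haar cascade face detector bundled with opencv-python; the emotion
        # classifier itself is a stub, since the abstract does not specify the
        # network or its training data.
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        def classify_emotion(face_roi):
            # Hypothetical placeholder: a real system would pass the resized,
            # normalised face crop to a trained neural network.
            return "neutral"

        cap = cv2.VideoCapture(0)  # default webcam
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            for (x, y, w, h) in cascade.detectMultiScale(gray, 1.3, 5):
                label = classify_emotion(gray[y:y + h, x:x + w])
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
                cv2.putText(frame, label, (x, y - 10),
                            cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
            cv2.imshow("emotion monitor", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
        cap.release()
        cv2.destroyAllWindows()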

  19. Multimodal emotional state recognition using sequence-dependent deep hierarchical features.

    Science.gov (United States)

    Barros, Pablo; Jirak, Doreen; Weber, Cornelius; Wermter, Stefan

    2015-12-01

    Emotional state recognition has become an important topic for human-robot interaction in recent years. By determining emotion expressions, robots can identify important variables of human behavior and use these to communicate in a more human-like fashion, thereby extending the interaction possibilities. Human emotions are multimodal and spontaneous, which makes them hard for robots to recognize. Each modality has its own restrictions and constraints which, together with the non-structured behavior of spontaneous expressions, create several difficulties for the approaches present in the literature, which are based on explicit feature extraction techniques and manual modality fusion. Our model uses a hierarchical feature representation to deal with spontaneous emotions, and learns how to integrate multiple modalities for non-verbal emotion recognition, making it suitable for use in an HRI scenario. Our experiments show that a significant improvement of recognition accuracy is achieved when we use hierarchical features and multimodal information, and our model improves the accuracy of state-of-the-art approaches from the 82.5% reported in the literature to 91.3% on a benchmark dataset of spontaneous emotion expressions. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
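
    The paper's sequence-dependent hierarchical model cannot be reconstructed from the abstract alone; as a rough illustration of the underlying idea of combining modalities before classification, here is a minimal feature-level fusion baseline on made-up face and audio descriptors. All array names, sizes, and the MLP classifier are assumptions for the sketch, not the authors' setup.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import cross_val_score

        # Hypothetical pre-extracted per-clip descriptors; the paper instead
        # learns hierarchical features end-to-end, which is not reproduced here.
        rng = np.random.default_rng(0)
        n_clips = 200
        face_feats = rng.normal(size=(n_clips, 64))    # e.g. pooled visual features
        audio_feats = rng.normal(size=(n_clips, 32))   # e.g. pooled prosodic features
        labels = rng.integers(0, 6, size=n_clips)      # six emotion categories

        # Feature-level (early) fusion: concatenate modalities, then classify.
        fused = np.hstack([face_feats, audio_feats])
        clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
        # Random data, so accuracy stays near chance; real multimodal features
        # would be expected to beat either modality alone.
        print(cross_val_score(clf, fused, labels, cv=5).mean())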

  20. Emotion Knowledge and Attentional Differences in Preschoolers Showing Context-Inappropriate Anger.

    Science.gov (United States)

    Locke, Robin L; Lang, Nichole J

    2016-08-01

    Some children show anger inappropriate for the situation based on the predominant incentives, which is called context-inappropriate anger. Children need to attend to and interpret situational incentives for appropriate emotional responses. We examined associations of context-inappropriate anger with emotion recognition and attention problems in 43 preschoolers (42% male; M age = 55.1 months, SD = 4.1). Parents rated context-inappropriate anger across situations. Teachers rated attention problems using the Child Behavior Checklist-Teacher Report Form. Emotion recognition was ability to recognize emotional faces using the Emotion Matching Test. Anger perception bias was indicated by anger to non-anger situations using an adapted Affect Knowledge Test. 28% of children showed context-inappropriate anger, which correlated with lower emotion recognition (β = -.28) and higher attention problems (β = .36). Higher attention problems correlated with more anger perception bias (β = .32). This cross-sectional, correlational study provides preliminary findings that children with context-inappropriate anger showed more attention problems, which suggests that both "problems" tend to covary and associate with deficits or biases in emotion knowledge. © The Author(s) 2016.

  1. Mapping the impairment in decoding static facial expressions of emotion in prosopagnosia.

    Science.gov (United States)

    Fiset, Daniel; Blais, Caroline; Royer, Jessica; Richoz, Anne-Raphaëlle; Dugas, Gabrielle; Caldara, Roberto

    2017-08-01

    Acquired prosopagnosia is characterized by a deficit in face recognition due to diverse brain lesions, but interestingly most prosopagnosic patients suffering from posterior lesions use the mouth instead of the eyes for face identification. Whether this bias is present for the recognition of facial expressions of emotion has not yet been addressed. We tested PS, a pure case of acquired prosopagnosia with bilateral occipitotemporal lesions anatomically sparing the regions dedicated for facial expression recognition. PS used mostly the mouth to recognize facial expressions even when the eye area was the most diagnostic. Moreover, PS directed most of her fixations towards the mouth. Her impairment was still largely present when she was instructed to look at the eyes, or when she was forced to look at them. Control participants showed a performance comparable to PS when only the lower part of the face was available. These observations suggest that the deficits observed in PS with static images are not solely attentional, but are rooted at the level of facial information use. This study corroborates neuroimaging findings suggesting that the Occipital Face Area might play a critical role in extracting facial features that are integrated for both face identification and facial expression recognition in static images. © The Author (2017). Published by Oxford University Press.

  2. Facial Emotions Recognition using Gabor Transform and Facial Animation Parameters with Neural Networks

    Science.gov (United States)

    Harit, Aditya; Joshi, J. C., Col; Gupta, K. K.

    2018-03-01

    The paper proposes an automatic facial emotion recognition algorithm comprising two main components: feature extraction and expression recognition. The algorithm applies a Gabor filter bank at fiducial points to extract facial expression features. The resulting Gabor transform magnitudes, together with 14 chosen FAPs (Facial Animation Parameters), compose the feature space. The algorithm operates in two stages. In the training stage, the system classifies all training expressions into six classes, one for each of the six target emotions. In the recognition stage, it applies the Gabor bank to a face image, locates the fiducial points, and feeds the resulting features to the trained neural architecture to recognize the emotion.
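
    As a hedged sketch of the feature-extraction step described above, the snippet below samples Gabor magnitude responses at fiducial points and concatenates them with 14 FAP values. The fiducial points and FAP values are assumed to come from an external landmark/FAP extractor (not shown), the filter-bank parameters are illustrative, and a generic MLP stands in for the paper's unspecified neural architecture.

        import cv2
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def gabor_bank(wavelengths=(4, 8), n_orientations=4):
            # Small Gabor filter bank: one kernel per (wavelength, orientation) pair.
            kernels = []
            for lambd in wavelengths:
                for k in range(n_orientations):
                    theta = k * np.pi / n_orientations
                    kernels.append(cv2.getGaborKernel((21, 21), 4.0, theta, lambd, 0.5, 0))
            return kernels

        def gabor_features(gray_face, fiducial_points, kernels):
            # Filter responses sampled at the (x, y) fiducial points; taking the
            # absolute value of the real-kernel response is a simplification of
            # the full complex Gabor magnitude.
            responses = [np.abs(cv2.filter2D(gray_face.astype(np.float32), cv2.CV_32F, k))
                         for k in kernels]
            return np.array([r[y, x] for r in responses for (x, y) in fiducial_points])

        def feature_vector(gray_face, fiducial_points, faps, kernels):
            # Concatenate Gabor magnitudes with the 14 FAP values.
            return np.concatenate([gabor_features(gray_face, fiducial_points, kernels),
                                   np.asarray(faps, dtype=np.float32)])

        # Training/recognition sketch (X_train, y_train, face, points, faps are
        # hypothetical inputs from a labelled corpus and a landmark/FAP extractor):
        # clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000).fit(X_train, y_train)
        # emotion = clf.predict([feature_vector(face, points, faps, gabor_bank())])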

  3. Test battery for measuring the perception and recognition of facial expressions of emotion

    Science.gov (United States)

    Wilhelm, Oliver; Hildebrandt, Andrea; Manske, Karsten; Schacht, Annekathrin; Sommer, Werner

    2014-01-01

    Despite the importance of perceiving and recognizing facial expressions in everyday life, there is no comprehensive test battery for the multivariate assessment of these abilities. As a first step toward such a compilation, we present 16 tasks that measure the perception and recognition of facial emotion expressions, and data illustrating each task's difficulty and reliability. The scoring of these tasks focuses on either the speed or accuracy of performance. A sample of 269 healthy young adults completed all tasks. In general, accuracy and reaction time measures for emotion-general scores showed acceptable and high estimates of internal consistency and factor reliability. Emotion-specific scores yielded lower reliabilities, yet high enough to encourage further studies with such measures. Analyses of task difficulty revealed that all tasks are suitable for measuring emotion perception and emotion recognition related abilities in normal populations. PMID:24860528

  4. Americans and Palestinians judge spontaneous facial expressions of emotion.

    Science.gov (United States)

    Kayyal, Mary H; Russell, James A

    2013-10-01

    The claim that certain emotions are universally recognized from facial expressions is based primarily on the study of expressions that were posed. The current study was of spontaneous facial expressions shown by aborigines in Papua New Guinea (Ekman, 1980); 17 faces claimed to convey one (or, in the case of blends, two) basic emotions and five faces claimed to show other universal feelings. For each face, participants rated the degree to which each of the 12 predicted emotions or feelings was conveyed. The modal choice for English-speaking Americans (n = 60), English-speaking Palestinians (n = 60), and Arabic-speaking Palestinians (n = 44) was the predicted label for only 4, 5, and 4, respectively, of the 17 faces for basic emotions, and for only 2, 2, and 2, respectively, of the 5 faces for other feelings. Observers endorsed the predicted emotion or feeling moderately often (65%, 55%, and 44%), but also denied it moderately often (35%, 45%, and 56%). They also endorsed more than one (or, for blends, two) label(s) in each face-on average, 2.3, 2.3, and 1.5 of basic emotions and 2.6, 2.2, and 1.5 of other feelings. There were both similarities and differences across culture and language, but the emotional meaning of a facial expression is not well captured by the predicted label(s) or, indeed, by any single label.

  5. Contribution of Prosody in Audio-Visual Integration to Emotional Perception of Virtual Characters

    Directory of Open Access Journals (Sweden)

    Ekaterina Volkova

    2011-10-01

    Full Text Available Recent technology provides us with realistic looking virtual characters. Motion capture and elaborate mathematical models supply data for natural looking, controllable facial and bodily animations. With the help of computational linguistics and artificial intelligence, we can automatically assign emotional categories to appropriate stretches of text for a simulation of those social scenarios where verbal communication is important. All this makes virtual characters a valuable tool for creation of versatile stimuli for research on the integration of emotion information from different modalities. We conducted an audio-visual experiment to investigate the differential contributions of emotional speech and facial expressions on emotion identification. We used recorded and synthesized speech as well as dynamic virtual faces, all enhanced for seven emotional categories. The participants were asked to recognize the prevalent emotion of paired faces and audio. Results showed that when the voice was recorded, the vocalized emotion influenced participants' emotion identification more than the facial expression. However, when the voice was synthesized, facial expression influenced participants' emotion identification more than vocalized emotion. Additionally, individuals did worse on identifying either the facial expression or vocalized emotion when the voice was synthesized. Our experimental method can help to determine how to improve synthesized emotional speech.

  6. Scale of Academic Emotion in Science Education: Development and Validation

    Science.gov (United States)

    Chiang, Wen-Wei; Liu, Chia-Ju

    2014-04-01

    Contemporary research into science education has generally been conducted from the perspective of 'conceptual change' in learning. This study sought to extend previous work by recognizing that human rationality can be influenced by the emotions generated by the learning environment and specific actions related to learning. Methods used in educational psychology were adopted to investigate the emotional experience of science students as affected by gender, teaching methods, feedback, and learning tasks. A multidisciplinary research approach combining brain activation measurement with multivariate psychological data theory was employed in the development of a questionnaire intended to reveal the academic emotions of university students in three situations: attending science class, learning scientific subjects, and problem solving. The reliability and validity of the scale was evaluated using exploratory and confirmatory factor analyses. Results revealed differences between the genders in positive-activating and positive-deactivating academic emotions in all three situations; however, these differences manifested primarily during preparation for Science tests. In addition, the emotions experienced by male students were more intense than those of female students. Finally, the negative-deactivating emotions associated with participation in Science tests were more intense than those experienced by simply studying science. This study provides a valuable tool with which to evaluate the emotional response of students to a range of educational situations.

  7. Unaltered emotional experience in Parkinson's disease: Pupillometry and behavioral evidence.

    Science.gov (United States)

    Schwartz, Rachel; Rothermich, Kathrin; Kotz, Sonja A; Pell, Marc D

    2018-04-01

    Recognizing emotions in others is a pivotal part of socioemotional functioning and plays a central role in social interactions. It has been shown that individuals suffering from Parkinson's disease (PD) are less accurate at identifying basic emotions such as fear, sadness, and happiness; however, previous studies have predominantly assessed emotion processing using unimodal stimuli (e.g., pictures) that do not reflect the complexity of real-world processing demands. Dynamic, naturalistic stimuli (e.g., movies) have been shown to elicit stronger subjective emotional experiences than unimodal stimuli and can facilitate emotion recognition. In this experiment, pupil measurements of PD patients and matched healthy controls (HC) were recorded while they watched short film clips. Participants' task was to identify the emotion elicited by each clip and rate the intensity of their emotional response. We explored (a) how PD affects subjective emotional experience in response to dynamic, ecologically valid film stimuli, and (b) whether there are PD-related changes in pupillary response, which may contribute to the differences in emotion processing reported in the literature. Behavioral results showed that identification of the felt emotion as well as perceived intensity varies by emotion, but no significant group effect was found. Pupil measurements revealed differences in dilation depending on the emotion evoked by the film clips (happy, tender, sadness, fear, and neutral) for both groups. Our results suggest that differences in emotional response may be negligible when PD patients and healthy controls are presented with dynamic, ecologically valid emotional stimuli. Given the limited data available on pupil response in PD, this study provides new evidence to suggest that the PD-related deficits in emotion processing reported in the literature may not translate to real-world differences in physiological or subjective emotion processing in early-stage PD patients.

  8. Mixtures of maximally entangled pure states

    Energy Technology Data Exchange (ETDEWEB)

    Flores, M.M., E-mail: mflores@nip.up.edu.ph; Galapon, E.A., E-mail: eric.galapon@gmail.com

    2016-09-15

    We study the conditions when mixtures of maximally entangled pure states remain entangled. We found that the resulting mixed state remains entangled when the number of entangled pure states to be mixed is less than or equal to the dimension of the pure states. For the latter case of mixing a number of pure states equal to their dimension, we found that the mixed state is entangled provided that the entangled pure states to be mixed are not equally weighted. We also found that one can restrict the set of pure states that one can mix from in order to ensure that the resulting mixed state is genuinely entangled. Also, we demonstrate how these results could be applied as a way to detect entanglement in mixtures of the entangled pure states with noise.
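
    A quick numerical illustration of the d = 2 case of this claim (a sketch, not the authors' derivation): mixing two Bell states with equal weights yields a separable state, while an unequal weighting leaves the mixture entangled, as witnessed by a negative eigenvalue of the partial transpose (the PPT criterion is conclusive for two qubits).

        import numpy as np

        def bell_state(kind):
            # The four maximally entangled two-qubit (d = 2) pure states.
            s = 1 / np.sqrt(2)
            return {"phi+": s * np.array([1, 0, 0, 1.0]),
                    "phi-": s * np.array([1, 0, 0, -1.0]),
                    "psi+": s * np.array([0, 1, 1, 0.0]),
                    "psi-": s * np.array([0, 1, -1, 0.0])}[kind]

        def partial_transpose(rho):
            # Partial transpose over the second qubit of a 4x4 density matrix.
            r = rho.reshape(2, 2, 2, 2)   # axes (i, k, j, l): row = 2i+k, col = 2j+l
            return r.transpose(0, 3, 2, 1).reshape(4, 4)

        def min_pt_eigenvalue(weights, kinds):
            # A negative eigenvalue of the partial transpose certifies entanglement.
            rho = sum(w * np.outer(bell_state(k), bell_state(k))
                      for w, k in zip(weights, kinds))
            return np.linalg.eigvalsh(partial_transpose(rho)).min()

        print(min_pt_eigenvalue([0.5, 0.5], ["phi+", "phi-"]))  # ~0.0  -> separable
        print(min_pt_eigenvalue([0.7, 0.3], ["phi+", "phi-"]))  # -0.2  -> entangled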

  9. The Relationship Between Family Functioning and Adolescent Depressive Symptoms: The Role of Emotional Clarity.

    Science.gov (United States)

    Freed, Rachel D; Rubenstein, Liza M; Daryanani, Issar; Olino, Thomas M; Alloy, Lauren B

    2016-03-01

    Emotion regulation has been implicated in the etiology of depression. A first step in adaptive emotion regulation involves emotional clarity, the ability to recognize and differentiate one's emotional experience. As family members are critical in facilitating emotional understanding and communication, we examined the impact of family functioning on adolescent emotional clarity and depressive symptoms. We followed 364 adolescents (ages 14-17; 52.5% female; 51.4 % Caucasian, 48.6% African American) and their mothers over 2 years (3 time points) and assessed emotional clarity, depressive symptoms, and adolescents' and mothers' reports of family functioning. Emotional clarity mediated the relationship between adolescents' reports of family functioning and depressive symptoms at all time points cross-sectionally, and according to mothers' reports of family functioning at Time 1 only. There was no evidence of longitudinal mediation for adolescents' or mothers' reports of family functioning. Thus, family functioning, emotional clarity, and depressive symptoms are strongly related constructs during various time points in adolescence, which has important implications for intervention, especially within the family unit.

  10. Yoga therapy for promoting emotional sensitivity in University students.

    Science.gov (United States)

    Ganpat, Tikhe Sham; Dash, Sasmita; Ramarao, Nagendra Hongasandra

    2014-01-01

    Students need emotional intelligence (EI) for academic excellence. There are three important psychological dimensions of EI: emotional sensitivity (ES), emotional maturity (EM) and emotional competency (EC), which motivate students to recognize truthfully, interpret honestly and handle tactfully the dynamics of their behavioral pattern. The study was designed to assess ES in students undergoing a yoga therapy program in the form of the yoga instructor's course (YIC) module. One hundred and eighty-four YIC students with a mean age of 25.77 ± 4.85 years participated in this 21-day study (a single-group pre-post design). The ES data were collected before (pre) and after (post) the YIC module using the Emotional Quotient test developed by Dr Dalip Singh and Dr N K Chadha. Means, standard deviations, the Kolmogorov-Smirnov test, and the Wilcoxon signed-rank test were used for analyzing the data with the help of SPSS 16. The data analysis showed a significant 3.63% increase (P < 0.01) in ES. The present study suggests that the YIC module can result in improvement of ES among university students, thus paving the way for their academic success. Additional well-designed studies are needed before a strong recommendation can be made.
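
    For readers who want to reproduce this kind of single-group pre-post analysis outside SPSS, a minimal sketch follows; the score values are invented for illustration, and only the test choices (a Kolmogorov-Smirnov normality check, the Wilcoxon signed-rank test, and the percent change of the mean) mirror the pipeline described above.

        import numpy as np
        from scipy import stats

        # Hypothetical pre/post emotional-sensitivity (ES) scores for a single-group
        # pre-post design; the real study analysed 184 students, these are made up.
        pre = np.array([62, 55, 70, 48, 66, 59, 73, 51])
        post = np.array([65, 57, 71, 50, 70, 60, 74, 55])
        diff = post - pre

        # Rough normality check of the paired differences (KS against a fitted normal).
        print(stats.kstest(diff, "norm", args=(diff.mean(), diff.std(ddof=1))))

        # Non-parametric paired comparison: Wilcoxon signed-rank test.
        print(stats.wilcoxon(pre, post))

        # Percent change in the group mean, analogous to the reported 3.63% increase.
        print(100 * (post.mean() - pre.mean()) / pre.mean())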

  11. Hooked on a feeling: emotional labor as an occupational hazard of the post-industrial age.

    Science.gov (United States)

    Andrews, Bonnie K; Karcz, Susan; Rosenberg, Beth

    2008-01-01

    Emotional labor is a subtle but serious occupational hazard that is likely to spread rapidly as the global service economy continues to grow. Emotional labor requires more than just acting friendly and being helpful to customers; the worker must manage his or her emotions to create a company-dictated experience for customers. The practice of emotional labor in an unsupportive work environment produces work-related stress, which has a wide range of potentially serious health effects. Though many employers do not acknowledge the existence of emotional labor, it is a real occupational hazard that may generate life-altering effects on physical and emotional health. While no official regulations or identification standards specify emotional labor as an occupational hazard, some guidelines exist regarding its outcome: occupational stress. Emotional labor should be recognized as an occupational hazard by the Occupational Safety and Health Administration (OSHA), but this hazard does not lend itself to regulation through standards. The business culture that demands its performance is questioned.

  12. Education technology with continuous real time monitoring of the current functional and emotional students' states

    Science.gov (United States)

    Alyushin, M. V.; Kolobashkina, L. V.

    2017-01-01

    An education technology with continuous monitoring of students' current functional and emotional states is proposed. Applying this technology makes practice more effective through informed planning of the training load. For monitoring students' current functional and emotional states, non-contact remote technologies for registering a person's bioparameters are recommended. These technologies record and process the main bioparameters in real time and in a purely passive mode. Experimental testing of this technology has confirmed its effectiveness.

  13. Encoding conditions affect recognition of vocally expressed emotions across cultures.

    Science.gov (United States)

    Jürgens, Rebecca; Drolet, Matthis; Pirow, Ralph; Scheiner, Elisabeth; Fischer, Julia

    2013-01-01

    Although the expression of emotions in humans is considered to be largely universal, cultural effects contribute to both emotion expression and recognition. To disentangle the interplay between these factors, play-acted and authentic (non-instructed) vocal expressions of emotions were used, on the assumption that cultural effects may contribute differentially to the recognition of staged and spontaneous emotions. Speech tokens depicting four emotions (anger, sadness, joy, fear) were obtained from German radio archives and re-enacted by professional actors, and presented to 120 participants from Germany, Romania, and Indonesia. Participants in all three countries were poor at distinguishing between play-acted and spontaneous emotional utterances (58.73% correct on average with only marginal cultural differences). Nevertheless, authenticity influenced emotion recognition: across cultures, anger was recognized more accurately when play-acted (z = 15.06, p emotions, indicating a moderate in-group advantage. There was no difference between Romanian and Indonesian subjects in the overall emotion recognition. Differential cultural effects became particularly apparent in terms of differential biases in emotion attribution. While all participants labeled play-acted expressions as anger more frequently than expected, German participants exhibited a further bias toward choosing anger for spontaneous stimuli. In contrast to the German sample, Romanian and Indonesian participants were biased toward choosing sadness. These results support the view that emotion recognition rests on a complex interaction of human universals and cultural specificities. Whether and in which way the observed biases are linked to cultural differences in self-construal remains an issue for further investigation.

  14. On the Role of Crossmodal Prediction in Audiovisual Emotion Perception

    Directory of Open Access Journals (Sweden)

    Sarah eJessen

    2013-07-01

    Full Text Available Humans rely on multiple sensory modalities to determine the emotional state of others. In fact, such multisensory perception may be one of the mechanisms explaining the ease and efficiency by which others’ emotions are recognized. But how and when exactly do the different modalities interact? One aspect in multisensory perception that has received increasing interest in recent years is the concept of crossmodal prediction. In emotion perception, as in most other settings, visual information precedes the auditory one. Thereby, leading in visual information can facilitate subsequent auditory processing. While this mechanism has often been described in audiovisual speech perception, it has not been addressed so far in audiovisual emotion perception. Based on the current state of the art in (a) crossmodal prediction and (b) multisensory emotion perception research, we propose that it is essential to consider the former in order to fully understand the latter. Focusing on electroencephalographic (EEG) and magnetoencephalographic (MEG) studies, we provide a brief overview of the current research in both fields. In discussing these findings, we suggest that emotional visual information may allow for a more reliable prediction of auditory information compared to non-emotional visual information. In support of this hypothesis, we present a re-analysis of a previous data set that shows an inverse correlation between the N1 response in the EEG and the duration of visual emotional but not non-emotional information. If the assumption that emotional content allows for more reliable predictions can be corroborated in future studies, crossmodal prediction is a crucial factor in our understanding of multisensory emotion perception.

  15. On the role of crossmodal prediction in audiovisual emotion perception.

    Science.gov (United States)

    Jessen, Sarah; Kotz, Sonja A

    2013-01-01

    Humans rely on multiple sensory modalities to determine the emotional state of others. In fact, such multisensory perception may be one of the mechanisms explaining the ease and efficiency by which others' emotions are recognized. But how and when exactly do the different modalities interact? One aspect in multisensory perception that has received increasing interest in recent years is the concept of cross-modal prediction. In emotion perception, as in most other settings, visual information precedes the auditory information. Thereby, leading in visual information can facilitate subsequent auditory processing. While this mechanism has often been described in audiovisual speech perception, so far it has not been addressed in audiovisual emotion perception. Based on the current state of the art in (a) cross-modal prediction and (b) multisensory emotion perception research, we propose that it is essential to consider the former in order to fully understand the latter. Focusing on electroencephalographic (EEG) and magnetoencephalographic (MEG) studies, we provide a brief overview of the current research in both fields. In discussing these findings, we suggest that emotional visual information may allow more reliable predicting of auditory information compared to non-emotional visual information. In support of this hypothesis, we present a re-analysis of a previous data set that shows an inverse correlation between the N1 EEG response and the duration of visual emotional, but not non-emotional information. If the assumption that emotional content allows more reliable predicting can be corroborated in future studies, cross-modal prediction is a crucial factor in our understanding of multisensory emotion perception.

  16. Dissociation between Emotional Remapping of Fear and Disgust in Alexithymia.

    Directory of Open Access Journals (Sweden)

    Cristina Scarpazza

    Full Text Available There is growing evidence that individuals are able to understand others' emotions because they "embody" them, i.e., re-experience them by activating a representation of the observed emotion within their own body. One way to study emotion embodiment is provided by a multisensory stimulation paradigm called emotional visual remapping of touch (eVRT), in which the degree of embodiment/remapping of emotions is measured as enhanced detection of near-threshold tactile stimuli on one's own face while viewing different emotional facial expressions. Here, we measured remapping of fear and disgust in participants with low (LA) and high (HA) levels of alexithymia, a personality trait characterized by a difficulty in recognizing emotions. The results showed that fear is remapped in LA but not in HA participants, while disgust is remapped in HA but not in LA participants. To investigate the hypothesis that HA individuals might exhibit increased responses to emotional stimuli that produce heightened physical and visceral sensations, i.e., disgust, in a second experiment we investigated participants' interoceptive abilities and the link between interoception and emotional modulations of VRT. The results showed that participants' disgust modulations of VRT correlated with their ability to perceive bodily signals. We suggest that the emotional profile of HA individuals on the eVRT task could be related to their abnormal tendency to focus on their internal bodily signals, and to experience emotions in a "physical" way. Finally, we speculate that these results in HA could be due to an enhancement of insular activity during the perception of disgusted faces.

  17. Facial emotional recognition in schizophrenia: preliminary results of the virtual reality program for facial emotional recognition

    Directory of Open Access Journals (Sweden)

    Teresa Souto

    2013-01-01

    Full Text Available BACKGROUND: Significant deficits in emotional recognition and social perception characterize patients with schizophrenia and have a direct negative impact both on inter-personal relationships and on social functioning. Virtual reality, as a methodological resource, might have a high potential for assessment and training of skills in people suffering from mental illness. OBJECTIVES: To present preliminary results of a facial emotional recognition assessment designed for patients with schizophrenia, using 3D avatars and virtual reality. METHODS: Presentation of 3D avatars which reproduce images developed with the FaceGen® software and integrated in a three-dimensional virtual environment. Each avatar was presented to a group of 12 patients with schizophrenia and a reference group of 12 subjects without psychiatric pathology. RESULTS: The results show that the facial emotions of happiness and anger are better recognized by both groups and that the major difficulties arise in fear and disgust recognition. Frontal alpha electroencephalography variations were found during the presentation of anger and disgust stimuli among patients with schizophrenia. DISCUSSION: The evaluation module of the developed program can be of added value for both patient and therapist, allowing task execution in a non-anxiogenic environment that is nonetheless similar to the actual experience.

  18. The autistic child's appraisal of expressions of emotion: a further study.

    Science.gov (United States)

    Hobson, R P

    1986-09-01

    Autistic and matched non-autistic retarded children were selected for their ability to recognize the correspondence between schematic drawings and videotaped scenes involving people. The subjects of both groups were able to choose schematic drawings of gestures for a person's gestures of emotion enacted on videotape. However, the autistic children were significantly impaired in choosing which of the drawings of gestures should 'go with' videotaped vocalizations and facial expressions characteristic of four emotional states. The results were found to be consistent with results from a previous, related study in which the same subjects had chosen drawn or photographed faces to indicate their judgements of the same videotapes of emotional expression. It is suggested that these findings reflect an important aspect of autistic children's social disability.

  19. Le difficoltà emotive nello sviluppo: il caso dell’alessitimia e dell’autolesionismo. Dalla ricerca psicologica e neuroscientifica alla psicoterapia

    Directory of Open Access Journals (Sweden)

    Antonella Marchetti

    2013-12-01

    Full Text Available Emotional Developmental Disorders: The Case of Alexithymia and Self-harm. From Psychological and Neuroscientific Research to Psychotherapy - The paper examines the contribution that psychological and neuroscientific research on emotional developmental problems may offer for psychotherapy. Alexithymia – i.e. the inability to recognize and express one’s own emotions – and self-harm in adolescence are considered as examples of emotional problems. Starting from this analysis, we address some currently debated theoretical and methodological issues in psychotherapy.

  20. Brain correlates of musical and facial emotion recognition: evidence from the dementias.

    Science.gov (United States)

    Hsieh, S; Hornberger, M; Piguet, O; Hodges, J R

    2012-07-01

    The recognition of facial expressions of emotion is impaired in semantic dementia (SD) and is associated with right-sided brain atrophy in areas known to be involved in emotion processing, notably the amygdala. Whether patients with SD also experience difficulty recognizing emotions conveyed by other media, such as music, is unclear. Prior studies have used excerpts of known music from classical or film repertoire but not unfamiliar melodies designed to convey distinct emotions. Patients with SD (n = 11), Alzheimer's disease (n = 12) and healthy control participants (n = 20) underwent tests of emotion recognition in two modalities: unfamiliar musical tunes and unknown faces as well as volumetric MRI. Patients with SD were most impaired with the recognition of facial and musical emotions, particularly for negative emotions. Voxel-based morphometry showed that the labelling of emotions, regardless of modality, correlated with the degree of atrophy in the right temporal pole, amygdala and insula. The recognition of musical (but not facial) emotions was also associated with atrophy of the left anterior and inferior temporal lobe, which overlapped with regions correlating with standardized measures of verbal semantic memory. These findings highlight the common neural substrates supporting the processing of emotions by facial and musical stimuli but also indicate that the recognition of emotions from music draws upon brain regions that are associated with semantics in language. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Analysis of physiological signals for recognition of boredom, pain, and surprise emotions.

    Science.gov (United States)

    Jang, Eun-Hye; Park, Byoung-Jun; Park, Mi-Sook; Kim, Sang-Hyeob; Sohn, Jin-Hun

    2015-06-18

    The aim of the study was to examine the physiological differences among boredom, pain, and surprise and to propose approaches for emotion recognition based on physiological signals. The three emotions were induced through the presentation of emotional stimuli, and electrocardiography (ECG), electrodermal activity (EDA), skin temperature (SKT), and photoplethysmography (PPG) were measured as physiological signals to collect a dataset from 217 participants while they experienced the emotions. Twenty-seven physiological features are extracted from the signals to classify the three emotions. Discriminant function analysis (DFA), as a statistical method, and five machine learning algorithms (linear discriminant analysis (LDA), classification and regression trees (CART), self-organizing map (SOM), the Naïve Bayes algorithm, and support vector machine (SVM)) are used for classifying the emotions. The results show that the physiological responses differ significantly among the emotions in heart rate (HR), skin conductance level (SCL), skin conductance response (SCR), mean skin temperature (meanSKT), blood volume pulse (BVP), and pulse transit time (PTT), and that the highest recognition accuracy of 84.7% is obtained using DFA. This study demonstrates the differences among boredom, pain, and surprise and identifies the best emotion recognizer for classifying the three emotions from physiological signals.
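    As a rough illustration of the classification step described above, the sketch below (not the authors' code) fits LDA and an SVM to a placeholder feature matrix standing in for the 27 physiological features and reports cross-validated accuracy; the feature values and labels are simulated assumptions.

```python
# Minimal sketch: classify three emotions from physiological feature vectors.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(217, 27))      # placeholder for HR, SCL, SCR, SKT, BVP, PTT features
y = rng.integers(0, 3, size=217)    # 0 = boredom, 1 = pain, 2 = surprise

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", make_pipeline(StandardScaler(), SVC(kernel="rbf")))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean 5-fold accuracy = {acc:.2f}")
```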

  2. Emotion Analysis of Telephone Complaints from Customer Based on Affective Computing.

    Science.gov (United States)

    Gong, Shuangping; Dai, Yonghui; Ji, Jun; Wang, Jinzhao; Sun, Hai

    2015-01-01

    Customer complaints have become important feedback for modern enterprises seeking to improve their product and service quality as well as customer loyalty. As one of the most commonly used channels for customer complaints, telephone communication carries rich emotional information in speech, which provides valuable resources for perceiving the customer's satisfaction and studying complaint handling skills. This paper studies the characteristics of telephone complaint speeches and proposes an analysis method based on affective computing technology, which can recognize the dynamic changes of customer emotions from the conversations between the service staff and the customer. The recognition process includes speaker recognition, emotional feature parameter extraction, and dynamic emotion recognition. Experimental results show that this method is effective and can reach high recognition rates of happy and angry states. It has been successfully applied to operation quality and service administration in a telecom and Internet service company.

  3. Emotion Analysis of Telephone Complaints from Customer Based on Affective Computing

    Directory of Open Access Journals (Sweden)

    Shuangping Gong

    2015-01-01

    Full Text Available Customer complaints have become important feedback for modern enterprises seeking to improve their product and service quality as well as customer loyalty. As one of the most commonly used channels for customer complaints, telephone communication carries rich emotional information in speech, which provides valuable resources for perceiving the customer's satisfaction and studying complaint handling skills. This paper studies the characteristics of telephone complaint speeches and proposes an analysis method based on affective computing technology, which can recognize the dynamic changes of customer emotions from the conversations between the service staff and the customer. The recognition process includes speaker recognition, emotional feature parameter extraction, and dynamic emotion recognition. Experimental results show that this method is effective and can reach high recognition rates of happy and angry states. It has been successfully applied to operation quality and service administration in a telecom and Internet service company.
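    The two records above describe a pipeline of feature extraction from call audio followed by emotion classification. The sketch below illustrates that general idea with MFCC summary features and an SVM; the synthetic waveforms, feature choice and labels are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch: acoustic features from (synthetic) call segments, then an SVM.
import numpy as np
import librosa
from sklearn.svm import SVC

sr = 16000
rng = np.random.default_rng(0)

def mfcc_summary(y, sr=sr, n_mfcc=13):
    """Summarize one utterance as the mean and std of its MFCCs."""
    m = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([m.mean(axis=1), m.std(axis=1)])

# stand-ins for speaker-separated call segments (real input would be recorded audio)
happy = [rng.normal(size=sr) * np.linspace(1.0, 0.2, sr) for _ in range(5)]
angry = [rng.normal(size=sr) * np.linspace(0.2, 1.0, sr) for _ in range(5)]

X = np.vstack([mfcc_summary(y) for y in happy + angry])
labels = [1] * 5 + [2] * 5                      # toy labels: 1 = happy, 2 = angry

clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(mfcc_summary(rng.normal(size=sr)).reshape(1, -1)))
```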

  4. Evolving Judgments of Terror Risks: Foresight, Hindsight, and Emotion--A Reanalysis

    Science.gov (United States)

    Fischhoff, Baruch; Gonzalez, Roxana M.; Lerner, Jennifer S.; Small, Deborah A.

    2012-01-01

    The authors examined the evolution of cognitive and emotional responses to terror risks for a nationally representative sample of Americans between late 2001 and late 2002. Respondents' risk judgments changed in ways consistent with their reported personal experiences. However, they did not recognize these changes, producing hindsight bias in…

  5. Helping Students with Emotional and Behavioral Disorders Solve Mathematics Word Problems

    Science.gov (United States)

    Alter, Peter

    2012-01-01

    The author presents a strategy for helping students with emotional and behavioral disorders become more proficient at solving math word problems. Math word problems require students to go beyond simple computation in mathematics (e.g., adding, subtracting, multiplying, and dividing) and use higher level reasoning that includes recognizing relevant…

  6. The need to nurse the nurse: emotional labor in neonatal intensive care.

    Science.gov (United States)

    Cricco-Lizza, Roberta

    2014-05-01

    In this 14-month ethnographic study, I examined the emotional labor and coping strategies of 114 level-4 neonatal intensive care unit (NICU) nurses. Emotional labor was an underrecognized component in the care of vulnerable infants and families. The nature of this labor was contextualized within complex personal, professional, and organizational layers of demand on the emotions of NICU nurses. Coping strategies included talking with the sisterhood of nurses, being a super nurse, using social talk and humor, taking breaks, offering flexible aid, withdrawing from emotional pain, transferring out of the NICU, attending memorial services, and reframing loss to find meaning in work. The organization had strong staffing, but emotional labor was not recognized, supported, or rewarded. The findings can contribute to the development of interventions to nurse the nurse, and ultimately to facilitate NICU nurses' nurturance of stressed families. These have implications for staff retention, job satisfaction, and delivery of care.

  7. Emotional prosody perception and its association with pragmatic language in school-aged children with high-function autism.

    Science.gov (United States)

    Wang, Jia-En; Tsao, Feng-Ming

    2015-02-01

    Emotional prosody perception is essential for social communication, but it is still an open issue whether children with high-function autism (HFA) exhibit any prosodic perception deficits or experience selective impairments in recognizing the prosody of positive emotions. Moreover, the associations between prosody perception, pragmatic language, and social adaptation in children with HFA have not been fully explored. This study investigated whether emotional prosody perception for words and sentences in children with HFA (n=25, 6-11 years of age) differed from age-matched, typically developing children (TD, n=25) when presented with an emotional prosody identification task. The Children's Communication Checklist and Vineland Adaptive Behavior Scale were used to assess pragmatic and social adaptation abilities. Results show that children with HFA performed poorer than TD children in identifying happy prosody in both emotionally neutral and relevant utterances. In contrast, children with HFA did not exhibit any deficits in identifying sad and angry prosody. Results of correlation analyses revealed a positive association between happy prosody identification and pragmatic function. The findings indicate that school-aged children with HFA experience difficulties in recognizing happy prosody, and that this limitation in prosody perception is associated with their pragmatic and social adaptation performances. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Emotional collectives: How groups shape emotions and emotions shape groups.

    Science.gov (United States)

    van Kleef, Gerben A; Fischer, Agneta H

    2016-01-01

    Group settings are epicentres of emotional activity. Yet, the role of emotions in groups is poorly understood. How do group-level phenomena shape group members' emotional experience and expression? How are emotional expressions recognised, interpreted and shared in group settings? And how do such expressions influence the emotions, cognitions and behaviours of fellow group members and outside observers? To answer these and other questions, we draw on relevant theoretical perspectives (e.g., intergroup emotions theory, social appraisal theory and emotions as social information theory) and recent empirical findings regarding the role of emotions in groups. We organise our review according to two overarching themes: how groups shape emotions and how emotions shape groups. We show how novel empirical approaches break important new ground in uncovering the role of emotions in groups. Research on emotional collectives is thriving and constitutes a key to understanding the social nature of emotions.

  9. Unforgettable film music: the role of emotion in episodic long-term memory for music.

    Science.gov (United States)

    Eschrich, Susann; Münte, Thomas F; Altenmüller, Eckart O

    2008-05-28

    Specific pieces of music can elicit strong emotions in listeners and, possibly in connection with these emotions, can be remembered even years later. However, episodic memory for emotional music compared with less emotional music has not yet been examined. We investigated whether emotional music is remembered better than less emotional music. Also, we examined the influence of musical structure on memory performance. Recognition of 40 musical excerpts was investigated as a function of arousal, valence, and emotional intensity ratings of the music. In the first session the participants judged valence and arousal of the musical pieces. One week later, participants listened to the 40 old and 40 new musical excerpts randomly interspersed and were asked to make an old/new decision as well as to indicate arousal and valence of the pieces. Musical pieces that were rated as very positive were recognized significantly better. Musical excerpts rated as very positive are remembered better. Valence seems to be an important modulator of episodic long-term memory for music. Evidently, strong emotions related to the musical experience facilitate memory formation and retrieval.

  10. Emotional Intelligence in Language Instruction in Oman: The Missing Link?

    Science.gov (United States)

    Balasubramanian, Chandrika; Al-Mahrooqi, Rahma

    2016-01-01

    The field of English Language Teaching (ELT) has long sought to identify traits of good language learners, in an effort to teach these traits to less successful language learners (Rubin, 1975). Emotional Intelligence has recently come to the forefront of research on language learning and teaching, and is now increasingly recognized as an important…

  11. The role of emotions in spatial prisoner's dilemma game with voluntary participation

    Science.gov (United States)

    Wang, Lu; Ye, Shun-Qiang; Cheong, Kang Hao; Bao, Wei; Xie, Neng-gang

    2018-01-01

    This paper develops an extended emotional model in the voluntary prisoner's dilemma game. It fills a gap in the traditional imitation mechanism by assuming that players do not simply imitate pure strategies, but instead imitate the emotional profiles of one another. The relationship between emotional profiles and strategies is constructed, and Monte Carlo simulations are performed on a square lattice. Simulation results reveal that with an increase in the temptation parameter T (1 ⩽ T ⩽ 2), an order-chaos-order transition occurs. When T is around 1.2, we find that a bifurcation occurs. From a social system perspective, as T increases, the system changes from a benign one (respect for successful people and sympathy towards the weak) to a vicious one (bullying the weak and fearing the strong).
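    For readers unfamiliar with the simulation setup, the following minimal Monte Carlo sketch runs a standard spatial prisoner's dilemma on a square lattice with imitation dynamics. The paper's model additionally includes voluntary participation (a loner strategy) and imitation of emotional profiles rather than pure strategies; those extensions, and the exact update rule, are not reproduced here.

```python
# Minimal spatial prisoner's dilemma on a square lattice (weak PD: R=1, P=S=0, T=temptation).
import numpy as np

L, T, steps = 50, 1.2, 200                # lattice size, temptation, Monte Carlo sweeps
rng = np.random.default_rng(1)
strat = rng.integers(0, 2, size=(L, L))   # 1 = cooperate, 0 = defect

def payoff(s):
    """Accumulated payoff against the four von Neumann neighbours."""
    p = np.zeros_like(s, dtype=float)
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        n = np.roll(s, shift, axis=(0, 1))
        p += np.where(s == 1, n * 1.0, n * T)   # cooperator earns R vs C, defector earns T vs C
    return p

for _ in range(steps):
    p = payoff(strat)
    # simplified synchronous update: everyone compares with one random neighbour direction
    shift = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(4)]
    ns, npay = np.roll(strat, shift, axis=(0, 1)), np.roll(p, shift, axis=(0, 1))
    adopt = rng.random((L, L)) < 1.0 / (1.0 + np.exp((p - npay) / 0.1))   # Fermi rule
    strat = np.where(adopt, ns, strat)

print("final cooperation fraction:", strat.mean())
```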

  12. Computation of emotions in man and machines.

    Science.gov (United States)

    Robinson, Peter; el Kaliouby, Rana

    2009-12-12

    The importance of emotional expression as part of human communication has been understood since Aristotle, and the subject has been explored scientifically since Charles Darwin and others in the nineteenth century. Advances in computer technology now allow machines to recognize and express emotions, paving the way for improved human-computer and human-human communications. Recent advances in psychology have greatly improved our understanding of the role of affect in communication, perception, decision-making, attention and memory. At the same time, advances in technology mean that it is becoming possible for machines to sense, analyse and express emotions. We can now consider how these advances relate to each other and how they can be brought together to influence future research in perception, attention, learning, memory, communication, decision-making and other applications. The computation of emotions includes both recognition and synthesis, using channels such as facial expressions, non-verbal aspects of speech, posture, gestures, physiology, brain imaging and general behaviour. The combination of new results in psychology with new techniques of computation is leading to new technologies with applications in commerce, education, entertainment, security, therapy and everyday life. However, there are important issues of privacy and personal expression that must also be considered.

  13. Mapping correspondence between facial mimicry and emotion recognition in healthy subjects.

    Science.gov (United States)

    Ponari, Marta; Conson, Massimiliano; D'Amico, Nunzia Pina; Grossi, Dario; Trojano, Luigi

    2012-12-01

    We aimed at verifying the hypothesis that facial mimicry is causally and selectively involved in emotion recognition. For this purpose, in Experiment 1, we explored the effect of tonic contraction of muscles in upper or lower half of participants' face on their ability to recognize emotional facial expressions. We found that the "lower" manipulation specifically impaired recognition of happiness and disgust, the "upper" manipulation impaired recognition of anger, while both manipulations affected recognition of fear; recognition of surprise and sadness were not affected by either blocking manipulations. In Experiment 2, we verified whether emotion recognition is hampered by stimuli in which an upper or lower half-face showing an emotional expression is combined with a neutral half-face. We found that the neutral lower half-face interfered with recognition of happiness and disgust, whereas the neutral upper half impaired recognition of anger; recognition of fear and sadness was impaired by both manipulations, whereas recognition of surprise was not affected by either manipulation. Taken together, the present findings support simulation models of emotion recognition and provide insight into the role of mimicry in comprehension of others' emotional facial expressions. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  14. Maternal Attachment Representation and Neurophysiological Processing during the Perception of Infants' Emotional Expressions.

    Directory of Open Access Journals (Sweden)

    Rainer Leyh

    Full Text Available The perception of infant emotions is an integral part of sensitive caregiving within the mother-child relationship, a maternal ability which develops in mothers during their own attachment history. In this study we address the association between maternal attachment representation and brain activity underlying the perception of infant emotions. Event-related potentials (ERPs) of 32 primiparous mothers were assessed during a three-stimulus oddball task presenting negative, positive and neutral emotion expressions of infants as target, deviant or standard stimuli. Attachment representation was assessed with the Adult Attachment Interview during pregnancy. Securely attached mothers recognized emotions of infants more accurately than insecurely attached mothers. ERPs yielded amplified N170 amplitudes for insecure mothers when focusing on negative infant emotions. Secure mothers showed enlarged P3 amplitudes to target emotion expressions of infants compared to insecure mothers, especially within conditions with frequent negative infant emotions. In these conditions, P3 latencies were prolonged in insecure mothers. In summary, maternal attachment representation was found to be associated with brain activity during the perception of infant emotions. This further clarifies psychological mechanisms contributing to maternal sensitivity.

  15. Reasoning strategies modulate gender differences in emotion processing.

    Science.gov (United States)

    Markovits, Henry; Trémolière, Bastien; Blanchette, Isabelle

    2018-01-01

    The dual strategy model of reasoning has proposed that people's reasoning can be understood as a combination of two different ways of processing information related to problem premises: a counterexample strategy that examines information for explicit potential counterexamples and a statistical strategy that uses associative access to generate a likelihood estimate of putative conclusions. Previous studies have examined this model in the context of basic conditional reasoning tasks. However, the information processing distinction that underlies the dual strategy model can be seen as a basic description of differences in reasoning (similar to that described by many general dual process models of reasoning). In two studies, we examine how these differences in reasoning strategy may relate to processing very different information; specifically, we focus on previously observed gender differences in processing negative emotions. Study 1 examined the intensity of emotional reactions to a film clip inducing primarily negative emotions. Study 2 examined the speed at which participants determine the emotional valence of sequences of negative images. In both studies, no gender differences were observed among participants using a counterexample strategy. Among participants using a statistical strategy, females produced significantly stronger emotional reactions than males (in Study 1) and were faster to recognize the valence of negative images than males (in Study 2). Results show that the processing distinction underlying the dual strategy model of reasoning generalizes to the processing of emotions. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Neurons in the human amygdala selective for perceived emotion

    Science.gov (United States)

    Wang, Shuo; Tudusciuc, Oana; Mamelak, Adam N.; Ross, Ian B.; Adolphs, Ralph; Rutishauser, Ueli

    2014-01-01

    The human amygdala plays a key role in recognizing facial emotions and neurons in the monkey and human amygdala respond to the emotional expression of faces. However, it remains unknown whether these responses are driven primarily by properties of the stimulus or by the perceptual judgments of the perceiver. We investigated these questions by recording from over 200 single neurons in the amygdalae of 7 neurosurgical patients with implanted depth electrodes. We presented degraded fear and happy faces and asked subjects to discriminate their emotion by button press. During trials where subjects responded correctly, we found neurons that distinguished fear vs. happy emotions as expressed by the displayed faces. During incorrect trials, these neurons indicated the patients’ subjective judgment. Additional analysis revealed that, on average, all neuronal responses were modulated most by increases or decreases in response to happy faces, and driven predominantly by judgments about the eye region of the face stimuli. Following the same analyses, we showed that hippocampal neurons, unlike amygdala neurons, only encoded emotions but not subjective judgment. Our results suggest that the amygdala specifically encodes the subjective judgment of emotional faces, but that it plays less of a role in simply encoding aspects of the image array. The conscious percept of the emotion shown in a face may thus arise from interactions between the amygdala and its connections within a distributed cortical network, a scheme also consistent with the long response latencies observed in human amygdala recordings. PMID:24982200

  17. A change in strategy: Static emotion recognition in Malaysian Chinese

    Directory of Open Access Journals (Sweden)

    Chrystalle B.Y. Tan

    2015-12-01

    Full Text Available Studies have shown that while East Asians focused on the center of the face to recognize identities, participants adapted their strategy by focusing more on the eyes to identify emotions, suggesting that the eyes may contain salient information pertaining to emotional state in Eastern cultures. However, Western Caucasians employ the same strategy by moving between the eyes and mouth to identify both identities and emotions. Malaysian Chinese have been shown to focus on the eyes and nose more than the mouth during face recognition task, which represents an intermediate between Eastern and Western looking strategies. The current study examined whether Malaysian Chinese continue to employ an intermediate strategy or shift towards an Eastern or Western pattern (by fixating more on the eyes or mouth, respectively) during an emotion recognition task. Participants focused more on the eyes, followed by the nose then mouth. Directing attention towards the eye region resulted in better recognition of certain own- than other-race emotions. Although the fixation patterns appear similar for both tasks, further analyses showed that fixations on the eyes were reduced whereas fixations on the nose and mouth were increased during emotion recognition, indicating that participants adapt looking strategies based on their aims.

  18. Semantic Analysis of Learners’ Emotional Tendencies on Online MOOC Education

    Directory of Open Access Journals (Sweden)

    Ling Wang

    2018-06-01

    Full Text Available As a new education product in the information age, Massive Open Online Courses (MOOCs) command momentous public attention for their unexpected rise and flexible application. However, the striking contrast between the high rate of registration and the low rate of completion has put their development into a bottleneck. In this paper, we present a semantic analysis model (SMA) to track the emotional tendencies of learners in order to analyze the acceptance of the courses based on big data from homework completion, comments, forums and other real-time update information on the MOOC platforms. Through emotional quantification and machine learning calculations, graduation probability can be predicted for different stages of learning in real time. Especially for learners with emotional tendencies, customized instruction could be made in order to improve completion and graduation rates. Furthermore, we classified the learners into four categories according to course participation time series and emotional states. In the experiments, we made a comprehensive evaluation of the students' overall learning status by kinds of learners and emotional tendencies. Our proposed method can effectively recognize learners' emotional tendencies by semantic analysis, providing an effective solution for MOOC personalized teaching, which can help achieve education for sustainable development.
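    The sketch below illustrates the general idea behind such a model: quantify each learner's emotional tendency from platform text and combine it with behavioural features to predict completion. The toy lexicon, features and data are illustrative assumptions, not the authors' SMA model.

```python
# Hedged sketch: lexicon-based emotion score + behavioural feature -> completion prediction.
import numpy as np
from sklearn.linear_model import LogisticRegression

POSITIVE = {"great", "clear", "helpful", "enjoy"}
NEGATIVE = {"confusing", "boring", "hard", "quit"}

def emotion_score(comment: str) -> float:
    """Positive minus negative word count of one forum comment."""
    words = comment.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# toy learners: (emotion score of comments, homework completion rate) -> graduated?
X = np.array([[emotion_score("great and clear lectures"), 0.9],
              [emotion_score("confusing and boring, I may quit"), 0.2],
              [emotion_score("helpful forum, enjoy it"), 0.7],
              [emotion_score("hard and confusing"), 0.4]])
y = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[emotion_score("clear but hard"), 0.6]])[0, 1])  # graduation probability
```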

  19. Children's Recognition of Emotional Facial Expressions Through Photographs and Drawings.

    Science.gov (United States)

    Brechet, Claire

    2017-01-01

    The author's purpose was to examine children's recognition of emotional facial expressions, by comparing two types of stimulus: photographs and drawings. The author aimed to investigate whether drawings could be considered a more evocative material than photographs, as a function of age and emotion. Five- and 7-year-old children were presented with photographs and drawings displaying facial expressions of 4 basic emotions (i.e., happiness, sadness, anger, and fear) and were asked to perform a matching task by pointing to the face corresponding to the target emotion labeled by the experimenter. The photographs we used were selected from the Radboud Faces Database and the drawings were designed on the basis of both the facial components involved in the expression of these emotions and the graphic cues children tend to use when asked to depict these emotions in their own drawings. Our results show that drawings are better recognized than photographs for sadness, anger, and fear (with no difference for happiness, due to a ceiling effect), and that the difference between the 2 types of stimuli tends to be more pronounced for 5-year-olds than for 7-year-olds. These results are discussed in view of their implications, both for future research and for practical application.

  20. Recognition of a Baby's Emotional Cry towards Robotics Baby Caregiver

    Directory of Open Access Journals (Sweden)

    Shota Yamamoto

    2013-02-01

    Full Text Available We developed a method for pattern recognition of a baby's emotions (discomfort, hunger, or sleepiness) expressed in the baby's cries. A 32-dimensional fast Fourier transform is performed on sound form clips, detected by our previously reported method and used as training data. The power of the sound form judged as a silent region is subtracted from each power of the frequency elements. The power of each frequency element after the subtraction is treated as one element of the feature vector. We perform principal component analysis (PCA) on the feature vectors of the training data. The emotion of the baby is recognized by the nearest neighbor criterion applied to the feature vector obtained from the test data of sound form clips, after projecting the feature vector onto the PCA space derived from the training data. Then, the emotion with the highest frequency among the recognition results for a sound form clip is judged as the emotion expressed by the baby's cry. We successfully applied the proposed method to pattern recognition of a baby's emotions. The present investigation concerns the first stage of the development of a robotic baby caregiver that has the ability to detect a baby's emotions. In this first stage, we have developed a method for detecting a baby's emotions. We expect that the proposed method could be used in robots that can help take care of babies.
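    A minimal sketch of the described pipeline follows: 32 FFT power features per clip, PCA fitted on the training features, and nearest-neighbour classification in the PCA space. The clips and labels are simulated stand-ins, and the silent-region subtraction step is omitted.

```python
# Minimal sketch: FFT power features -> PCA -> 1-nearest-neighbour emotion label.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def fft_power_features(clip, n_fft=64):
    """Power of the first 32 FFT bins of a 1-D sound clip."""
    spectrum = np.fft.rfft(clip[:n_fft], n=n_fft)
    return np.abs(spectrum[:32]) ** 2

# placeholder training clips and labels: 0 = discomfort, 1 = hunger, 2 = sleepiness
train_clips = rng.normal(size=(30, 64))
train_labels = rng.integers(0, 3, size=30)

X_train = np.array([fft_power_features(c) for c in train_clips])
pca = PCA(n_components=5).fit(X_train)
knn = KNeighborsClassifier(n_neighbors=1).fit(pca.transform(X_train), train_labels)

test_clip = rng.normal(size=64)
print(knn.predict(pca.transform([fft_power_features(test_clip)])))
```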

  1. Perceptions of Emotion from Facial Expressions are Not Culturally Universal: Evidence from a Remote Culture

    Science.gov (United States)

    Gendron, Maria; Roberson, Debi; van der Vyver, Jacoba Marietta; Barrett, Lisa Feldman

    2014-01-01

    It is widely believed that certain emotions are universally recognized in facial expressions. Recent evidence indicates that Western perceptions (e.g., scowls as anger) depend on cues to US emotion concepts embedded in experiments. Since such cues are a standard feature of methods used in cross-cultural experiments, we hypothesized that evidence of universality depends on this conceptual context. In our study, participants from the US and the Himba ethnic group sorted images of posed facial expressions into piles by emotion type. Without cues to emotion concepts, Himba participants did not show the presumed "universal" pattern, whereas US participants produced a pattern with presumed universal features. With cues to emotion concepts, participants in both cultures produced sorts that were closer to the presumed "universal" pattern, although substantial cultural variation persisted. Our findings indicate that perceptions of emotion are not universal, but depend on cultural and conceptual contexts. PMID:24708506

  2. Understanding the interplay of cancer patients' instrumental concerns and emotions.

    Science.gov (United States)

    Brandes, Kim; van der Goot, Margot J; Smit, Edith G; van Weert, Julia C M; Linn, Annemiek J

    2017-05-01

    The objectives were: 1) to assess patients' descriptions of concerns, and 2) to inform a conceptual framework in which the impact of the nature of concerns on doctor-patient communication is specified. Six focus groups were conducted with 39 cancer patients and survivors. In these focus groups participants were asked to describe their concerns during and after their illness. Concerns were described as instrumental concerns (e.g., receiving insufficient information) and emotions (e.g., sadness). Patients frequently explained their concerns as an interplay of instrumental concerns and emotions. Examples of the interplay were "receiving incorrect information" and "frustration", and "difficulties with searching, finding and judging of information" and "fear". Instrumental concerns need to be taken into account in the operationalization of concerns in research. Based on the interplay, the conceptual framework suggests that patients can express instrumental concerns as emotions and emotions as instrumental concerns. Consequently, providers can respond with instrumental and emotional communication when patients express an interplay of concerns. The results of this study can be used to support providers in recognizing concerns that are expressed by patients in consultations. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Exploring Cultural Differences in the Recognition of the Self-Conscious Emotions.

    Directory of Open Access Journals (Sweden)

    Joanne M Chung

    Full Text Available Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little research has been conducted in Asia. Consequently the cross-cultural generalizability of self-conscious emotions has not been firmly established. Additionally, there is no research that examines cultural variability in the recognition of the self-conscious emotions. Cultural values and exposure to Western culture have been identified as contributors to variability in recognition rates for the basic emotions; we sought to examine this for the self-conscious emotions using the University of California, Davis Set of Emotion Expressions (UCDSEE). The present research examined recognition of the self-conscious emotion expressions in South Korean college students and found that recognition rates were very high for pride, low but above chance for shame, and near zero for embarrassment. To examine what might be underlying the recognition rates we found in South Korea, recognition of self-conscious emotions and several cultural values were examined in a U.S. college student sample of European Americans, Asian Americans, and Asian-born individuals. Emotion recognition rates were generally similar between the European Americans and Asian Americans, and higher than emotion recognition rates for Asian-born individuals. These differences were not explained by cultural values in an interpretable manner, suggesting that exposure to Western culture is a more important mediator than values.

  4. Exploring Cultural Differences in the Recognition of the Self-Conscious Emotions.

    Science.gov (United States)

    Chung, Joanne M; Robins, Richard W

    2015-01-01

    Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little research has been conducted in Asia. Consequently the cross-cultural generalizability of self-conscious emotions has not been firmly established. Additionally, there is no research that examines cultural variability in the recognition of the self-conscious emotions. Cultural values and exposure to Western culture have been identified as contributors to variability in recognition rates for the basic emotions; we sought to examine this for the self-conscious emotions using the University of California, Davis Set of Emotion Expressions (UCDSEE). The present research examined recognition of the self-conscious emotion expressions in South Korean college students and found that recognition rates were very high for pride, low but above chance for shame, and near zero for embarrassment. To examine what might be underlying the recognition rates we found in South Korea, recognition of self-conscious emotions and several cultural values were examined in a U.S. college student sample of European Americans, Asian Americans, and Asian-born individuals. Emotion recognition rates were generally similar between the European Americans and Asian Americans, and higher than emotion recognition rates for Asian-born individuals. These differences were not explained by cultural values in an interpretable manner, suggesting that exposure to Western culture is a more important mediator than values.

  5. Exploring Cultural Differences in the Recognition of the Self-Conscious Emotions

    Science.gov (United States)

    Chung, Joanne M.; Robins, Richard W.

    2015-01-01

    Recent research suggests that the self-conscious emotions of embarrassment, shame, and pride have distinct, nonverbal expressions that can be recognized in the United States at above-chance levels. However, few studies have examined the recognition of these emotions in other cultures, and little research has been conducted in Asia. Consequently the cross-cultural generalizability of self-conscious emotions has not been firmly established. Additionally, there is no research that examines cultural variability in the recognition of the self-conscious emotions. Cultural values and exposure to Western culture have been identified as contributors to variability in recognition rates for the basic emotions; we sought to examine this for the self-conscious emotions using the University of California, Davis Set of Emotion Expressions (UCDSEE). The present research examined recognition of the self-conscious emotion expressions in South Korean college students and found that recognition rates were very high for pride, low but above chance for shame, and near zero for embarrassment. To examine what might be underlying the recognition rates we found in South Korea, recognition of self-conscious emotions and several cultural values were examined in a U.S. college student sample of European Americans, Asian Americans, and Asian-born individuals. Emotion recognition rates were generally similar between the European Americans and Asian Americans, and higher than emotion recognition rates for Asian-born individuals. These differences were not explained by cultural values in an interpretable manner, suggesting that exposure to Western culture is a more important mediator than values. PMID:26309215

  6. Sensing Emotion in Voices: Negativity Bias and Gender Differences in a Validation Study of the Oxford Vocal ('OxVoc') Sounds Database

    OpenAIRE

    Young, Katherine S.; Parsons, Christine E.; LeBeau, Richard T.; Tabak, Benjamin A.; Sewart, Amy R.; Stein, Alan; Kringelbach, Morten L.; Craske, Michelle G.

    2016-01-01

    Emotional expressions are an essential element of human interactions. Recent work has increasingly recognized that emotional vocalizations can color and shape interactions between individuals. Here we present data on the psychometric properties of a recently developed database of authentic nonlinguistic emotional vocalizations from human adults and infants (the Oxford Vocal 'OxVoc' Sounds Database; Parsons, Young, Craske, Stein, & Kringelbach, 2014). In a large sample (n = 562), we demonstrat...

  7. Perspectives on the Pure-Tone Audiogram.

    Science.gov (United States)

    Musiek, Frank E; Shinn, Jennifer; Chermak, Gail D; Bamiou, Doris-Eva

    The pure-tone audiogram, though fundamental to audiology, presents limitations, especially in the case of central auditory involvement. Advances in auditory neuroscience underscore the considerably larger role of the central auditory nervous system (CANS) in hearing and related disorders. Given the availability of behavioral audiological tests and electrophysiological procedures that can provide better insights as to the function of the various components of the auditory system, this perspective piece reviews the limitations of the pure-tone audiogram and notes some of the advantages of other tests and procedures used in tandem with the pure-tone threshold measurement. To review and synthesize the literature regarding the utility and limitations of the pure-tone audiogram in determining dysfunction of peripheral sensory and neural systems, as well as the CANS, and to identify other tests and procedures that can supplement pure-tone thresholds and provide enhanced diagnostic insight, especially regarding problems of the central auditory system. A systematic review and synthesis of the literature. The authors independently searched and reviewed literature (journal articles, book chapters) pertaining to the limitations of the pure-tone audiogram. The pure-tone audiogram provides information as to hearing sensitivity across a selected frequency range. Normal or near-normal pure-tone thresholds sometimes are observed despite cochlear damage. There are a surprising number of patients with acoustic neuromas who have essentially normal pure-tone thresholds. In cases of central deafness, depressed pure-tone thresholds may not accurately reflect the status of the peripheral auditory system. Listening difficulties are seen in the presence of normal pure-tone thresholds. Suprathreshold procedures and a variety of other tests can provide information regarding other and often more central functions of the auditory system. The audiogram is a primary tool for determining type

  8. Personality styles in a non-clinical sample : The role of emotion dysregulation and impulsivity

    NARCIS (Netherlands)

    Velotti, P.; Garofalo, C.

    2015-01-01

    Theories of personality and personality disorders are increasingly considering the centrality of emotion regulation and its dimensions. Impulsivity, too, is recognized as a personality trait underlying diverse symptom presentations. Although research in this field has mainly regarded borderline

  9. Emotion Index of Cover Song Music Video Clips based on Facial Expression Recognition

    DEFF Research Database (Denmark)

    Kavallakis, George; Vidakis, Nikolaos; Triantafyllidis, Georgios

    2017-01-01

    This paper presents a scheme for creating an emotion index of cover song music video clips by recognizing and classifying facial expressions of the artist in the video. More specifically, it fuses effective and robust algorithms employed for expression recognition with a neural network system that uses the features extracted by the SIFT algorithm. We also support the need for this fusion of different expression recognition algorithms because of the way emotions are linked to facial expressions in music video clips.
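    The sketch below shows one hedged way to combine SIFT features with a neural network for expression classification, in the spirit of the record above. Mean-pooling the SIFT descriptors of each frame and the random placeholder frames and labels are simplifying assumptions, not the authors' scheme.

```python
# Hedged sketch: SIFT descriptors per frame -> mean pooling -> small neural network classifier.
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

sift = cv2.SIFT_create()
rng = np.random.default_rng(0)

def frame_descriptor(gray):
    """128-D mean SIFT descriptor of one grayscale frame (zeros if no keypoints)."""
    _, desc = sift.detectAndCompute(gray, None)
    return desc.mean(axis=0) if desc is not None else np.zeros(128)

# stand-ins for face crops from the music video (uint8 grayscale frames)
frames = [rng.integers(0, 256, size=(96, 96), dtype=np.uint8) for _ in range(10)]
labels = [0] * 5 + [1] * 5                      # toy labels: 0 = neutral, 1 = happy

X = np.vstack([frame_descriptor(f) for f in frames])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000).fit(X, labels)
print(clf.predict(X[:2]))
```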

  10. The Curvilinear Relationship between Age and Emotional Aperture: The Moderating Role of Agreeableness

    OpenAIRE

    Faber, Anna; Walter, Frank

    2017-01-01

    The capability to correctly recognize collective emotion expressions [i.e., emotional aperture (EA)] is crucial for effective social and work-related interactions. Yet, little remains known about the antecedents of this ability. The present study therefore aims to shed new light onto key aspects that may promote or diminish an individual’s EA. We examine the role of age for this ability in an online sample of 181 participants (with an age range of 18–72 years, located in Germany), and we inve...

  11. The contemptuous separation: Facial expressions of emotion and breakups in young adulthood.

    Science.gov (United States)

    Heshmati, Saeideh; Sbarra, David A; Mason, Ashley E

    2017-06-01

    The importance of studying specific and expressed emotions after a stressful life event is well known, yet few studies have moved beyond assessing self-reported emotional responses to a romantic breakup. This study examined associations between computer-recognized facial expressions and self-reported breakup-related distress among recently separated college-aged young adults ( N = 135; 37 men) on four visits across 9 weeks. Participants' facial expressions were coded using the Computer Expression Recognition Toolbox while participants spoke about their breakups. Of the seven expressed emotions studied, only Contempt showed a unique association with breakup-related distress over time. At baseline, greater Contempt was associated with less breakup-related distress; however, over time, greater Contempt was associated with greater breakup-related distress.

  12. Generalized pure Lovelock gravity

    Science.gov (United States)

    Concha, Patrick; Rodríguez, Evelyn

    2017-11-01

    We present a generalization of the n-dimensional (pure) Lovelock Gravity theory based on an enlarged Lorentz symmetry. In particular, we propose an alternative way to introduce a cosmological term. Interestingly, we show that the usual pure Lovelock gravity is recovered in a matter-free configuration. The five and six-dimensional cases are explicitly studied.
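    For context, a commonly quoted form of the pure Lovelock Lagrangian of order N, written with the generalized Kronecker delta, is shown below; this is a standard expression from the Lovelock literature and is not taken from the record above.

```latex
\mathcal{L}_N \;=\; \frac{1}{2^{N}}\,
\delta^{\mu_1\nu_1\cdots\mu_N\nu_N}_{\alpha_1\beta_1\cdots\alpha_N\beta_N}\,
R^{\alpha_1\beta_1}{}_{\mu_1\nu_1}\cdots R^{\alpha_N\beta_N}{}_{\mu_N\nu_N}
```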

  13. A DESIGN FRAMEWORK FOR HUMAN EMOTION RECOGNITION USING ELECTROCARDIOGRAM AND SKIN CONDUCTANCE RESPONSE SIGNALS

    Directory of Open Access Journals (Sweden)

    KHAIRUN NISA’ MINHAD

    2017-11-01

    Full Text Available Identification of the human emotional state while driving a vehicle can help in understanding human behaviour. Based on this identification, a response system can be developed to mitigate the impact that may result from behavioural changes. However, how emotions adapt to the environment is, in most scenarios, subjective to the individual's perspective. Many factors, mainly culture and geography, gender, age, lifestyle and history, and level of education and professional status, can affect the detection of human affective states. This work investigated sympathetic responses to human emotions using electrocardiography (ECG) and skin conductance response (SCR) signals recorded simultaneously. It aimed to recognize the ECG and SCR patterns of the investigated emotions measured with the selected sensors. A pilot study was conducted to evaluate the proposed framework. Initial results demonstrated the importance of the suitability of the stimuli used to evoke the emotions, and the strong potential of ECG and SCR signals for use in automotive real-time emotion recognition systems.
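    As an illustration of the kind of features such a framework relies on, the sketch below extracts heart rate from R-peaks of a synthetic ECG trace and counts skin conductance responses in a synthetic conductance trace; the signals, sampling rate and thresholds are assumptions, not the study's recordings.

```python
# Minimal sketch: heart rate from ECG R-peaks and SCR event features from skin conductance.
import numpy as np
from scipy.signal import find_peaks

fs = 250.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# synthetic stand-ins for a recorded ECG trace and a skin conductance trace
ecg = np.sin(2 * np.pi * 1.2 * t) ** 15      # sharp positive peaks at ~1.2 Hz
scr = 2 + 0.3 * np.sin(2 * np.pi * 0.05 * t) + 0.05 * rng.normal(size=t.size)

r_peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
heart_rate = 60.0 * fs / np.mean(np.diff(r_peaks))        # beats per minute

scr_peaks, props = find_peaks(scr, prominence=0.05)
mean_amp = props["prominences"].mean() if scr_peaks.size else 0.0
print(f"HR ~ {heart_rate:.1f} bpm, {scr_peaks.size} SCR-like events, mean amplitude {mean_amp:.3f} (a.u.)")
```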

  14. Strength Is in Numbers: Can Concordant Artificial Listeners Improve Prediction of Emotion from Speech?

    Science.gov (United States)

    Martinelli, Eugenio; Mencattini, Arianna; Daprati, Elena; Di Natale, Corrado

    2016-01-01

    Humans can communicate their emotions by modulating facial expressions or the tone of their voice. Although numerous applications exist that enable machines to read facial emotions and recognize the content of verbal messages, methods for speech emotion recognition are still in their infancy. Yet, fast and reliable applications for emotion recognition are the obvious next step for present 'intelligent personal assistants', and may have countless applications in diagnostics, rehabilitation and research. Taking inspiration from the dynamics of human group decision-making, we devised a novel speech emotion recognition system that applies, for the first time, a semi-supervised prediction model based on consensus. Three tests were carried out to compare this algorithm with traditional approaches. Labeling performance on a public database of spontaneous speech is reported. The novel system appears to be fast, robust and less computationally demanding than traditional methods, allowing for easier implementation in portable voice-analyzers (as used in rehabilitation, research, industry, etc.) and for applications in the research domain (such as real-time pairing of stimuli to participants' emotional state, selective/differential data collection based on emotional content, etc.).
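    The sketch below illustrates the consensus idea in a generic semi-supervised form: several diverse "artificial listeners" label unlabeled samples, and only samples on which all of them agree are added to the training set before retraining. This is an illustration under simplified assumptions, not the authors' algorithm.

```python
# Hedged sketch: consensus-based pseudo-labeling with three diverse classifiers.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
labeled, unlabeled = np.arange(40), np.arange(40, 300)     # small labeled pool

listeners = [LogisticRegression(max_iter=1000), SVC(), DecisionTreeClassifier(random_state=0)]
preds = np.array([clf.fit(X[labeled], y[labeled]).predict(X[unlabeled]) for clf in listeners])

consensus = (preds == preds[0]).all(axis=0)                # all listeners agree
X_new = np.vstack([X[labeled], X[unlabeled][consensus]])
y_new = np.concatenate([y[labeled], preds[0][consensus]])

final = LogisticRegression(max_iter=1000).fit(X_new, y_new)
print(f"{consensus.sum()} consensus samples added; final training size = {len(y_new)}")
```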

  15. A Sex-Shop’s Advertisement as a Trigger of Reflection on the Emotional Perception of Motor Disability

    Directory of Open Access Journals (Sweden)

    Małgorzata Lewartowska-Zychowicz

    2016-12-01

    Full Text Available The article discusses a study of physical disability triggered by a sexshop’s controversial window-display which included typical items related to disability and illness. The study’s aim is to recognize the specificity of emotional perception, as reported by respondents with and without motor disability, against the background of cultural patterns of emotionality and contemporary discourses on people with disabilities.

  16. Emotion Recognition in Frontotemporal Dementia and Alzheimer's Disease: A New Film-Based Assessment

    Science.gov (United States)

    Goodkind, Madeleine S.; Sturm, Virginia E.; Ascher, Elizabeth A.; Shdo, Suzanne M.; Miller, Bruce L.; Rankin, Katherine P.; Levenson, Robert W.

    2015-01-01

    Deficits in recognizing others' emotions are reported in many psychiatric and neurological disorders, including autism, schizophrenia, behavioral variant frontotemporal dementia (bvFTD) and Alzheimer's disease (AD). Most previous emotion recognition studies have required participants to identify emotional expressions in photographs. This type of assessment differs from real-world emotion recognition in important ways: Images are static rather than dynamic, include only 1 modality of emotional information (i.e., visual information), and are presented absent a social context. Additionally, existing emotion recognition batteries typically include multiple negative emotions, but only 1 positive emotion (i.e., happiness) and no self-conscious emotions (e.g., embarrassment). We present initial results using a new task for assessing emotion recognition that was developed to address these limitations. In this task, respondents view a series of short film clips and are asked to identify the main characters' emotions. The task assesses multiple negative, positive, and self-conscious emotions based on information that is multimodal, dynamic, and socially embedded. We evaluate this approach in a sample of patients with bvFTD, AD, and normal controls. Results indicate that patients with bvFTD have emotion recognition deficits in all 3 categories of emotion compared to the other groups. These deficits were especially pronounced for negative and self-conscious emotions. Emotion recognition in this sample of patients with AD was indistinguishable from controls. These findings underscore the utility of this approach to assessing emotion recognition and suggest that previous findings that recognition of positive emotion was preserved in dementia patients may have resulted from the limited sampling of positive emotion in traditional tests. PMID:26010574

  17. Emotion recognition in frontotemporal dementia and Alzheimer's disease: A new film-based assessment.

    Science.gov (United States)

    Goodkind, Madeleine S; Sturm, Virginia E; Ascher, Elizabeth A; Shdo, Suzanne M; Miller, Bruce L; Rankin, Katherine P; Levenson, Robert W

    2015-08-01

    Deficits in recognizing others' emotions are reported in many psychiatric and neurological disorders, including autism, schizophrenia, behavioral variant frontotemporal dementia (bvFTD) and Alzheimer's disease (AD). Most previous emotion recognition studies have required participants to identify emotional expressions in photographs. This type of assessment differs from real-world emotion recognition in important ways: Images are static rather than dynamic, include only 1 modality of emotional information (i.e., visual information), and are presented absent a social context. Additionally, existing emotion recognition batteries typically include multiple negative emotions, but only 1 positive emotion (i.e., happiness) and no self-conscious emotions (e.g., embarrassment). We present initial results using a new task for assessing emotion recognition that was developed to address these limitations. In this task, respondents view a series of short film clips and are asked to identify the main characters' emotions. The task assesses multiple negative, positive, and self-conscious emotions based on information that is multimodal, dynamic, and socially embedded. We evaluate this approach in a sample of patients with bvFTD, AD, and normal controls. Results indicate that patients with bvFTD have emotion recognition deficits in all 3 categories of emotion compared to the other groups. These deficits were especially pronounced for negative and self-conscious emotions. Emotion recognition in this sample of patients with AD was indistinguishable from controls. These findings underscore the utility of this approach to assessing emotion recognition and suggest that previous findings that recognition of positive emotion was preserved in dementia patients may have resulted from the limited sampling of positive emotion in traditional tests. (c) 2015 APA, all rights reserved).

  18. Generalized pure Lovelock gravity

    Directory of Open Access Journals (Sweden)

    Patrick Concha

    2017-11-01

    Full Text Available We present a generalization of the n-dimensional (pure) Lovelock Gravity theory based on an enlarged Lorentz symmetry. In particular, we propose an alternative way to introduce a cosmological term. Interestingly, we show that the usual pure Lovelock gravity is recovered in a matter-free configuration. The five and six-dimensional cases are explicitly studied.

  19. Radiosensitivity of Listeria monocytogenes in pure culture and in contaminated frozen shrimps stored at -18 deg C

    International Nuclear Information System (INIS)

    Rubio M V, T.; Espinoza, J.; L Sanchez, M.V.

    1997-01-01

    Full text: Listeria monocytogenes has been recognized for a number of years as a foodborne pathogenic bacterium. Ionizing radiation is a new technology and an excellent method to eliminate this microorganism from frozen food. The sensitivity to irradiation of Listeria monocytogenes serotypes 01 and 04, in pure culture and in frozen shrimps, was investigated. The D10 values were 0.26 and 0.37 kGy in pure culture and frozen shrimps, respectively. The D10 values found were similar to those reported in the literature under similar conditions. Doses of 6 kGy were enough to eliminate a contamination of 10^6-10^7 cfu/ml of L. monocytogenes in frozen shrimps stored for 200 days at -18 deg C
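    As a quick plausibility check of those figures (an illustrative calculation, not taken from the record), the number of decimal reductions delivered by 6 kGy at the frozen-shrimp D10 value is

```latex
\frac{6\ \mathrm{kGy}}{0.37\ \mathrm{kGy\ per\ log_{10}\ cycle}} \;\approx\; 16\ \log_{10}\ \text{reductions}
```

    which comfortably exceeds the 6-7 log10 reduction needed to eliminate a contamination of 10^6-10^7 cfu/ml.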

  20. ABOUT EMOTIONAL INTELLIGENCE AND LEADERSHIP

    Directory of Open Access Journals (Sweden)

    RADULESCU Corina Michaela

    2012-12-01

    Full Text Available This article belongs, by its topic of study, to the field of management and examines the important role of emotional intelligence in management and leadership. The importance of this problem stems from the fact that, in Romanian management, this concept (being of a psycho-management nature) is poorly understood. Emotional intelligence remains a highly publicized concept in the West, subject to many controversies among recognized experts in various fields: organizational management, leadership, psychology, and sociology. The aim of the article is to highlight the fact that few management or recruitment consulting firms in Romania support emotional intelligence development programs, and even fewer organizations realize the impact it has on running a business. Since the first publication of Daniel Goleman's book "Emotional Intelligence" in 1995, EQ has become one of the most debated concepts in U.S. management. The article calls for a new business climate that ensures professional excellence, and it is intended as a guide to cultivating emotional intelligence in individuals, groups and organizations through leadership, while seeking to validate its scientific basis. Because we live in a time when future projects depend increasingly on self-control and on the art of maintaining interpersonal relationships, such guidelines are necessary to meet future challenges. The authors' contribution brings to the forefront the debate about management, behaviour management, the concept of emotional intelligence, and the importance of understanding its substance and the manner in which the management process has to be conducted in order to achieve positive results in an organization as a system. Businessmen with a forward-looking mind will encourage and support such education in business, not only to improve the quality of management in their organization but also for the

  1. Perception of emotion in facial stimuli: The interaction of ADRA2A and COMT genotypes, and sex.

    Science.gov (United States)

    Tamm, Gerly; Kreegipuu, Kairi; Harro, Jaanus

    2016-01-04

    Emotional facial stimuli are important social signals that are essential to perceive and recognize in order to make appropriate decisions and responses in everyday communication. The ability to voluntarily guide attention to perceive and recognize emotions, and to react to them, varies largely across individuals and has a strong genetic component (Friedman et al., 2008). Two key genetic variants of the catecholamine system that have been related to emotion perception and attention are the catechol-O-methyl transferase genetic variant (COMT Val158Met) and the α2A-receptor gene promoter polymorphism (ADRA2A C-1291G), respectively. So far, the interaction of the two with sex in emotion perception has not been studied. A multilevel modeling approach was applied to study how COMT Val158Met, ADRA2A C-1291G and sex are associated with measures of emotion perception in a large sample of young adults. Participants (n=506) completed emotion recognition and behavioral emotion detection tasks. It was found that the COMT Val158Met genotype in combination with ADRA2A C-1291G and sex predicts emotion detection, and perception of valence and arousal. In simple visual detection, the ADRA2A C-1291G G-allele leads to slower detection of a highly arousing face (scheming), an effect modulated by COMT Val158Met, with each additional Met-allele and male sex predicting faster responses. The combination of G-allele, Met-allele and male sex also predicts higher perceived negativity in sad faces. No effects of C-1291G, Val158Met, or sex were found on verbal emotion recognition. Applying these findings to study the interplay between catechol-O-methyl transferase activity and α2A-receptors in emotion perception disorders (such as ADHD, autism and schizophrenia) in men and women would be the next step towards understanding individual differences in emotion perception. Copyright © 2015 Elsevier Inc. All rights reserved.
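
    A minimal sketch of the kind of multilevel (mixed-effects) model described above, assuming trial-level data in long format with hypothetical column names (rt, comt, adra2a, sex, subject); this illustrates the general approach rather than the authors' exact specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per detection trial, repeated within subject.
df = pd.read_csv("emotion_detection_trials.csv")  # columns assumed: rt, comt, adra2a, sex, subject

# Random intercept per subject; fixed effects for the genotype-by-genotype-by-sex interaction.
model = smf.mixedlm("rt ~ C(comt) * C(adra2a) * C(sex)", data=df, groups=df["subject"])
result = model.fit()
print(result.summary())
```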

  2. The influence of emotion on lexical processing: insights from RT distributional analysis.

    Science.gov (United States)

    Yap, Melvin J; Seow, Cui Shan

    2014-04-01

    In two lexical decision experiments, the present study was designed to examine emotional valence effects on visual lexical decision (standard and go/no-go) performance, using traditional analyses of means and distributional analyses of response times. Consistent with an earlier study by Kousta, Vinson, and Vigliocco (Cognition 112:473-481, 2009), we found that emotional words (both negative and positive) were responded to faster than neutral words. Finer-grained distributional analyses further revealed that the facilitation afforded by valence was reflected by a combination of distributional shifting and an increase in the slow tail of the distribution. This suggests that emotional valence effects in lexical decision are unlikely to be entirely mediated by early, preconscious processes, which are associated with pure distributional shifting. Instead, our results suggest a dissociation between early preconscious processes and a later, more task-specific effect that is driven by feedback from semantically rich representations.
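
    Distributional shifting versus a change in the slow tail is commonly quantified by fitting an ex-Gaussian to the response-time distribution. A minimal sketch with scipy (the RT file is hypothetical; the shift corresponds to mu, the slow tail to tau):

```python
import numpy as np
from scipy.stats import exponnorm

rts = np.loadtxt("lexical_decision_rts.txt")  # hypothetical vector of response times (ms)

# scipy parameterizes the ex-Gaussian as exponnorm(K, loc, scale),
# where loc = mu, scale = sigma, and the exponential tail tau = K * scale.
K, loc, scale = exponnorm.fit(rts)
mu, sigma, tau = loc, scale, K * scale
print(f"mu (shift) = {mu:.1f} ms, sigma = {sigma:.1f} ms, tau (slow tail) = {tau:.1f} ms")
```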

  3. Salience of the lambs: a test of the saliency map hypothesis with pictures of emotive objects.

    Science.gov (United States)

    Humphrey, Katherine; Underwood, Geoffrey; Lambert, Tony

    2012-01-25

    Humans have an ability to rapidly detect emotive stimuli. However, many emotional objects in a scene are also highly visually salient, which raises the question of how dependent the effects of emotionality are on visual saliency and whether the presence of an emotional object changes the power of a more visually salient object in attracting attention. Participants were shown a set of positive, negative, and neutral pictures and completed recall and recognition memory tests. Eye movement data revealed that visual saliency does influence eye movements, but the effect is reliably reduced when an emotional object is present. Pictures containing negative objects were recognized more accurately and recalled in greater detail, and participants fixated more on negative objects than positive or neutral ones. Initial fixations were more likely to be on emotional objects than more visually salient neutral ones, suggesting that the processing of emotional features occurs at a very early stage of perception.

  4. Musical notation reading in pure alexia

    DEFF Research Database (Denmark)

    Starrfelt, Randi; Wong, Yetta K.

    2017-01-01

    Pure alexia (PA) is an acquired reading disorder following lesions to left ventral temporo-occipital cortex. Patients with PA read slowly but correctly, and show an abnormal effect of word length on RTs. However, it is unclear how pure alexia may affect musical notation reading. We report a pure...

  5. [Recognition of facial expressions of emotions by 3-year-olds depending on sleep and risk of depression].

    Science.gov (United States)

    Bat-Pitault, F; Da Fonseca, D; Flori, S; Porcher-Guinet, V; Stagnara, C; Patural, H; Franco, P; Deruelle, C

    2017-10-01

    The emotional process is characterized by a negative bias in depression, so it was legitimate to establish whether the same is true in very young at-risk children. Furthermore, sleep, which has also been proposed as a marker of depression risk, is closely linked with emotions in adults and adolescents. We therefore wanted, first, to better describe the characteristics of emotion recognition by 3-year-olds and its links with sleep, and second, to observe whether an emotion recognition pattern indicating vulnerability to depression can already be found at this young age. We studied, in 133 children aged 36 months from the AuBE cohort, the number of correct answers on a task of recognition of facial emotions (joy, anger and sadness). Cognitive functions were also assessed with the WPPSI-III at 3 years of age, and the different sleep parameters (time of lights off and lights on, sleep times, difficulty going to sleep and number of parental awakenings per night) were described by questionnaires filled out by mothers at 6, 12, 18, 24 and 36 months after birth. Of these 133 children, the 21 children whose mothers had at least one history of depression (13 boys) formed the high-risk group, and the 19 children (8 boys) born to women with no history of depression formed the low-risk (control) group. Overall, the 133 children at 36 months recognized happiness significantly better than the other emotions (P=0.000), with global recognition higher in girls (M=8.8) than in boys (M=7.8) (P=0.013) and a positive correlation between global recognition ability and verbal IQ (P=0.000). Children who had less daytime sleep at 18 months and those who slept less at 24 months showed better recognition of sadness (P=0.043 and P=0.042); those with difficulties at bedtime at 18 months recognized happiness less well (P=0.043), and those who woke earlier at 24 months had better global recognition of emotions (P=0.015). Finally, the boys in the high-risk group recognized sadness better than the boys in the control group (P=0

  6. Unforgettable film music: The role of emotion in episodic long-term memory for music

    Science.gov (United States)

    Eschrich, Susann; Münte, Thomas F; Altenmüller, Eckart O

    2008-01-01

    Background Specific pieces of music can elicit strong emotions in listeners and, possibly in connection with these emotions, can be remembered even years later. However, episodic memory for emotional music compared with less emotional music has not yet been examined. We investigated whether emotional music is remembered better than less emotional music. Also, we examined the influence of musical structure on memory performance. Results Recognition of 40 musical excerpts was investigated as a function of arousal, valence, and emotional intensity ratings of the music. In the first session the participants judged valence and arousal of the musical pieces. One week later, participants listened to the 40 old and 40 new musical excerpts randomly interspersed and were asked to make an old/new decision as well as to indicate arousal and valence of the pieces. Musical pieces that were rated as very positive were recognized significantly better. Conclusion Musical excerpts rated as very positive are remembered better. Valence seems to be an important modulator of episodic long-term memory for music. Evidently, strong emotions related to the musical experience facilitate memory formation and retrieval. PMID:18505596

  7. Identification of four class emotion from Indonesian spoken language using acoustic and lexical features

    Science.gov (United States)

    Kasyidi, Fatan; Puji Lestari, Dessi

    2018-03-01

    One of the important aspects of human-to-human communication is understanding the emotion of each party. Recently, interaction between humans and computers has continued to develop, especially affective interaction, where emotion recognition is one of the important components. This paper presents our extended work on emotion recognition in Indonesian spoken language to identify four main classes of emotion: Happy, Sad, Angry, and Contentment, using a combination of acoustic/prosodic features and lexical features. We constructed an emotional speech corpus from Indonesian television talk shows, where the situations are as close as possible to natural ones. After constructing the emotional speech corpus, the acoustic/prosodic and lexical features were extracted to train the emotion model. We employed several machine learning algorithms, such as Support Vector Machine (SVM), Naive Bayes, and Random Forest, to obtain the best model. The experimental results on the test data show that the best model achieves an F-measure of 0.447 using only the acoustic/prosodic features and an F-measure of 0.488 using both acoustic/prosodic and lexical features to recognize the four emotion classes with an SVM using an RBF kernel.
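
    A minimal sketch of the kind of classifier described above: an RBF-kernel SVM trained on concatenated acoustic/prosodic and lexical feature vectors and scored with a macro-averaged F-measure (feature extraction is assumed to have been done already; the file names are hypothetical):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import f1_score

# Hypothetical precomputed features: one row per utterance.
acoustic = np.load("acoustic_features.npy")   # e.g. pitch, energy, duration statistics
lexical = np.load("lexical_features.npy")     # e.g. bag-of-words / emotion-lexicon counts
labels = np.load("labels.npy")                # 0=Happy, 1=Sad, 2=Angry, 3=Contentment

X = np.hstack([acoustic, lexical])            # combined feature representation
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2, stratify=labels, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("macro F-measure:", f1_score(y_te, clf.predict(X_te), average="macro"))
```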

  8. Unforgettable film music: The role of emotion in episodic long-term memory for music

    Directory of Open Access Journals (Sweden)

    Altenmüller Eckart O

    2008-05-01

    Full Text Available Abstract Background Specific pieces of music can elicit strong emotions in listeners and, possibly in connection with these emotions, can be remembered even years later. However, episodic memory for emotional music compared with less emotional music has not yet been examined. We investigated whether emotional music is remembered better than less emotional music. Also, we examined the influence of musical structure on memory performance. Results Recognition of 40 musical excerpts was investigated as a function of arousal, valence, and emotional intensity ratings of the music. In the first session the participants judged valence and arousal of the musical pieces. One week later, participants listened to the 40 old and 40 new musical excerpts randomly interspersed and were asked to make an old/new decision as well as to indicate arousal and valence of the pieces. Musical pieces that were rated as very positive were recognized significantly better. Conclusion Musical excerpts rated as very positive are remembered better. Valence seems to be an important modulator of episodic long-term memory for music. Evidently, strong emotions related to the musical experience facilitate memory formation and retrieval.

  9. [Beyond depression: assessing personality disorders, alexithymia and socio-emotional alienation in patients with HIV infection].

    Science.gov (United States)

    Masiello, Addolorata; De Guglielmo, Carmen; Giglio, Sergio; Acone, Nicola

    2014-09-01

    HIV infection is commonly associated with emotional and cognitive disorders that have both organic causes (related to the virus itself) and non-organic causes (emotional stress resulting from the HIV diagnosis, social stigma and continued risk behaviour such as alcohol or drug abuse). Most of the literature has focused on depressive disorder, the most common mental disorder in the HIV population. In our analysis we evaluated the presence of personality disorders and alexithymia in a group of patients seropositive for HIV using appropriate psychological tests. Our data revealed a close relationship between socio-emotional alienation, distorted body perception and difficulty in relating to others, who are perceived as threatening and judgmental; this brings us back to the social stigma that modifies the emotional communication of HIV patients. The illness is experienced as something foreign that modifies the body, imprisons the emotions and cannot be controlled. Such personality alterations block emotional communication, thereby fostering alexithymia.

  10. Nurses' recognition and registration of depression, anxiety and diabetes-specific emotional problems in outpatients with diabetes mellitus

    DEFF Research Database (Denmark)

    Pouwer, Francois; Beekman, Aartjan T F; Lubach, Caroline

    2006-01-01

    OBJECTIVE: The aim of this study was to investigate how often emotional problems were recognized and registered by diabetes nurses. METHODS: We studied medical charts and questionnaire data of 112 diabetes patients. The Hospital Anxiety and Depression Scale and the Problem Areas in Diabetes survey were used to measure anxiety, depression and diabetes-specific emotional distress. RESULTS: In patients with moderate to severe levels of anxiety or depression, the presence of an emotional problem was recorded in the medical chart in 20-25% of the cases. The registration rate of diabetes-specific emotional distress was also found to be low, ranging from 0% (treatment-related problems) to 29% (diabetes-related emotional problems). CONCLUSION: Registration rates of emotional problems by diabetes nurses were found to be low, but quite similar to the detection rates of physicians and nurses in studies

  11. A Somatic Movement Approach to Fostering Emotional Resiliency through Laban Movement Analysis

    Directory of Open Access Journals (Sweden)

    Rachelle P. Tsachor

    2017-09-01

    Full Text Available Although movement has long been recognized as expressing emotion and as an agent of change for emotional state, there has been a dearth of scientific evidence specifying which aspects of movement influence specific emotions. The recent identification of clusters of Laban movement components which elicit and enhance the basic emotions of anger, fear, sadness and happiness indicates which types of movement can affect these emotions (Shafir et al., 2016), but not how best to apply this knowledge. This perspective paper lays out a conceptual groundwork for how to effectively use these new findings to support emotional resiliency through voluntary choice of one's posture and movements. We suggest that three theoretical principles from Laban Movement Analysis (LMA) can guide the gradual change in movement components in one's daily movements to somatically support a shift in affective state: (A) introduce new movement components in developmental order; (B) use LMA affinities among components to guide the expansion of expressive movement range; and (C) sequence change among components based on Laban's Space Harmony theory to support the gradual integration of that new range. The methods postulated in this article have the potential to foster resiliency and provide resources for self-efficacy by expanding our capacity to adapt emotionally to challenges through modulating our movement responses.

  12. Real-time Physiological Emotion Detection Mechanisms: Effects of Exercise and Affect Intensity.

    Science.gov (United States)

    Leon, E; Clarke, G; Sepulveda, F; Callaghan, V

    2005-01-01

    The development of systems capable of recognizing and categorising emotions is of interest to researchers in various scientific areas including artificial intelligence. The traditional notion that emotions and rationality are two separate realms has gradually been challenged. The work of neurologists has shown the strong relationship between emotional episodes and the way humans think and act. Furthermore, emotions not only regulate human decisions but could also contribute to a more satisfactory response to the environment, i.e., faster and more precise actions. In this paper an analysis of physiological signals employed in real-time emotion detection is presented in the context of Intelligent Inhabited Environments (IIE). Two studies were performed to investigate whether physical exertion has a significant effect on bodily signals stemming from emotional episodes with subjects having various degrees of affect intensity: 1) a statistical analysis using the Wilcoxon Test, and 2) a cluster analysis using the Davies-Bouldin Index. Preliminary results demonstrated that the heart rate and skin resistance consistently showed similar changes regardless of the physical stimuli while blood volume pressure did not show a significant change. It was also found that neither physical stress nor affect intensity played a role in the separation of neutral and non-neutral emotional states.
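
    A minimal sketch of the two analyses named above: a Wilcoxon signed-rank test comparing a physiological signal across conditions, and a Davies-Bouldin index for the separability of emotional-state clusters (the arrays and labels are hypothetical placeholders):

```python
import numpy as np
from scipy.stats import wilcoxon
from sklearn.metrics import davies_bouldin_score

# Hypothetical per-subject mean heart rate in two conditions (rest vs. after exercise).
hr_rest = np.load("hr_rest.npy")
hr_exercise = np.load("hr_exercise.npy")
stat, p = wilcoxon(hr_rest, hr_exercise)
print(f"Wilcoxon signed-rank: W={stat:.1f}, p={p:.3f}")

# Hypothetical feature matrix (heart rate, skin resistance, blood volume pressure),
# one row per sample, with a label per sample (0 = neutral, 1 = non-neutral emotion).
features = np.load("physio_features.npy")
labels = np.load("emotion_labels.npy")
print("Davies-Bouldin index:", davies_bouldin_score(features, labels))  # lower = better separated
```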

  13. Negative Bias in the Perception and Memory of Emotional Information in Alzheimer Disease.

    Science.gov (United States)

    Maria, Gomez-Gallego; Juan, Gomez-Garcia

    2017-05-01

    There is some controversy about the ability of patients with Alzheimer disease (AD) to experience and remember emotional stimuli. This study aims to assess the emotional experience of patients with AD and the existence of emotional enhancement of memory. We also investigated the influence of affective state on these processes. Sixty pictures from the International Affective Picture System were administered to 106 participants (72 patients with AD and 54 controls). Participants performed immediate free recall and recognition tasks. The Positive and Negative Affect Schedule was used to assess the participants' current affect. Patients identified the valence of unpleasant pictures better than that of other pictures and experienced them as more arousing. Patients and controls recalled and recognized a higher number of emotional pictures than of neutral ones. Patients discriminated the unpleasant pictures better. A mood-congruent effect was observed on emotional experience but not on memory. Positive affect was associated with better immediate recall and with a more liberal response bias. Patients with AD can identify the emotional content of stimuli, especially unpleasant ones, and the emotional enhancement of memory is preserved. Affective state does not explain the differences in the processing and memory of emotional items between patients and controls.

  14. Using dramatic role-play to develop emotional aptitude

    Directory of Open Access Journals (Sweden)

    Russell Dinapoli

    2009-12-01

    Full Text Available As university educators, we need to prepare students for the transition from the information age to what Daniel H. Pink (2005) calls the conceptual age, which is governed by artistry, empathy and emotion, by including in the curricula activities that stimulate both hemispheres of the brain. This can be done by promoting activities that energize what Daniel Goleman (1995) refers to as emotional intelligence; moreover, as Paul Ekman (2003) suggests, the ability to detect feelings improves communication. Recognizing the need to include in the curricula procedures that help develop students' right-brain aptitudes and enhance their communication skills, I have endeavoured to introduce dramatic scene study as a sustained activity in my English for Specific Purposes courses at the Universidad de Valencia. My aim was to energize the students' creative and emotional aptitudes, as well as to dynamize effective teamwork. This article argues that dramatic role-play, based on scripted scene study and related improvisational activities, is one way of achieving this.

  15. Emotion models for textual emotion classification

    Science.gov (United States)

    Bruna, O.; Avetisyan, H.; Holub, J.

    2016-11-01

    This paper deals with textual emotion classification, which has gained attention in recent years. Emotion classification is used in user experience, product evaluation, national security, and tutoring applications. It attempts to detect the emotional content in the input text and, based on different approaches, establish what kind of emotional content is present, if any. Textual emotion classification is the most difficult to handle, since it relies mainly on linguistic resources and it introduces many challenges to the assignment of text to an emotion represented by a proper model. A crucial part of each emotion detector is the emotion model. The focus of this paper is to introduce the emotion models used for classification. Categorical and dimensional models of emotion are explained and some more advanced approaches are mentioned.
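
    A minimal illustration of the two model families mentioned above: a categorical label set versus a dimensional valence-arousal representation (the coordinates below are illustrative placeholders, not values from the paper):

```python
# Categorical model: emotions are a discrete label set.
CATEGORICAL_LABELS = ["joy", "sadness", "anger", "fear", "neutral"]

# Dimensional model: each emotion is a point in a valence-arousal plane.
# (Illustrative coordinates in [-1, 1]; real lexicons provide empirical values.)
VALENCE_AROUSAL = {
    "joy":     ( 0.8,  0.5),
    "sadness": (-0.7, -0.4),
    "anger":   (-0.6,  0.7),
    "fear":    (-0.8,  0.6),
    "neutral": ( 0.0,  0.0),
}

def nearest_category(valence: float, arousal: float) -> str:
    """Map a dimensional prediction back onto the closest categorical label."""
    return min(VALENCE_AROUSAL,
               key=lambda e: (VALENCE_AROUSAL[e][0] - valence) ** 2
                           + (VALENCE_AROUSAL[e][1] - arousal) ** 2)

print(nearest_category(0.6, 0.4))  # -> "joy"
```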

  16. Virtual faces expressing emotions: an initial concomitant and construct validity study.

    Science.gov (United States)

    Joyal, Christian C; Jacob, Laurence; Cigna, Marie-Hélène; Guay, Jean-Pierre; Renaud, Patrice

    2014-01-01

    Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities for both fundamental and clinical research. For instance, virtual faces allow real-time human-computer retroactions between physiological measures and the virtual agent. The goal of this study was an initial assessment of the concomitant and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomatic major and corrugator supercilii muscles), and regional gaze fixation latencies (eye and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Emotions expressed by the two sets of stimuli were recognized similarly well by men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times on eye regions in male and female participants. Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-Computer Interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

  17. How group-based emotions are shaped by collective emotions: evidence for emotional transfer and emotional burden.

    Science.gov (United States)

    Goldenberg, Amit; Saguy, Tamar; Halperin, Eran

    2014-10-01

    Extensive research has established the pivotal role that group-based emotions play in shaping intergroup processes. The underlying implicit assumption in previous work has been that these emotions reflect what the rest of the group feels (i.e., collective emotions). However, one can experience an emotion in the name of her or his group, which is inconsistent with what the collective feels. The current research investigated this phenomenon of emotional nonconformity. Particularly, we proposed that when a certain emotional reaction is perceived as appropriate, but the collective is perceived as not experiencing this emotion, people would experience stronger levels of group-based emotion, placing their emotional experience farther away from that of the collective. We provided evidence for this process across 2 different emotions: group-based guilt and group-based anger (Studies 1 and 2) and across different intergroup contexts (Israeli-Palestinian relations in Israel, and Black-White relations in the United States). In Studies 3 and 4, we demonstrate that this process is moderated by the perceived appropriateness of the collective emotional response. Studies 4 and 5 further provided evidence for the mechanisms underlying this effect, pointing to a process of emotional burden (i.e., feeling responsible for carrying the emotion in the name of the group) and of emotional transfer (i.e., transferring negative feelings one has toward the ingroup, toward the event itself). This work brings to light processes that were yet to be studied regarding the relationship between group members, their perception of their group, and the emotional processes that connect them. 2014 APA, all rights reserved

  18. Basic and complex emotion recognition in children with autism: cross-cultural findings.

    Science.gov (United States)

    Fridenson-Hayo, Shimrit; Berggren, Steve; Lassalle, Amandine; Tal, Shahar; Pigat, Delia; Bölte, Sven; Baron-Cohen, Simon; Golan, Ofer

    2016-01-01

    Children with autism spectrum conditions (ASC) have emotion recognition deficits when tested in different expression modalities (face, voice, body). However, these findings usually focus on basic emotions, using one or two expression modalities. In addition, cultural similarities and differences in emotion recognition patterns in children with ASC have not been explored before. The current study examined the similarities and differences in the recognition of basic and complex emotions by children with ASC and typically developing (TD) controls across three cultures: Israel, Britain, and Sweden. Fifty-five children with high-functioning ASC, aged 5-9, were compared to 58 TD children. On each site, groups were matched on age, sex, and IQ. Children were tested using four tasks, examining recognition of basic and complex emotions from voice recordings, videos of facial and bodily expressions, and emotional video scenarios including all modalities in context. Compared to their TD peers, children with ASC showed emotion recognition deficits in both basic and complex emotions on all three modalities and their integration in context. Complex emotions were harder to recognize, compared to basic emotions for the entire sample. Cross-cultural agreement was found for all major findings, with minor deviations on the face and body tasks. Our findings highlight the multimodal nature of ER deficits in ASC, which exist for basic as well as complex emotions and are relatively stable cross-culturally. Cross-cultural research has the potential to reveal both autism-specific universal deficits and the role that specific cultures play in the way empathy operates in different countries.

  19. Who cares? Offering emotion work as a 'gift' in the nursing labour process.

    Science.gov (United States)

    Bolton, S C

    2000-09-01

    Who cares? Offering emotion work as a 'gift' in the nursing labour process The emotional elements of the nursing labour process are being recognized increasingly. Many commentators stress that nurses' 'emotional labour' is hard and productive work and should be valued in the same way as physical or technical labour. However, the term 'emotional labour' fails to conceptualize the many occasions when nurses not only work hard on their emotions in order to present the detached face of a professional carer, but also to offer authentic caring behaviour to patients in their care. Using qualitative data collected from a group of gynaecology nurses in an English National Health Service (NHS) Trust hospital, this paper argues that nursing work is emotionally complex and may be better understood by utilizing a combination of Hochschild's concepts: emotion work as a 'gift' in addition to 'emotional labour'. The gynaecology nurses in this study describe their work as 'emotionful' and therefore it could be said that this particular group of nurses represent a distinct example. Nevertheless, though it is impossible to generalize from limited data, the research presented in this paper does highlight the emotional complexity of the nursing labour process, expands the current conceptual analysis, and offers a path for future research. The examination further emphasizes the need to understand and value the motivations behind nurses' emotion work and their wish to maintain caring as a central value in professional nursing.

  20. Pure homology of algebraic varieties

    OpenAIRE

    Weber, Andrzej

    2003-01-01

    We show that for a complete complex algebraic variety the pure component of homology coincides with the image of intersection homology. Therefore pure homology is topologically invariant. To obtain slightly more general results we introduce "image homology" for noncomplete varieties.

  1. The effect of partner-directed emotion in social exchange decision-making.

    Science.gov (United States)

    Eimontaite, Iveta; Nicolle, Antoinette; Schindler, Igor; Goel, Vinod

    2013-01-01

    Despite the prevalence of studies examining economic decision-making as a purely rational phenomenon, common sense suggests that emotions affect our decision-making particularly in a social context. To explore the influence of emotions on economic decision-making, we manipulated opponent-directed emotions prior to engaging participants in two social exchange decision-making games (the Trust Game and the Prisoner's Dilemma). Participants played both games with three different (fictional) partners and their tendency to defect was measured. Prior to playing each game, participants exchanged handwritten "essays" with their partners, and subsequently exchanged evaluations of each essay. The essays and evaluations, read by the participant, were designed to induce either anger, sympathy, or a neutral emotional response toward the confederate with whom they would then play the social exchange games. Galvanic skin conductance level (SCL) showed enhanced physiological arousal during anger induction compared to both the neutral and sympathy conditions. In both social exchange games, participants were most likely to defect against their partner after anger induction and least likely to defect after sympathy induction, with the neutral condition eliciting intermediate defection rates. This pattern was found to be strongest in participants exhibiting low cognitive control (as measured by a Go/no-Go task). The findings indicate that emotions felt toward another individual alter how one chooses to interact with them, and that this influence depends both on the specific emotion induced and the cognitive control of the individual.

  2. The effect of partner-directed emotion in social exchange decision-making

    Directory of Open Access Journals (Sweden)

    Iveta eEimontaite

    2013-07-01

    Full Text Available Despite the prevalence of studies examining economic decision-making as a purely rational phenomenon, common sense suggests that emotions affect our decision-making particularly in a social context. To address the influence of emotions on economic decision-making, we manipulated opponent-directed emotions prior to engaging participants in two social exchange decision-making games (the Trust Game and the Prisoner’s Dilemma). Participants played both games with three different (fictional) partners and their tendency to defect was measured. Prior to playing each game, participants exchanged handwritten essays with their partners, and subsequently exchanged evaluations of each essay. The essays and evaluations, read by the participant, were designed to induce either anger, sympathy or a neutral emotional response towards the confederate with whom they would then play the social exchange games. Galvanic skin conductance level showed enhanced physiological arousal during anger induction compared to both neutral and sympathy conditions. In both social exchange games, participants were most likely to defect against their partner after anger induction and least likely to defect after sympathy induction, with the neutral condition eliciting intermediate defection rates. This pattern was found to be strongest in participants exhibiting low cognitive control (as measured by a Go/no-Go task). The findings indicate that emotions felt towards another individual alter how one chooses to interact with them, and that this influence depends both on the specific emotion induced and the cognitive control of the individual.

  3. The effect of partner-directed emotion in social exchange decision-making

    Science.gov (United States)

    Eimontaite, Iveta; Nicolle, Antoinette; Schindler, Igor; Goel, Vinod

    2013-01-01

    Despite the prevalence of studies examining economic decision-making as a purely rational phenomenon, common sense suggests that emotions affect our decision-making particularly in a social context. To explore the influence of emotions on economic decision-making, we manipulated opponent-directed emotions prior to engaging participants in two social exchange decision-making games (the Trust Game and the Prisoner's Dilemma). Participants played both games with three different (fictional) partners and their tendency to defect was measured. Prior to playing each game, participants exchanged handwritten “essays” with their partners, and subsequently exchanged evaluations of each essay. The essays and evaluations, read by the participant, were designed to induce either anger, sympathy, or a neutral emotional response toward the confederate with whom they would then play the social exchange games. Galvanic skin conductance level (SCL) showed enhanced physiological arousal during anger induction compared to both the neutral and sympathy conditions. In both social exchange games, participants were most likely to defect against their partner after anger induction and least likely to defect after sympathy induction, with the neutral condition eliciting intermediate defection rates. This pattern was found to be strongest in participants exhibiting low cognitive control (as measured by a Go/no-Go task). The findings indicate that emotions felt toward another individual alter how one chooses to interact with them, and that this influence depends both on the specific emotion induced and the cognitive control of the individual. PMID:23898313

  4. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits.

    Science.gov (United States)

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving body patch-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  5. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    Directory of Open Access Journals (Sweden)

    Rossana eActis-Grosso

    2015-10-01

    Full Text Available We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or conveyed by moving bodies (patch-light displays, PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, and (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  6. Emotion recognition impairment in traumatic brain injury compared with schizophrenia spectrum: similar deficits with different origins.

    Science.gov (United States)

    Mancuso, Mauro; Magnani, Nadia; Cantagallo, Anna; Rossi, Giulia; Capitani, Donatella; Galletti, Vania; Cardamone, Giuseppe; Robertson, Ian Hamilton

    2015-02-01

    The aim of our study was to identify the common and separate mechanisms that might underpin emotion recognition impairment in patients with traumatic brain injury (TBI) and schizophrenia (Sz) compared with healthy controls (HCs). We recruited 21 Sz outpatients, 24 severe TBI outpatients, and 38 HCs, and we used eye-tracking to compare facial emotion processing performance. Both Sz and TBI patients were significantly poorer at recognizing facial emotions compared with HCs. Sz patients showed a different way of exploring the Pictures of Facial Affect stimuli and were significantly worse in the recognition of neutral expressions. Selective or sustained attention deficits in TBI may reduce efficient emotion recognition, whereas in Sz there is a more strategic deficit underlying the observed problem. There would seem to be scope for the adjustment of effective rehabilitative training focused on emotion recognition.

  7. The integration of visual context information in facial emotion recognition in 5- to 15-year-olds.

    Science.gov (United States)

    Theurel, Anne; Witt, Arnaud; Malsert, Jennifer; Lejeune, Fleur; Fiorentini, Chiara; Barisnikov, Koviljka; Gentaz, Edouard

    2016-10-01

    The current study investigated the role of congruent visual context information in the recognition of facial emotional expressions in 190 participants from 5 to 15 years of age. Children performed a matching task that presented pictures with different facial emotional expressions (anger, disgust, happiness, fear, and sadness) in two conditions: with and without a visual context. The results showed that emotions presented with visual context information were recognized more accurately than those presented in the absence of visual context. The context effect remained steady with age but varied according to the emotion presented and the gender of participants. The findings demonstrated for the first time that children from the age of 5 years are able to integrate facial expression and visual context information, and that this integration improves facial emotion recognition. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Altered neural activity and emotions following right middle cerebral artery stroke.

    Science.gov (United States)

    Paradiso, Sergio; Anderson, Beth M; Boles Ponto, Laura L; Tranel, Daniel; Robinson, Robert G

    2011-01-01

    Stroke of the right MCA is common. Such strokes often have consequences for emotional experience, but these can be subtle. In such cases diagnosis is difficult because emotional awareness (limiting reporting of emotional changes) may be affected. The present study sought to clarify the mechanisms of altered emotion experience after right MCA stroke. It was predicted that after right MCA stroke the anterior cingulate cortex (ACC), a brain region concerned with emotional awareness, would show reduced neural activity. Brain activity during presentation of emotional stimuli was measured in 6 patients with stable stroke, and in 12 age- and sex-matched nonlesion comparisons using positron emission tomography and the [(15)O]H(2)O autoradiographic method. MCA stroke was associated with weaker pleasant experience and decreased activity ipsilaterally in the ACC. Other regions involved in emotional processing including thalamus, dorsal and medial prefrontal cortex showed reduced activity ipsilaterally. Dorsal and medial prefrontal cortex, association visual cortex and cerebellum showed reduced activity contralaterally. Experience from unpleasant stimuli was unaltered and was associated with decreased activity only in the left midbrain. Right MCA stroke may reduce experience of pleasant emotions by altering brain activity in limbic and paralimbic regions distant from the area of direct damage, in addition to changes due to direct tissue damage to insula and basal ganglia. The knowledge acquired in this study begins to explain the mechanisms underlying emotional changes following right MCA stroke. Recognizing these changes may improve diagnoses, management and rehabilitation of right MCA stroke victims. Copyright © 2011 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  9. Cultural in-group advantage: emotion recognition in African American and European American faces and voices.

    Science.gov (United States)

    Wickline, Virginia B; Bailey, Wendy; Nowicki, Stephen

    2009-03-01

    The authors explored whether there were in-group advantages in emotion recognition of faces and voices by culture or geographic region. Participants were 72 African American students (33 men, 39 women), 102 European American students (30 men, 72 women), 30 African international students (16 men, 14 women), and 30 European international students (15 men, 15 women). The participants determined emotions in African American and European American faces and voices. Results showed an in-group advantage, sometimes by culture, less often by race, in recognizing facial and vocal emotional expressions. African international students were generally less accurate at interpreting American nonverbal stimuli than were European American, African American, and European international peers. Results suggest that, although partly universal, emotional expressions have subtle differences across cultures that persons must learn.

  10. The Impact of Multiple Concussions on Emotional Distress, Post-Concussive Symptoms, and Neurocognitive Functioning in Active Duty United States Marines Independent of Combat Exposure or Emotional Distress

    Science.gov (United States)

    Lathan, Corinna E.; Bleiberg, Joseph; Tsao, Jack W.

    2014-01-01

    Abstract Controversy exists as to whether the lingering effects of concussion on emotional, physical, and cognitive symptoms are because of the effects of brain trauma or purely because of emotional factors such as post-traumatic stress disorder or depression. This study examines the independent effects of concussion on persistent symptoms. The Defense Automated Neurobehavioral Assessment, a clinical decision support tool, was used to assess neurobehavioral functioning in 646 United States Marines, all of whom were fit for duty. Marines were assessed for concussion history, post-concussive symptoms, emotional distress, neurocognitive functioning, and deployment history. Results showed that a recent concussion, or ever having experienced a concussion, was associated with an increase in emotional distress, but not with persistent post-concussive symptoms (PPCS) or neurocognitive functioning. Having had multiple lifetime concussions, however, was associated with greater emotional distress, PPCS, and reduced neurocognitive functioning on tasks requiring attention and rapid discrimination, but not on memory-based tasks. These results are independent of deployment history, combat exposure, and symptoms of post-traumatic stress disorder and depression. The results supported earlier findings that a previous concussion is not generally associated with post-concussive symptoms independent of covariates. In contrast with other studies that failed to find a unique contribution of concussion to PPCS, however, evidence of recent and multiple concussion was seen across a range of emotional distress, post-concussive symptoms, and neurocognitive functioning measures in this study population. Results are discussed in terms of implications for assessing concussion on return from combat. PMID:25003552

  11. Reading emotions from faces in two indigenous societies.

    Science.gov (United States)

    Crivelli, Carlos; Jarillo, Sergio; Russell, James A; Fernández-Dols, José-Miguel

    2016-07-01

    That all humans recognize certain specific emotions from their facial expression (the Universality Thesis) is a pillar of research, theory, and application in the psychology of emotion. Its most rigorous test occurs in indigenous societies with limited contact with external cultural influences, but such tests are scarce. Here we report 2 such tests. Study 1 was of children and adolescents (N = 68; aged 6-16 years) of the Trobriand Islands (Papua New Guinea, South Pacific) with a Western control group from Spain (N = 113, of similar ages). Study 2 was of children and adolescents (N = 36; same age range) of Matemo Island (Mozambique, Africa). In both studies, participants were shown an array of prototypical facial expressions and asked to point to the person feeling a specific emotion: happiness, fear, anger, disgust, or sadness. The Spanish control group matched faces to emotions as predicted by the Universality Thesis: matching was seen on 83% to 100% of trials. For the indigenous societies, in both studies, the Universality Thesis was moderately supported for happiness: smiles were matched to happiness on 58% and 56% of trials, respectively. For other emotions, however, results were even more modest: 7% to 46% in the Trobriand Islands and 22% to 53% in Matemo Island. These results were robust across age, gender, static versus dynamic display of the facial expressions, and between- versus within-subjects design. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Analysis of pure maple syrup consumers

    Science.gov (United States)

    Paul E. Sendak

    1974-01-01

    Virtually all of the pure maple syrup production in the United States is in the northern states of Maine, Massachusetts, Michigan, New Hampshire, New York, Ohio, Pennsylvania, Vermont, and Wisconsin. Pure maple syrup users living in the maple production area and users living in other areas of the United States were asked a series of questions about their use of pure...

  13. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery.

    Science.gov (United States)

    Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T

    2012-12-01

    Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize

  14. Strength Is in Numbers: Can Concordant Artificial Listeners Improve Prediction of Emotion from Speech?

    Directory of Open Access Journals (Sweden)

    Eugenio Martinelli

    Full Text Available Humans can communicate their emotions by modulating facial expressions or the tone of their voice. Although numerous applications exist that enable machines to read facial emotions and recognize the content of verbal messages, methods for speech emotion recognition are still in their infancy. Yet fast and reliable applications for emotion recognition are the obvious next step for present 'intelligent personal assistants', and may have countless applications in diagnostics, rehabilitation and research. Taking inspiration from the dynamics of human group decision-making, we devised a novel speech emotion recognition system that applies, for the first time, a semi-supervised prediction model based on consensus. Three tests were carried out to compare this algorithm with traditional approaches. Labeling performances relative to a public database of spontaneous speech are reported. The novel system appears to be fast, robust and less computationally demanding than traditional methods, allowing for easier implementation in portable voice analyzers (as used in rehabilitation, research, industry, etc.) and for applications in the research domain (such as real-time pairing of stimuli to participants' emotional state, selective/differential data collection based on emotional content, etc.).
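
    A minimal sketch of a consensus-based semi-supervised scheme in the spirit described above (not the authors' exact algorithm): several base classifiers are trained on the labeled data, and unlabeled utterances are added to the training set only when all "listeners" agree on the label:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

def consensus_self_training(X_lab, y_lab, X_unlab, n_rounds=3):
    """Grow the labeled set with unlabeled samples on which all models agree."""
    models = [SVC(), RandomForestClassifier(n_estimators=200),
              LogisticRegression(max_iter=1000)]
    for _ in range(n_rounds):
        for m in models:
            m.fit(X_lab, y_lab)
        if len(X_unlab) == 0:
            break
        preds = np.array([m.predict(X_unlab) for m in models])   # shape: (n_models, n_unlab)
        agree = np.all(preds == preds[0], axis=0)                # unanimous consensus mask
        if not agree.any():
            break
        X_lab = np.vstack([X_lab, X_unlab[agree]])               # accept consensus labels
        y_lab = np.concatenate([y_lab, preds[0][agree]])
        X_unlab = X_unlab[~agree]
    return models

# Usage (hypothetical precomputed speech features):
# models = consensus_self_training(X_labeled, y_labeled, X_unlabeled)
```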

  15. Materiality and Emotions in Making

    Directory of Open Access Journals (Sweden)

    Pirita Seitamaa-Hakkarainen

    2013-10-01

    Full Text Available Design practice involves multifaceted activities related to creative processes involving various design tools and materials. In the present exploratory study, we investigated the feasibility of using contextual event sampling as a method of studying design activities, materiality and emotions in the making process. ‘Event sampling’ refers to a research strategy for studying ongoing daily experience and emotions as they occur in the ebb and flow of everyday life. A novel method, the Contextual Activity Sampling System (CASS), was implemented for recording and archiving design behavior. The data were collected using mobile-phone technology employing a CASS query along with a diary method. We report two case studies conducted using the CASS query (a set of questions) that analyzed the socio-emotional experiences (i.e., challenges versus competence) involved in the process of designing and handling materials during the respective design projects. In the first case study, we were interested in the main aspects of professional designers’ work: the resources they used, the social dimension, as well as the emotional side of their work experiences. The purpose of the second case study was to examine, longitudinally, one textile artist’s ‘street art project’. The second case study is an example of autoethnographic research that uses the CASS query to document one’s own creative practice. We conclude that the CASS technology has potential for design research as it captures time-based and multimodal data from designers. The present study also recognizes some limitations of data collection. Methodological implications regarding the contextual study of design practices and ideas for tool development are discussed. Keywords: Design activity flow experience, event sampling, Contextual Activity Sampling System (CASS), feasibility

  16. Expression of emotions and physiological changes during teaching

    Science.gov (United States)

    Tobin, Kenneth; King, Donna; Henderson, Senka; Bellocchi, Alberto; Ritchie, Stephen M.

    2016-09-01

    We investigated the expression of emotions while teaching in relation to a teacher's physiological changes. We used polyvagal theory (PVT) to frame the study of teaching in a teacher education program. Donna, a teacher-researcher, experienced high levels of stress and anxiety prior to beginning to teach and throughout the lesson we used her expressed emotions as a focus for this research. We adopted event-oriented inquiry in a study of heart rate, oxygenation of the blood, and expressed emotions. Five events were identified for multilevel analysis in which we used narrative, prosodic analysis, and hermeneutic-phenomenological methods to learn more about the expression of emotions when Donna had: high heart rate (before and while teaching); low blood oxygenation (before and while teaching); and high blood oxygenation (while teaching). What we learned was consistent with the body's monitoring system recognizing social harm and switching to the control of the unmyelinated vagus nerve, thereby shutting down organs and muscles associated with social communication—leading to irregularities in prosody and expression of emotion. In events involving high heart rate and low blood oxygenation the physiological environment was associated with less effective and sometimes confusing patterns in prosody, including intonation, pace of speaking, and pausing. In a low blood oxygenation environment there was evidence of rapid speech and shallow, irregular breathing. In contrast, during an event in which 100 % blood oxygenation occurred, prosody was perceived to be conducive to engagement and teacher expressed positive emotions, such as satisfaction, while teaching. Becoming aware of the purposes of the research and the results we obtained provided the teacher with tools to enact changes to her teaching practice, especially prosody of the voice. We regard it as a high priority to create tools to allow teachers and students, if and as necessary, to ameliorate excess emotions, and

  17. Emotional Self-Efficacy, Emotional Empathy and Emotional Approach Coping as Sources of Happiness

    Directory of Open Access Journals (Sweden)

    Tarık Totan

    2013-05-01

    Full Text Available Among the many variables affecting happiness, there are those that arise from emotional factors. In this study, the hypothesis stating that happiness is affected by emotional self-efficacy, emotional empathy and emotional approach coping has been examined using the path model. A total of 334 university students participated in this study, 229 of whom were females and 105 being males. Oxford Happiness Questionnaire-Short Form, Emotional Self-efficacy Scale, Multi-Dimensional Emotional Empathy Scale, The Emotional Approach Coping Scale and personal information form have been used as data acquisition tools. As a result of path analysis, it was determined that the predicted path from emotional empathy to emotional approach coping was insignificant and thus it was taken out of the model. According to the modified path model, it was determined that there is a positive relationship between emotional self- efficacy and emotional empathy, that emotional self-efficacy positively affects emotional approach coping and happiness, that emotional empathy also positively affects happiness and that emotional approach coping also positively affects happiness.

  18. "What concerns me is..." Expression of emotion by advanced cancer patients during outpatient visits.

    Science.gov (United States)

    Anderson, Wendy G; Alexander, Stewart C; Rodriguez, Keri L; Jeffreys, Amy S; Olsen, Maren K; Pollak, Kathryn I; Tulsky, James A; Arnold, Robert M

    2008-07-01

    Cancer patients have high levels of distress, yet oncologists often do not recognize patients' concerns. We sought to describe how patients with advanced cancer verbally express negative emotion to their oncologists. As part of the Studying Communication in Oncologist-Patient Encounters Trial, we audio-recorded 415 visits that 281 patients with advanced cancer made to their oncologists at three US cancer centers. Using qualitative methodology, we coded for verbal expressions of negative emotion, identified words patients used to express emotion, and categorized emotions by type and content. Patients verbally expressed negative emotion in 17% of the visits. The most commonly used words were: "concern," "scared," "worried," "depressed," and "nervous." Types of emotion expressed were: anxiety (46%), fear (25%), depression (12%), anger (9%), and other (8%). Topics about which emotion was expressed were: symptoms and functional concerns (66%), medical diagnoses and treatments (54%), social issues (14%), and the health care system (9%). Although all patients had terminal cancer, they expressed negative emotion overtly related to death and dying only 2% of the time. Patients infrequently expressed negative emotion to their oncologists. When they did, they typically expressed anxiety and fear, indicating concern about the future. When patients use emotionally expressive words such as those we described, oncologists should respond empathically, allowing patients to express their distress and concerns more fully.

  19. Robust representation and recognition of facial emotions using extreme sparse learning.

    Science.gov (United States)

    Shojaeilangari, Seyedehsamaneh; Yau, Wei-Yun; Nandakumar, Karthik; Li, Jun; Teoh, Eam Khwang

    2015-07-01

    Recognition of natural emotions from human faces is an interesting topic with a wide range of potential applications, such as human-computer interaction, automated tutoring systems, image and video retrieval, smart environments, and driver warning systems. Traditionally, facial emotion recognition systems have been evaluated on laboratory-controlled data, which is not representative of the environment faced in real-world applications. To robustly recognize facial emotions in real-world natural situations, this paper proposes an approach called extreme sparse learning, which has the ability to jointly learn a dictionary (a set of basis vectors) and a nonlinear classification model. The proposed approach combines the discriminative power of the extreme learning machine with the reconstruction property of sparse representation to enable accurate classification when presented with noisy signals and imperfect data recorded in natural settings. In addition, this paper presents a new local spatio-temporal descriptor that is distinctive and pose-invariant. The proposed framework achieves state-of-the-art recognition accuracy on both acted and spontaneous facial emotion databases.
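
    A rough sketch of the two ingredients named above, a sparse-coding dictionary and an extreme learning machine (ELM) readout, is given below. Unlike the paper's extreme sparse learning it learns the dictionary and the classifier separately rather than jointly, and all array names and parameter values are hypothetical.

        import numpy as np
        from sklearn.decomposition import DictionaryLearning

        def fit_sparse_elm(X, y, n_atoms=128, n_hidden=500, ridge=1e-2, seed=0):
            # Step 1: learn a dictionary and encode the descriptors as sparse codes
            dico = DictionaryLearning(n_components=n_atoms, transform_algorithm="lasso_lars",
                                      transform_alpha=0.1, max_iter=200, random_state=seed)
            codes = dico.fit_transform(X)
            # Step 2: ELM-style readout -- a fixed random hidden layer followed by
            # ridge-regularized least-squares output weights on one-hot targets
            rng = np.random.default_rng(seed)
            W = rng.normal(size=(n_atoms, n_hidden))
            H = np.tanh(codes @ W)
            T = np.eye(int(y.max()) + 1)[y]      # y: integer emotion labels starting at 0
            beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ T)
            return dico, W, beta

        def predict_sparse_elm(model, X):
            # Encode new descriptors with the learned dictionary, then apply the ELM readout
            dico, W, beta = model
            H = np.tanh(dico.transform(X) @ W)
            return np.argmax(H @ beta, axis=1)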

  20. Emotional congruence between clients and therapists and its effect on treatment outcome.

    Science.gov (United States)

    Atzil-Slonim, Dana; Bar-Kalifa, Eran; Fisher, Hadar; Peri, Tuvia; Lutz, Wolfgang; Rubel, Julian; Rafaeli, Eshkol

    2018-01-01

    The present study aimed to (a) explore 2 indices of emotional congruence-temporal similarity and directional discrepancy-between clients' and therapists' ratings of their emotions as they cofluctuate session-by-session; and (b) examine whether client/therapist emotional congruence predicts clients' symptom relief and improved functioning. The sample comprised 109 clients treated by 62 therapists in a university setting. Clients and therapists self-reported their negative (NE) and positive emotions (PE) after each session. Symptom severity and functioning level were assessed at the beginning of each session using the clients' self-reports. To assess emotional congruence, an adaptation of West and Kenny's (2011) Truth and Bias model was applied. To examine the consequences of emotional congruence, polynomial regression and response surface analyses were conducted (Edwards & Parry, 1993). Clients and therapists were temporally similar in both PE and NE. Therapists experienced less intense PE on average, but did not experience more or less intense NE than their clients. Those therapists who experienced more intense NE than their clients were more temporally similar in their emotions to their clients. Therapist/client incongruence in both PE and NE predicted poorer next-session symptomatology; incongruence in PE was also associated with lower client next-session functioning. Session-level symptoms were better when therapists experienced more intense emotions (both PE and NE) than their clients. The findings highlight the importance of recognizing the dynamic nature of emotions in client-therapist interactions and the contribution of session-by-session emotional dynamics to outcomes. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
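
    For readers unfamiliar with the Edwards and Parry (1993) procedure cited above, the sketch below shows one conventional way to fit the second-order polynomial model and read off the response-surface slopes along the lines of congruence and incongruence. Variable names are hypothetical and the predictors are assumed to be centered, so this illustrates the general method rather than the authors' exact analysis.

        import numpy as np
        import statsmodels.api as sm

        def response_surface(client, therapist, outcome):
            # Second-order polynomial model: outcome ~ X + Y + X^2 + X*Y + Y^2,
            # with X = client rating and Y = therapist rating (both centered beforehand)
            X = np.column_stack([client, therapist,
                                 client**2, client * therapist, therapist**2])
            fit = sm.OLS(outcome, sm.add_constant(X)).fit()
            b0, b1, b2, b3, b4, b5 = fit.params
            surface = {
                "congruence_slope":       b1 + b2,       # along the line X = Y
                "congruence_curvature":   b3 + b4 + b5,
                "incongruence_slope":     b1 - b2,       # along the line X = -Y
                "incongruence_curvature": b3 - b4 + b5,
            }
            return fit, surface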

  1. The Duty to Recognize Culture

    DEFF Research Database (Denmark)

    Nielsen, Morten Ebbe Juul

    2012-01-01

    On Taylor and Honneth's theories of recognition and whether one can derive a "duty to recognize Culture" from these.

  2. Emotional recognition from dynamic facial, vocal and musical expressions following traumatic brain injury.

    Science.gov (United States)

    Drapeau, Joanie; Gosselin, Nathalie; Peretz, Isabelle; McKerral, Michelle

    2017-01-01

    To assess emotion recognition from dynamic facial, vocal and musical expressions in sub-groups of adults with traumatic brain injuries (TBI) of different severities and identify possible common underlying mechanisms across domains. Forty-one adults participated in this study: 10 with moderate-severe TBI, nine with complicated mild TBI, 11 with uncomplicated mild TBI and 11 healthy controls, who were administered experimental (emotional recognition, valence-arousal) and control tasks (emotional and structural discrimination) for each domain. Recognition of fearful faces was significantly impaired in moderate-severe and in complicated mild TBI sub-groups, as compared to those with uncomplicated mild TBI and controls. Effect sizes were medium-large. Participants with lower GCS scores performed more poorly when recognizing fearful dynamic facial expressions. Emotion recognition from auditory domains was preserved following TBI, irrespective of severity. All groups performed equally on control tasks, indicating no perceptual disorders. Although emotional recognition from vocal and musical expressions was preserved, no correlation was found across auditory domains. This preliminary study may contribute to improving comprehension of emotional recognition following TBI. Future studies of larger samples could usefully include measures of functional impacts of recognition deficits for fearful facial expressions. These could help refine interventions for emotional recognition following a brain injury.

  3. Beyond Emotion Regulation: Emotion Utilization and Adaptive Functioning

    OpenAIRE

    Izard, Carroll; Stark, Kevin; Trentacosta, Christopher; Schultz, David

    2008-01-01

    Recent research indicates that emotionality, emotion information processing, emotion knowledge, and discrete emotion experiences may influence and interact with emotion utilization, that is, the effective use of the inherently adaptive and motivational functions of emotions. Strategies individuals learn for emotion modulation and emotion utilization become stabilized in emerging affective-cognitive structures, or emotion schemas. In these emotion schemas, the feeling/motivational component of...

  4. Fundamentals of the Pure Spinor Formalism

    CERN Document Server

    Hoogeveen, Joost

    2010-01-01

    This thesis presents recent developments within the pure spinor formalism, which has simplified amplitude computations in perturbative string theory, especially when spacetime fermions are involved. Firstly the worldsheet action of both the minimal and the non-minimal pure spinor formalism is derived from first principles, i.e. from an action with two dimensional diffeomorphism and Weyl invariance. Secondly the decoupling of unphysical states in the minimal pure spinor formalism is proved

  5. The Change in Facial Emotion Recognition Ability in Inpatients with Treatment Resistant Schizophrenia After Electroconvulsive Therapy.

    Science.gov (United States)

    Dalkıran, Mihriban; Tasdemir, Akif; Salihoglu, Tamer; Emul, Murat; Duran, Alaattin; Ugur, Mufit; Yavuz, Ruhi

    2017-09-01

    People with schizophrenia have impairments in emotion recognition along with other social cognitive deficits. In the current study, we aimed to investigate the immediate benefits of ECT on facial emotion recognition ability. Thirty-two treatment-resistant patients with schizophrenia who had been indicated for ECT enrolled in the study. Facial emotion stimuli were a set of 56 photographs that depicted seven basic emotions: sadness, anger, happiness, disgust, surprise, fear, and neutral faces. The average age of the participants was 33.4 ± 10.5 years. The rate of recognizing the disgusted facial expression increased significantly after ECT, while it did not change significantly for the other facial expressions (p > 0.05). After ECT, the time taken to respond to the fearful and happy facial expressions was significantly shorter. Facial emotion recognition ability is an important social cognitive skill for social harmony, proper relationships and independent living. At the least, the ECT sessions do not seem to affect facial emotion recognition ability negatively, and they seem to improve identification of the disgusted facial expression, which is related to dopamine-enriched regions of the brain.

  6. Virtual facial expressions of emotions: An initial concomitant and construct validity study.

    Directory of Open Access Journals (Sweden)

    Christian eJoyal

    2014-09-01

    Full Text Available Abstract. Background. Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities for both fundamental and clinical research. For instance, virtual faces allow real-time human-computer feedback loops between physiological measures and the virtual agent. Objectives. The goal of this study was to initially assess the concomitant and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, or disgust). Recognition rates, facial electromyography (zygomatic major and corrugator supercilii muscles), and regional gaze fixation latencies (eyes and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Results. Emotions expressed by each set of stimuli were similarly recognized, both by men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times in eye regions from male and female participants. Conclusion. Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-computer interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

  7. Autism, emotion recognition and the mirror neuron system: the case of music.

    Science.gov (United States)

    Molnar-Szakacs, Istvan; Wang, Martha J; Laugeson, Elizabeth A; Overy, Katie; Wu, Wai-Ling; Piggot, Judith

    2009-11-16

    Understanding emotions is fundamental to our ability to navigate and thrive in a complex world of human social interaction. Individuals with Autism Spectrum Disorders (ASD) are known to experience difficulties with the communication and understanding of emotion, such as the nonverbal expression of emotion and the interpretation of emotions of others from facial expressions and body language. These deficits often lead to loneliness and isolation from peers, and social withdrawal from the environment in general. In the case of music however, there is evidence to suggest that individuals with ASD do not have difficulties recognizing simple emotions. In addition, individuals with ASD have been found to show normal and even superior abilities with specific aspects of music processing, and often show strong preferences towards music. It is possible these varying abilities with different types of expressive communication may be related to a neural system referred to as the mirror neuron system (MNS), which has been proposed as deficient in individuals with autism. Music's power to stimulate emotions and intensify our social experiences might activate the MNS in individuals with ASD, and thus provide a neural foundation for music as an effective therapeutic tool. In this review, we present literature on the ontogeny of emotion processing in typical development and in individuals with ASD, with a focus on the case of music.

  8. Comparative in vitro biocompatibility of nickel-titanium, pure nickel, pure titanium, and stainless steel: genotoxicity and atomic absorption evaluation.

    Science.gov (United States)

    Assad, M; Lemieux, N; Rivard, C H; Yahia, L H

    1999-01-01

    The genotoxicity level of nickel-titanium (NiTi) was compared to that of its pure constituents, pure nickel (Ni) and pure titanium (Ti) powders, and also to 316L stainless steel (316L SS) as the clinical reference material. To do so, a dynamic in vitro semiphysiological extraction was performed on all metals using agitation and ISO requirements. Peripheral blood lymphocytes were then cultured in the presence of all material extracts, and their comparative genotoxicity levels were assessed using electron microscopy-in situ end-labeling (EM-ISEL) coupled to immunogold staining. Exposure of cellular chromatin to pure Ni and 316L SS demonstrated significantly stronger gold binding than exposure to NiTi, pure Ti, or the untreated control. In parallel, graphite furnace atomic absorption spectrophotometry (AAS) was also performed on all extraction media. The release of Ni atoms followed this decreasing order in the resulting semiphysiological solutions: pure Ni, 316L SS, NiTi, Ti, and controls. Ti elements were detected after elution of pure titanium only. Both pure titanium and nickel-titanium specimens showed relative in vitro biocompatibility. Therefore, this quantitative in vitro study provides optimistic results for the eventual use of nickel-titanium alloys as surgical implant materials.

  9. Emotional decisions in structured populations for the evolution of public cooperation

    Science.gov (United States)

    Wang, Yongjie; Chen, Tong; Chen, Qiao; Si, Guangrun

    2017-02-01

    The behaviors of humans are not always profit-driven in public goods games (PGG). In addition, social preference and decision-making might be influenced, or even changed, by heuristics and conformity in real life. Motivated by these facts, we investigate the role of the emotional system in the cooperative behaviors of structured populations in PGG. Meanwhile, the effects of the diffusion of influence are studied in structured populations. Numerical simulation results indicate that emotions indeed play a very significant role in the emergence and maintenance of cooperation in structured populations in PGG. However, the influence of emotions on others is limited, owing to the diminishing diffusion of influence and the existence of pure defectors. Moreover, conformity could, to some extent, drive more people to adopt the cooperative strategy with higher probability. Higher levels of cooperation could be promoted by increasing the synergy factor, although the effect might diminish gradually as the number of positive heuristic players and conformists increases. Our work may be beneficial for addressing the social dilemmas in PGG.

  10. Face-body integration of intense emotional expressions of victory and defeat.

    Directory of Open Access Journals (Sweden)

    Lili Wang

    Full Text Available Human facial expressions can be recognized rapidly and effortlessly. However, for intense emotions from real life, positive and negative facial expressions are difficult to discriminate, and the judgment of facial expressions is biased towards simultaneously perceived body expressions. This study employed event-related potentials (ERPs) to investigate the neural dynamics involved in the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds, with either matched or mismatched emotional expressions in faces and bodies. Behavioral results showed that congruent emotional information of face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent compared to congruent stimuli. Also, a main effect of body valence on the P1 was observed, with enhanced amplitudes for stimuli with losing compared to winning bodies. The main effect of body expression was also observed for the N170 and N2, with winning bodies producing larger N170/N2 amplitudes. In the later stage, a significant interaction of congruence by body valence was found on the P3 component. Winning bodies elicited larger P3 amplitudes than losing bodies did when face and body conveyed congruent emotional signals. Beyond the knowledge based on prototypical facial and body expressions, the results of this study help us understand the complexity of emotion evaluation and categorization outside the laboratory.

  11. The cosmobiological balance of the emotional and spiritual worlds: phenomenological structuralism in traditional Chinese medical thought.

    Science.gov (United States)

    Davis, S

    1996-03-01

    This paper points to a convergence of formal and rhetorical features in ancient Chinese cosmobiological theory, within which is developed a view of the inner life of human emotions. Inasmuch as there is an extensive classical tradition considering the emotions in conjunction with music, one can justify a structural analysis of medical texts treating disorder in emotional life, since emotions, musical interpretation and structural analysis all deal with systems interrelated in a transformational space largely independent of objective reference and propositional coordination. Following a section of ethnolinguistic sketches to provide grounds in some phenomenological worlds recognized by Chinese people, there is a textual analysis of a classical medical source for the treatment of emotional distress. Through close examination of the compositional schema of this text, it can be demonstrated that the standard categories of correlative cosmology are arrayed within a more comprehensive structural order.

  12. Emotion recognition in mild cognitive impairment: relationship to psychosocial disability and caregiver burden.

    Science.gov (United States)

    McCade, Donna; Savage, Greg; Guastella, Adam; Hickie, Ian B; Lewis, Simon J G; Naismith, Sharon L

    2013-09-01

    Impaired emotion recognition in dementia is associated with increased patient agitation, behavior management difficulties, and caregiver burden. Emerging evidence supports the presence of very early emotion recognition difficulties in mild cognitive impairment (MCI); however, the relationship between these impairments and psychosocial measures has not yet been explored. Emotion recognition abilities of 27 patients with nonamnestic MCI (naMCI), 29 patients with amnestic MCI (aMCI), and 22 control participants were assessed. Self-report measures assessed patient functional disability, while informants rated the degree of burden they experienced. Difficulty in recognizing anger was evident in the amnestic subtype. Although both patient groups reported greater social functioning disability compared with the controls, a relationship between social dysfunction and anger recognition was evident only for patients with naMCI. A significant association was found between burden and anger recognition in patients with aMCI. Impaired emotion recognition abilities impact MCI subtypes differentially. Interventions targeted at patients with MCI and their caregivers are warranted.

  13. Effects of cultural characteristics on building an emotion classifier through facial expression analysis

    Science.gov (United States)

    da Silva, Flávio Altinier Maximiano; Pedrini, Helio

    2015-03-01

    Facial expressions are an important display of human moods and emotions. Algorithms capable of recognizing facial expressions and associating them with emotions were developed and employed to compare the expressions that different cultural groups use to show their emotions. Static pictures of predominantly occidental and oriental subjects from public datasets were used to train machine learning algorithms, with local binary patterns, histograms of oriented gradients (HOG), and Gabor filters employed to describe the facial expressions for six different basic emotions. The most consistent combination, HOG features with support vector machines, was then used to classify the other cultural group: there was a strong drop in accuracy, meaning that the subtle differences in each culture's facial expressions affected classifier performance. Finally, a classifier was trained with images from both occidental and oriental subjects; its accuracy was higher on multicultural data, evidencing the need for a multicultural training set to build an efficient classifier.
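
    A minimal sketch of the best-performing combination reported above (HOG descriptors fed to a linear support vector machine), trained on one cultural group and scored on the other, might look as follows. The face-crop arrays and hyperparameters are hypothetical stand-ins and are not tuned to reproduce the study.

        import numpy as np
        from skimage.feature import hog
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def hog_features(gray_faces):
            # gray_faces: iterable of equally sized 2-D grayscale face crops (e.g. 64x64)
            return np.array([hog(img, orientations=9, pixels_per_cell=(8, 8),
                                 cells_per_block=(2, 2))
                             for img in gray_faces])

        def train_expression_classifier(train_faces, train_labels):
            # Linear SVM on standardized HOG descriptors
            clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
            clf.fit(hog_features(train_faces), train_labels)
            return clf

        # Hypothetical cross-cultural evaluation: train on one group, test on the other
        #   clf = train_expression_classifier(occidental_faces, occidental_labels)
        #   print("accuracy:", clf.score(hog_features(oriental_faces), oriental_labels))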

  14. Effects of adult attachment and emotional distractors on brain mechanisms of cognitive control.

    Science.gov (United States)

    Warren, Stacie L; Bost, Kelly K; Roisman, Glenn I; Silton, Rebecca Levin; Spielberg, Jeffrey M; Engels, Anna S; Choi, Eunsil; Sutton, Bradley P; Miller, Gregory A; Heller, Wendy

    2010-12-01

    Using data from 34 participants who completed an emotion-word Stroop task during functional magnetic resonance imaging, we examined the effects of adult attachment on neural activity associated with top-down cognitive control in the presence of emotional distractors. Individuals with lower levels of secure-base-script knowledge--reflected in an adult's inability to generate narratives in which attachment-related threats are recognized, competent help is provided, and the problem is resolved--demonstrated more activity in prefrontal cortical regions associated with emotion regulation (e.g., right orbitofrontal cortex) and with top-down cognitive control (left dorsolateral prefrontal cortex, anterior cingulate cortex, and superior frontal gyrus). Less efficient performance and related increases in brain activity suggest that insecure attachment involves a vulnerability to distraction by attachment-relevant emotional information and that greater cognitive control is required to attend to task-relevant, nonemotional information. These results contribute to the understanding of mechanisms through which attachment-related experiences may influence developmental adaptation.

  15. Concurrence classes for general pure multipartite states

    International Nuclear Information System (INIS)

    Heydari, Hoshang

    2005-01-01

    We propose concurrence classes for general pure multipartite states based on an orthogonal complement of a positive operator-valued measure on quantum phase. In particular, we construct W_m class, GHZ_m class, and GHZ_{m-1} class concurrences for general pure m-partite states. We give explicit expressions for the W_3 and GHZ_3 class concurrences for general pure three-partite states, and for the W_4, GHZ_4 and GHZ_3 class concurrences for general pure four-partite states.
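
    As background for the class-specific measures above (the W_m and GHZ_m constructions are the paper's own), the standard concurrence of a general pure bipartite state can be written in terms of the purity of a reduced density matrix:

        % Background: the I-concurrence of a pure bipartite state |\psi> in H_A \otimes H_B
        C(\psi) \;=\; \sqrt{2\left(1 - \operatorname{Tr}\rho_A^{2}\right)},
        \qquad
        \rho_A \;=\; \operatorname{Tr}_B\,|\psi\rangle\langle\psi| .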

  16. More Pronounced Deficits in Facial Emotion Recognition for Schizophrenia than Bipolar Disorder

    Science.gov (United States)

    Goghari, Vina M; Sponheim, Scott R

    2012-01-01

    Schizophrenia and bipolar disorder are typically separated in diagnostic systems. Behavioural, cognitive, and brain abnormalities associated with each disorder nonetheless overlap. We evaluated the diagnostic specificity of facial emotion recognition deficits in schizophrenia and bipolar disorder to determine whether select aspects of emotion recognition differed for the two disorders. The investigation used an experimental task that included the same facial images in an emotion recognition condition and an age recognition condition (to control for processes associated with general face recognition) in 27 schizophrenia patients, 16 bipolar I patients, and 30 controls. Schizophrenia and bipolar patients exhibited both shared and distinct aspects of facial emotion recognition deficits. Schizophrenia patients had deficits in recognizing angry facial expressions compared to healthy controls and bipolar patients. Compared to control participants, both schizophrenia and bipolar patients were more likely to mislabel facial expressions of anger as fear. Given that schizophrenia patients exhibited a deficit in emotion recognition for angry faces, which did not appear due to generalized perceptual and cognitive dysfunction, improving recognition of threat-related expression may be an important intervention target to improve social functioning in schizophrenia. PMID:23218816

  17. [Evolution, emotion, language and conscience in the postrationalist psychotherapy].

    Science.gov (United States)

    De Pascale, Adele

    2011-01-01

    A complex-systems, process-oriented approach to psychology and psychopathology, in other words a constructivist post-rationalist cognitive one, stresses the close interdependency among processes such as evolution, emotion, language and conscience. During evolution, emotions, whose biological roots we share with the higher primates, became specialized and refined. Along this process, an increasingly abstract way of scaffolding the enormous quantity of data a brain has to manage became necessary. Cognitive abilities, rooted in the emotional quality of experience, allowed, during phylogenetic development, more and more complex patterns of reflexivity, up to the necessary ability of recognizing others' intentions and consequently of lying. Language, an abstract ability useful for scaffolding increasing amounts of experiential data and probably derived from the development of motor skills, brings with it the possibility of self-consciousness for a human knowing system: to achieve this, the system has to detach from itself, that is, to experience a deep sense of loneliness. Thus the progressive development of cognitive skills is linked to the possibility of lying and of self-deception, as well as to the acquisition of advanced levels of self-consciousness.

  18. An ERP investigation of conditional reasoning with emotional and neutral contents.

    Science.gov (United States)

    Blanchette, Isabelle; El-Deredy, Wael

    2014-11-01

    In two experiments we investigate conditional reasoning using event-related potentials (ERPs). Our goal was to examine the time course of inference making in two conditional forms, one logically valid (Modus Ponens, MP) and one logically invalid (Affirming the Consequent, AC). We focus particularly on the involvement of semantically-based inferential processes potentially marked by modulations of the N400. We also compared reasoning about emotional and neutral contents with separate sets of stimuli of differing linguistic complexity across the two experiments. Both MP and AC modulated the N400 component, suggesting the involvement of a semantically-based inferential mechanism common across different logical forms, content types, and linguistic features of the problems. Emotion did not have an effect on early components, and did not interact with components related to inference making. There was a main effect of emotion in the 800-1050 ms time window, consistent with an effect on sustained attention. The results suggest that conditional reasoning is not a purely formal process but that it importantly implicates semantic processing, and that the effect of emotion on reasoning does not primarily operate through a modulation of early automatic stages of information processing. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Humans rely on the same rules to assess emotional valence and intensity in conspecific and dog vocalizations.

    Science.gov (United States)

    Faragó, Tamás; Andics, Attila; Devecseri, Viktor; Kis, Anna; Gácsi, Márta; Miklósi, Adám

    2014-01-01

    Humans excel at assessing conspecific emotional valence and intensity, based solely on non-verbal vocal bursts that are also common in other mammals. It is not known, however, whether human listeners rely on similar acoustic cues to assess emotional content in conspecific and heterospecific vocalizations, and which acoustical parameters affect their performance. Here, for the first time, we directly compared the emotional valence and intensity perception of dog and human non-verbal vocalizations. We revealed similar relationships between acoustic features and emotional valence and intensity ratings of human and dog vocalizations: those with shorter call lengths were rated as more positive, whereas those with a higher pitch were rated as more intense. Our findings demonstrate that humans rate conspecific emotional vocalizations along basic acoustic rules, and that they apply similar rules when processing dog vocal expressions. This suggests that humans may utilize similar mental mechanisms for recognizing human and heterospecific vocal emotions.

  20. Dissociation between facial and bodily expressions in emotion recognition: A case study.

    Science.gov (United States)

    Leiva, Samanta; Margulis, Laura; Micciulli, Andrea; Ferreres, Aldo

    2017-12-21

    Existing single-case studies have reported deficits in recognizing basic emotions through facial expressions with unaffected performance on body expressions, but not the opposite pattern. The aim of this paper is to present a case study with impaired emotion recognition through body expressions and intact performance with facial expressions. In this single-case study we assessed a 30-year-old patient with autism spectrum disorder, without intellectual disability, and a healthy control group (n = 30) with four tasks of basic and complex emotion recognition through face and body movements, and two non-emotional control tasks. To analyze the dissociation between facial and body expressions, we used Crawford and Garthwaite's operational criteria, and we compared the patient's performance with the control group's using a modified one-tailed t-test designed specifically for single-case studies. There were no statistically significant differences between the patient's and the control group's performances on the non-emotional body movement task or the facial perception task. For both kinds of emotions (basic and complex), when the patient's performance was compared to the control group's, statistically significant differences were observed only for the recognition of body expressions. There were no significant differences between the patient's and the control group's correct answers for emotional facial stimuli. Our results showed a profile of impaired emotion recognition through body expressions and intact performance with facial expressions. This is the first case study that describes this kind of dissociation pattern between facial and body expressions of basic and complex emotions.

  1. Emotion and Prejudice: Specific Emotions Toward Outgroups

    OpenAIRE

    2007-01-01

    Abstract This research draws on ideas about emotion-related appraisal tendencies to generate and test novel propositions about intergroup emotions. First, emotion elicited by outgroup category activation can be transferred to an unrelated stimulus (incidental emotion effects). Second, people predisposed toward an emotion are more prejudiced toward groups that are likely to be associated with that emotion. ...

  2. Polarized ensembles of random pure states

    Science.gov (United States)

    Deelan Cunden, Fabio; Facchi, Paolo; Florio, Giuseppe

    2013-08-01

    A new family of polarized ensembles of random pure states is presented. These ensembles are obtained by linear superposition of two random pure states with suitable distributions, and are quite manageable. We will use the obtained results for two purposes: on the one hand we will be able to derive an efficient strategy for sampling states from isopurity manifolds. On the other, we will characterize the deviation of a pure quantum state from separability under the influence of noise.

  3. Work schedule flexibility is associated with emotional exhaustion among registered nurses in Swiss hospitals: A cross-sectional study.

    Science.gov (United States)

    Dhaini, Suzanne R; Denhaerynck, Kris; Bachnick, Stefanie; Schwendimann, René; Schubert, Maria; De Geest, Sabina; Simon, Michael

    2018-06-01

    Emotional exhaustion among healthcare workers is a widely investigated, well-recognized problem, the incidence of which has recently been linked to work environment factors, particularly work/family conflict. However, another environmental feature that may be equally influential, but that is more amenable to nurse manager action, remains less recognized: shift schedule flexibility. This study's main purposes were to assess variations in work schedule flexibility between Swiss acute care hospital units, and to investigate associations between psychosocial work environment (e.g. work schedule flexibility) and self-reported emotional exhaustion among registered nurses. This is a secondary analysis of data collected for the multi-center observational cross-sectional Match RN study, which included a national sample of 23 hospitals and 1833 registered nurses across Switzerland. Overall, self-reported work schedule flexibility among registered nurses was limited: 32% of participants reported little or no influence in planning their own shifts. Work schedule flexibility (β -0.11; CI -0.16; -0.06) and perceived nurse manager ability (β -0.30; CI -0.49; -0.10) were negatively related to self-reported emotional exhaustion. Work-family conflict (β 0.39; CI 0.33; 0.45) was positively correlated to emotional exhaustion. The study results indicate that managerial efforts to improve working environments, including special efforts to improve work schedule flexibility, might play an important role in promoting nurses' emotional health. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. The communication of "pure" group-based anger reduces tendencies toward intergroup conflict because it increases out-group empathy.

    Science.gov (United States)

    de Vos, Bart; van Zomeren, Martijn; Gordijn, Ernestine H; Postmes, Tom

    2013-08-01

    The communication of group-based anger in intergroup conflict is often associated with destructive conflict behavior. However, we show that communicating group-based anger toward the out-group can evoke empathy and thus reduce intergroup conflict. This is because it stresses the value of maintaining a positive long-term intergroup relationship, thereby increasing understanding for the situation (in contrast to the communication of the closely related emotion of contempt). Three experiments demonstrate that the communication of group-based anger indeed reduces destructive conflict intentions compared with (a) a control condition (Experiments 1-2), (b) the communication of group-based contempt (Experiment 2), and (c) the communication of a combination of group-based anger and contempt (Experiments 2-3). Moreover, results from all three experiments reveal that empathy mediated the positive effect of communicating "pure" group-based anger. We discuss the implications of these findings for the theory and practice of communicating emotions in intergroup conflicts.

  5. Recognition of Emotional and Nonemotional Facial Expressions: A Comparison between Williams Syndrome and Autism

    Science.gov (United States)

    Lacroix, Agnes; Guidetti, Michele; Roge, Bernadette; Reilly, Judy

    2009-01-01

    The aim of our study was to compare two neurodevelopmental disorders (Williams syndrome and autism) in terms of the ability to recognize emotional and nonemotional facial expressions. The comparison of these two disorders is particularly relevant to the investigation of face processing and should contribute to a better understanding of social…

  6. Polarized ensembles of random pure states

    International Nuclear Information System (INIS)

    Cunden, Fabio Deelan; Facchi, Paolo; Florio, Giuseppe

    2013-01-01

    A new family of polarized ensembles of random pure states is presented. These ensembles are obtained by linear superposition of two random pure states with suitable distributions, and are quite manageable. We will use the obtained results for two purposes: on the one hand we will be able to derive an efficient strategy for sampling states from isopurity manifolds. On the other, we will characterize the deviation of a pure quantum state from separability under the influence of noise. (paper)

  7. Do people essentialize emotions? Individual differences in emotion essentialism and emotional experience.

    Science.gov (United States)

    Lindquist, Kristen A; Gendron, Maria; Oosterwijk, Suzanne; Barrett, Lisa Feldman

    2013-08-01

    Many scientific models of emotion assume that emotion categories are natural kinds that carve nature at its joints. These beliefs remain strong, despite the fact that the empirical record on the issue has remained equivocal for over a century. In this research, the authors examined one reason for this situation: People essentialize emotion categories by assuming that members of the same category (e.g., fear) have a shared metaphysical essence (i.e., a common causal mechanism). In Study 1, the authors found that lay people essentialize emotions by assuming that instances of the same emotion category have a shared essence that defines them, even when their surface features differ. Study 2 extended these findings, demonstrating that lay people tend to essentialize categories the more a category is of the body (vs. the mind). In Study 3, we examined the links between emotion essentialism and the complexity of actual emotional experiences. In particular, we predicted and found that individuals who hold essentialist beliefs about emotions describe themselves as experiencing highly differentiated emotional experiences but do not show evidence of stronger emotional differentiation in their momentary ratings of experience in everyday life. Implications for the science of emotion are discussed. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  8. Recognizing spontaneous facial expressions of emotion in a small-scale society of Papua New Guinea.

    Science.gov (United States)

    Crivelli, Carlos; Russell, James A; Jarillo, Sergio; Fernández-Dols, José-Miguel

    2017-03-01

    We report 2 studies on how residents of Papua New Guinea interpret facial expressions produced spontaneously by other residents of Papua New Guinea. Members of a small-scale indigenous society, Trobrianders (Milne Bay Province; N = 32, 14 to 17 years) were shown 5 facial expressions spontaneously produced by members of another small-scale indigenous society, Fore (Eastern Highlands Province) that Ekman had photographed, labeled, and published in The Face of Man (1980), each as an expression of a basic emotion: happiness, sadness, anger, surprise, and disgust. Trobrianders were asked to use any word they wanted to describe how each person shown felt and to provide valence and arousal ratings. Other Trobrianders (N = 24, 12 to 14 years) were shown the same photographs but asked to choose their response from a short list. In both studies, agreement with Ekman's predicted labels was low: 0% to 16% and 13% to 38% of observers, respectively. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. Is the desire for amputation related to disturbed emotion processing? A multiple case study analysis in BIID.

    Science.gov (United States)

    Bottini, Gabriella; Brugger, Peter; Sedda, Anna

    2015-01-01

    Body integrity identity disorder (BIID) is characterized by the overwhelming desire to amputate one or more healthy limbs or to be paraplegic. Recently, a neurological explanation of this condition has been proposed, in part on the basis of findings that the insular cortex might present structural anomalies in these individuals. While these studies focused on body representation, much less is known about emotional processing. Importantly, emotional impairments have been found in psychiatric disorders, and a psychiatric etiology is still a valid alternative to purely neurological accounts of BIID. In this study, we explored, by means of a computerized experiment, facial emotion recognition and emotional responses to disgusting images in seven individuals with BIID, taking into account their clinical features and investigating in detail disgust processing, strongly linked to insular functioning. We demonstrate that BIID is not characterized by a general emotional impairment; rather, there is a selectively reduced disgust response to violations of the body envelope. Taken together, our results support the need to explore this condition under an interdisciplinary perspective, taking into account also emotional connotations and the social modulation of body representation.

  10. Psychopathic traits in adolescents and recognition of emotion in facial expressions

    Directory of Open Access Journals (Sweden)

    Silvio José Lemos Vasconcellos

    2014-12-01

    Full Text Available Recent studies have investigated the ability of adult psychopaths and children with psychopathic traits to identify specific facial expressions of emotion. Conclusive results have not yet been found regarding whether psychopathic traits are associated with a specific deficit in the ability to identify negative emotions such as fear and sadness. This study compared 20 adolescents with psychopathic traits and 21 adolescents without these traits in terms of their ability to recognize facial expressions of emotion, using facial stimuli presented for 200 ms, 500 ms, and 1 s exposures. Analyses indicated significant differences between the two groups' performances only for fear, and only when stimuli were displayed for 200 ms. This finding is consistent with findings from other studies in the field and suggests that controlling the duration of exposure to affective stimuli in future studies may help to clarify the mechanisms underlying the facial affect recognition deficits of individuals with psychopathic traits.

  11. Responding to Emotional Stress in Pediatric Hospitals: Results From a National Survey of Chief Nursing Officers.

    Science.gov (United States)

    Huetsch, Michael; Green, Jeremy

    2016-01-01

    The aim of this study was to identify leadership awareness of emotional stress and employee support efforts in pediatric hospitals. The current pediatric environment has seen increases in treatment intensity, care duration, and acuity of patients, resulting in an increased likelihood of being exposed to emotional events. A mail survey was sent to chief nursing officers at 87 pediatric hospitals. A total of 49 responses (56%) were received. Hospitals with fewer than 250 beds were significantly more likely to rate emotional stress as a large to very large problem, whereas ANCC Magnet® hospitals felt better about support efforts after patient deaths. The most commonly used support offerings focused on staff recovery after a traumatic event as opposed to training for prevention of emotional stress. Emotional stress is a well-recognized issue in pediatric hospitals with a comparatively large resource commitment. Further focus on caregiver prevention training and unit leadership recognition of stress may be needed.

  12. Clinical features and subjective/physiological responses to emotional stimuli in the presence of emotion dysregulation in attention-deficit hyperactivity disorder.

    Science.gov (United States)

    Taskiran, Candan; Karaismailoglu, Serkan; Cak Esen, Halime Tuna; Tuzun, Zeynep; Erdem, Aysen; Balkanci, Zeynep Dicle; Dolgun, Anil Barak; Cengel Kultur, Sadriye Ebru

    2018-05-01

    Emotion dysregulation (ED) has long been recognized in clinical descriptions of attention-deficit hyperactivity disorder (ADHD), but a renewed interest in ED has advanced research on the overlap between the two entities. Autonomic reactivity (AR) is a neurobiological correlate of emotion regulation; however, the association between ADHD and AR remains unclear. Our aim was to explore the clinical differences, AR, and subjective emotional responses to visual emotional stimuli in ADHD children with and without ED. School-aged ADHD children with (n = 28) and without (n = 20) ED, according to the definition of deficiency in emotional self-regulation (DESR), and healthy controls (n = 22) were interviewed by using the Schedule for Affective Disorders and Schizophrenia for School Aged Children-Present and Lifetime version (K-SADS-PL) to screen frequent psychopathologies for these ages. All subjects were evaluated with Child Behavior Checklist 6-18 (CBCL), the Strengths and Difficulties Questionnaire (SDQ), the McMaster Family Assessment Device (FAD), the School-Age Temperament Inventory (SATI), and Conners' Parent Rating Scale (CPRS-48), which were completed by parents. To evaluate emotional responses, the International Affective Picture System (IAPS) and the subjective and physiological responses (electrodermal activity and heart rate reactivity) to selected pictures were examined. Regarding clinically distinctive features, the ADHD+ED group differed from the ADHD-ED and the control groups in terms of having higher temperamental negative reactivity, more oppositional/conduct problems, and lower prosocial behaviors. In the AR measures, children in the ADHD+ED group rated unpleasant stimuli as more negative, but they still had lower heart rate reactivity (HRR) than the ADHD-ED and control groups; moreover, unlike the two other groups, the ADHD+ED group showed no differences in HRR between different emotional stimuli. The presented findings are unique in terms of their

  13. Tensor modes in pure natural inflation

    Science.gov (United States)

    Nomura, Yasunori; Yamazaki, Masahito

    2018-05-01

    We study tensor modes in pure natural inflation [1], a recently-proposed inflationary model in which an axionic inflaton couples to pure Yang-Mills gauge fields. We find that the tensor-to-scalar ratio r is naturally bounded from below. This bound originates from the finiteness of the number of metastable branches of vacua in pure Yang-Mills theories. Details of the model can be probed by future cosmic microwave background experiments and improved lattice gauge theory calculations of the θ-angle dependence of the vacuum energy.

  14. Method of producing vegetable puree

    DEFF Research Database (Denmark)

    2004-01-01

    A process for producing a vegetable puree, comprising the sequential steps of: a) crushing, chopping or slicing the vegetable into pieces of 1 to 30 mm; b) blanching the vegetable pieces at a temperature of 60 to 90°C; c) contacting the blanched vegetable pieces with a macerating enzyme activity; d) blending the macerated vegetable pieces and obtaining a puree.
  15. Are only Emotional Strengths Emotional? Character Strengths and Disposition to Positive Emotions.

    Science.gov (United States)

    Güsewell, Angelika; Ruch, Willibald

    2012-07-01

    This study aimed to examine the relations between character strengths and dispositional positive emotions (i.e. joy, contentment, pride, love, compassion, amusement, and awe). A sample of 574 German-speaking adults filled in the Dispositional Positive Emotion Scales (DPES; Shiota, Keltner, & John, 2006), and the Values in Action Inventory of Strengths (VIA-IS; Peterson, Park, & Seligman, 2005). The factorial structure of the DPES was examined on item level. Joy and contentment could not be clearly separated; the items of the other five emotions loaded on separate factors. A confirmatory factor analysis assuming two latent factors (self-oriented and object/situation specific) was computed on scale level. Results confirmed the existence of these factors, but also indicated that the seven emotions did not split up into two clearly separable families. Correlations between dispositional positive emotions and character strengths were positive and generally low to moderate; a few theoretically meaningful strengths-emotions pairs yielded coefficients>.40. Finally, the link between five character strengths factors (i.e. emotional strengths, interpersonal strengths, strengths of restraint, intellectual strengths, and theological strengths) and the emotional dispositions was examined. Each of the factors displayed a distinctive "emotional pattern"; emotional strengths evidenced the most numerous and strongest links to emotional dispositions. © 2012 The Authors. Applied Psychology: Health and Well-Being © 2012 The International Association of Applied Psychology.

  16. The Effectiveness of Emotional Intelligence Training on Communication Skills in Students with Intellectual Disabilities

    Directory of Open Access Journals (Sweden)

    Maryam Sheydaei

    2015-09-01

    Full Text Available Objectives: Emotional intelligence skills begin at home, with positive interactions with parents and other children. Parents can help children recognize their emotions, name them, and learn how to respect their feelings and adapt to social situations. The aim of the present study was to investigate the effectiveness of emotional intelligence training on the communication skills of students with intellectual disabilities. Methods: This study was quasi-experimental, with a pre-test, post-test design and a control group. The sample consisted of 32 educable students with intellectual disabilities (14-18 years old). Results: The results showed that the intervention program created a significant difference between the scores of the experimental and control groups (P<0.05), and that the scores for communication skills increased both at post-test and at follow-up in the experimental group (P<0.05). Discussion: Emotional intelligence training enhanced the communication skills of students with intellectual disabilities. Teachers, professionals, and clinicians could use this training in their practice.

  17. Recognizing teen depression

    Science.gov (United States)

    ... medlineplus.gov/ency/patientinstructions/000648.htm Recognizing teen depression ... life. Be Aware of the Risk for Teen Depression Your teen is more at risk for depression ...

  18. On the Assessment of Emotions and Emotional Competencies

    Directory of Open Access Journals (Sweden)

    Johnny J.R Fontaine

    2011-11-01

    Full Text Available The idea to devote a special issue on the Assessment of Emotional Functioning and Emotional Competence arose during the preparation of the 10th European Conference on Psychological Assessment that took place from the 16th until 19th September 2009 in Ghent. The conference theme was "The assessment of emotions and emotional competencies". Emotions have become a cross-cutting theme of research across theoretical and applied domains in psychology. The academic interest is especially voiced by scientific journals focusing on emotion, such as 'Motivation and Emotion', 'Cognition and Emotion', and more recently 'Emotion'. Moreover, there has been a long-standing interest in emotions in the applied domains, especially in clinical psychology.

  19. Emotional Self-Efficacy, Emotional Empathy and Emotional Approach Coping as Sources of Happiness

    OpenAIRE

    Tarık Totan; Tayfun Doğan; Fatma Sapmaz

    2013-01-01

    Among the many variables affecting happiness, there are those that arise from emotional factors. In this study, the hypothesis stating that happiness is affected by emotional self-efficacy, emotional empathy and emotional approach coping has been examined using the path model. A total of 334 university students participated in this study, 229 of whom were females and 105 being males. Oxford Happiness Questionnaire-Short Form, Emotional Self-efficacy Scale, Multi-Dimensional Emotional Empathy ...

  20. A system for tracking and recognizing pedestrian faces using a network of loosely coupled cameras

    Science.gov (United States)

    Gagnon, L.; Laliberté, F.; Foucher, S.; Branzan Albu, A.; Laurendeau, D.

    2006-05-01

    A face recognition module has been developed for an intelligent multi-camera video surveillance system. The module can recognize a pedestrian face in terms of six basic emotions and the neutral state. Face and facial features detection (eyes, nasal root, nose and mouth) are first performed using cascades of boosted classifiers. These features are used to normalize the pose and dimension of the face image. Gabor filters are then sampled on a regular grid covering the face image to build a facial feature vector that feeds a nearest neighbor classifier with a cosine distance similarity measure for facial expression interpretation and face model construction. A graphical user interface allows the user to adjust the module parameters.
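
    The pipeline above (boosted-cascade detection, Gabor responses sampled on a regular grid, nearest-neighbour matching under a cosine measure) can be approximated in a few lines of Python with OpenCV. The sketch below skips the pose normalization based on eye, nose and mouth locations, and the cascade file, grid size, and filter-bank parameters are assumptions rather than the module's actual settings.

        import cv2
        import numpy as np

        def gabor_bank(ksize=21):
            # Small, hypothetical bank: 4 orientations x 2 wavelengths
            kernels = []
            for theta in np.arange(0.0, np.pi, np.pi / 4):
                for lambd in (8.0, 12.0):
                    kernels.append(cv2.getGaborKernel((ksize, ksize), 4.0, theta, lambd, 0.5, 0))
            return kernels

        def face_vector(gray, face_box, kernels, size=64, grid=8):
            # Crop the detected face, normalize its size, filter with each Gabor kernel,
            # and sample the responses on a regular grid to build one feature vector.
            x, y, w, h = face_box
            face = cv2.resize(gray[y:y + h, x:x + w], (size, size)).astype(np.float32)
            step = size // grid
            samples = [cv2.filter2D(face, cv2.CV_32F, k)[step // 2::step, step // 2::step].ravel()
                       for k in kernels]
            return np.concatenate(samples)

        def nearest_expression(vec, gallery_vecs, gallery_labels):
            # Nearest-neighbour label under cosine similarity against labeled prototypes
            g = np.asarray(gallery_vecs, dtype=np.float32)
            sims = g @ vec / (np.linalg.norm(g, axis=1) * np.linalg.norm(vec) + 1e-9)
            return gallery_labels[int(np.argmax(sims))]

        # Hypothetical usage with OpenCV's stock frontal-face cascade:
        #   detector = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        #   faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        #   label = nearest_expression(face_vector(gray, faces[0], gabor_bank()),
        #                              gallery_vecs, gallery_labels)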

  1. ActionScript Developer's Guide to PureMVC

    CERN Document Server

    Hall, Cliff

    2011-01-01

    Gain hands-on experience with PureMVC, the popular open source framework for developing maintainable applications with a Model-View-Controller architecture. In this concise guide, PureMVC creator Cliff Hall teaches the fundamentals of PureMVC development by walking you through the construction of a complete non-trivial Adobe AIR application. Through clear explanations and numerous ActionScript code examples, you'll learn best practices for using the framework's classes in your day-to-day work. Discover how PureMVC enables you to focus on the purpose and scope of your application, while the f

  2. Updating schematic emotional facial expressions in working memory: Response bias and sensitivity.

    Science.gov (United States)

    Tamm, Gerly; Kreegipuu, Kairi; Harro, Jaanus; Cowan, Nelson

    2017-01-01

    It is unclear if positive, negative, or neutral emotional expressions have an advantage in short-term recognition. Moreover, it is unclear from previous studies of working memory for emotional faces whether effects of emotions comprise response bias or sensitivity. The aim of this study was to compare how schematic emotional expressions (sad, angry, scheming, happy, and neutral) are discriminated and recognized in an updating task (2-back recognition) in a representative sample of a birth cohort of young adults. Schematic facial expressions allow control of identity processing, which is separate from expression processing, and have been used extensively in attention research but not much, until now, in working memory research. We found that expressions with a U-curved mouth (i.e., upwardly curved), namely happy and scheming expressions, favoured a bias towards recognition (i.e., towards indicating that the probe and the stimulus in working memory are the same). Other effects of emotional expression were considerably smaller (1-2% of the variance explained) compared to the large proportion of variance that was explained by the physical similarity of the items being compared. We suggest that the nature of the stimuli plays a role in this. The present application of signal detection methodology with emotional, schematic faces in a working memory procedure requiring fast comparisons helps to resolve important contradictions that have emerged in the emotional perception literature. Copyright © 2016 Elsevier B.V. All rights reserved.
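
    Since the abstract distinguishes response bias from sensitivity, a conventional way to separate the two from hit and false-alarm counts in a 2-back recognition task is the equal-variance signal-detection model sketched below. The log-linear correction and the d'/criterion parameterization are standard choices and not necessarily the exact estimators used in the study.

        from scipy.stats import norm

        def sdt_indices(hits, misses, false_alarms, correct_rejections):
            # Log-linear correction keeps hit/false-alarm rates away from 0 and 1
            hr = (hits + 0.5) / (hits + misses + 1.0)
            far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
            d_prime = norm.ppf(hr) - norm.ppf(far)               # sensitivity
            criterion = -0.5 * (norm.ppf(hr) + norm.ppf(far))    # response bias (c < 0: liberal)
            return d_prime, criterion

        # Hypothetical example: 40 hits, 10 misses, 15 false alarms, 35 correct rejections
        #   d, c = sdt_indices(40, 10, 15, 35)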

  3. EFFECT OF INTEGRATED YOGA ON EMOTIONAL DIMENSIONS OF THE PARTICIPANTS IN SVYASA

    Directory of Open Access Journals (Sweden)

    Sindhu

    2016-11-01

    Full Text Available BACKGROUND: The skill to monitor one’s own and others’ thinking and actions is termed Emotional Intelligence (EI). [1] Psychological dimensions of EI are emotional sensitivity, emotional maturity and emotional competency, which motivate participants to recognize, interpret and handle the dynamics of their behavioral patterns. OBJECTIVE: To assess the effect of the Integrated Yoga Module (IYM) on emotional dimensions of the participants in SVYASA. MATERIALS AND METHODS: The study included 40 subjects between 20-60 years of age selected from the health home of Swami Vivekananda Yoga Anusandhana Samsthana (SVYASA) University, Bangalore, for IYM. The EQ test developed by Prof. N. K. Chadha, used to assess EI, was given to all subjects on admission to S-VYASA. All subjects participated in the IYM for a week. After one week of IYM, the same questionnaire was given to the participants. STATISTICAL ANALYSIS: Means, standard deviations and paired t-tests were used for analyzing the data with the help of SPSS 16. RESULTS: EQ analysis (n=40) showed a significant increase (P<0.05) in emotional quotient and maturity (r=0.403 and 0.341, respectively), with a significant decrease in sensitivity (r=0.482). Competency was also found to be increased, but the increase was not statistically significant. CONCLUSION: The present study suggests that IYM can result in improvement in the maturity and competency dimensions of EQ, aiding emotional balance and reasoning.
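
    For readers without SPSS, the paired comparison reported above can be reproduced in outline as follows; the sketch uses made-up pre/post EQ scores and is not the study's data or code.

```python
# Illustrative paired t-test on hypothetical pre/post Emotional Quotient scores.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
eq_pre = rng.normal(300, 25, size=40)           # placeholder baseline scores (n=40)
eq_post = eq_pre + rng.normal(8, 10, size=40)   # placeholder post-intervention scores

t_stat, p_value = ttest_rel(eq_post, eq_pre)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```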

  4. Long-term effects of child abuse and neglect on emotion processing in adulthood.

    Science.gov (United States)

    Young, Joanna Cahall; Widom, Cathy Spatz

    2014-08-01

    To determine whether child maltreatment has a long-term impact on emotion processing abilities in adulthood and whether IQ, psychopathology, or psychopathy mediate the relationship between childhood maltreatment and emotion processing in adulthood. Using a prospective cohort design, children (ages 0-11) with documented cases of abuse and neglect during 1967-1971 were matched with non-maltreated children and followed up into adulthood. Potential mediators (IQ, Post-Traumatic Stress [PTSD], Generalized Anxiety [GAD], Dysthymia, and Major Depressive [MDD] Disorders, and psychopathy) were assessed in young adulthood with standardized assessment techniques. In middle adulthood (Mage=47), the International Affective Picture System was used to measure emotion processing. Structural equation modeling was used to test mediation models. Individuals with a history of childhood maltreatment were less accurate in emotion processing overall and in processing positive and neutral pictures than matched controls. Childhood physical abuse predicted less accuracy in neutral pictures and childhood sexual abuse and neglect predicted less accuracy in recognizing positive pictures. MDD, GAD, and IQ predicted overall picture recognition accuracy. However, of the mediators examined, only IQ acted to mediate the relationship between child maltreatment and emotion processing deficits. Although research has focused on emotion processing in maltreated children, these new findings show an impact child abuse and neglect on emotion processing in middle adulthood. Research and interventions aimed at improving emotional processing deficiencies in abused and neglected children should consider the role of IQ. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Neural activation to emotional faces in adolescents with autism spectrum disorders.

    Science.gov (United States)

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S

    2011-03-01

    Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and striatum, three structures involved in socio-emotional processing in adolescents with ASD. Twenty-two adolescents with ASD and 20 healthy adolescents viewed facial expressions (happy, fearful, sad and neutral) that were briefly presented (250 ms) during functional MRI acquisition. To monitor attention, subjects pressed a button to identify the gender of each face. The ASD group showed greater activation to the faces relative to the control group in the amygdala, vPFC and striatum. Follow-up analyses indicated that the ASD relative to control group showed greater activation in the amygdala, vPFC and striatum (p gender identification task. When group differences in attention to facial expressions were limited, adolescents with ASD showed greater activation in structures involved in socio-emotional processing. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  6. Gender differences in the relationship between social communication and emotion recognition.

    Science.gov (United States)

    Kothari, Radha; Skuse, David; Wakefield, Justin; Micali, Nadia

    2013-11-01

    To investigate the association between autistic traits and emotion recognition in a large community sample of children using facial and social motion cues, additionally stratifying by gender. A general population sample of 3,666 children from the Avon Longitudinal Study of Parents and Children (ALSPAC) were assessed on their ability to correctly recognize emotions using the faces subtest of the Diagnostic Analysis of Non-Verbal Accuracy, and the Emotional Triangles Task, a novel test assessing recognition of emotion from social motion cues. Children with autistic-like social communication difficulties, as assessed by the Social Communication Disorders Checklist, were compared with children without such difficulties. Autistic-like social communication difficulties were associated with poorer recognition of emotion from social motion cues in both genders, but were associated with poorer facial emotion recognition in boys only (odds ratio = 1.9, 95% CI = 1.4, 2.6, p = .0001). This finding must be considered in light of lower power to detect differences in girls. In this community sample of children, greater deficits in social communication skills are associated with poorer discrimination of emotions, implying there may be an underlying continuum of liability to the association between these characteristics. As a similar degree of association was observed in both genders on a novel test of social motion cues, the relatively good performance of girls on the more familiar task of facial emotion discrimination may be due to compensatory mechanisms. Our study might indicate the existence of a cognitive process by which girls with underlying autistic traits can compensate for their covert deficits in emotion recognition, although this would require further investigation. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  7. Un-earthing emotions through art: facilitating reflective practice with poetry and photographic imagery.

    Science.gov (United States)

    Lapum, Jennifer; Yau, Terrence; Church, Kathryn; Ruttonsha, Perin; David, Alison Matthews

    2015-06-01

    In this article, we comment upon and provide an arts-informed example of an emotive-focused reflection of a health care practitioner. Specifically, we use poetry and photographic imagery as tools to un-earth practitioners' emotions within agonizing and traumatic clinical encounters. In order to recognize one's own humanness and authentically engage in the art of medicine, we immerse ourselves in the first author's poetic and photographic self-reflection. The poem and image are intended to inspire interpretation and meaning based on the reader's own professional and/or personal context. The last line of the poem is "I take off the gloves. My hands are marked."

  8. The Process Model of Group-Based Emotion: Integrating Intergroup Emotion and Emotion Regulation Perspectives.

    Science.gov (United States)

    Goldenberg, Amit; Halperin, Eran; van Zomeren, Martijn; Gross, James J

    2016-05-01

    Scholars interested in emotion regulation have documented the different goals and strategies individuals have for regulating their emotions. However, little attention has been paid to the regulation of group-based emotions, which are based on individuals' self-categorization as a group member and occur in response to situations perceived as relevant for that group. We propose a model for examining group-based emotion regulation that integrates intergroup emotions theory and the process model of emotion regulation. This synergy expands intergroup emotion theory by facilitating further investigation of different goals (i.e., hedonic or instrumental) and strategies (e.g., situation selection and modification strategies) used to regulate group-based emotions. It also expands emotion regulation research by emphasizing the role of self-categorization (e.g., as an individual or a group member) in the emotional process. Finally, we discuss the promise of this theoretical synergy and suggest several directions for future research on group-based emotion regulation. © 2015 by the Society for Personality and Social Psychology, Inc.

  9. Sex-specific associations between peripheral oxytocin and emotion perception in schizophrenia.

    Science.gov (United States)

    Rubin, Leah H; Carter, C Sue; Drogos, Lauren; Jamadar, Rhoda; Pournajafi-Nazarloo, Hossein; Sweeney, John A; Maki, Pauline M

    2011-08-01

    We previously reported that higher levels of peripheral oxytocin are associated with lower levels of positive, general, and overall symptoms in women but not in men with schizophrenia. Here we investigate the influence of sex, sex steroid hormone fluctuations, and peripheral oxytocin levels on emotional processing in men and women with schizophrenia. Twenty-two women with schizophrenia and 31 female controls completed the Penn Emotion Acuity Test (PEAT), a facial emotion recognition and perception task, during two menstrual cycle phases: 1) early follicular (Days 2-4; low estrogen/progesterone) and 2) midluteal (Days 20-22; high estrogen/progesterone). Twenty-six males with schizophrenia and 26 male controls completed testing at comparable intervals. We obtained plasma hormone assays of estrogen, progesterone, testosterone, and oxytocin. No sex differences were noted on the PEAT. Plasma oxytocin levels did not fluctuate across phases of the menstrual cycle. However, female patients and controls more accurately identified facial emotions during the early follicular versus midluteal phase (pmen. Like healthy women, women with schizophrenia demonstrate menstrual-cycle dependent fluctuations in recognizing emotional cues. Like healthy women, female patients with higher levels of oxytocin perceived faces as happier. Future studies need to address whether this sex-specific relationship is associated with trust and other positive emotions, and whether exogenous oxytocin might enhance mood states and social interaction in female or all schizophrenia patients. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. Music to my ears: Age-related decline in musical and facial emotion recognition.

    Science.gov (United States)

    Sutcliffe, Ryan; Rendell, Peter G; Henry, Julie D; Bailey, Phoebe E; Ruffman, Ted

    2017-12-01

    We investigated young-old differences in emotion recognition using music and face stimuli and tested explanatory hypotheses regarding older adults' typically worse emotion recognition. In Experiment 1, young and older adults labeled emotions in an established set of faces, and in classical piano stimuli that we pilot-tested on other young and older adults. Older adults were worse at detecting anger, sadness, fear, and happiness in music. Performance on the music and face emotion tasks was not correlated for either age group. Because musical expressions of fear were not equated for age groups in the pilot study of Experiment 1, we conducted a second experiment in which we created a novel set of music stimuli that included more accessible musical styles, and which we again pilot-tested on young and older adults. In this pilot study, all musical emotions were identified similarly by young and older adults. In Experiment 2, participants also made age estimations in another set of faces to examine whether potential relations between the face and music emotion tasks would be shared with the age estimation task. Older adults did worse in each of the tasks, and had specific difficulty recognizing happy, sad, peaceful, angry, and fearful music clips. Older adults' difficulties in each of the 3 tasks (music emotion, face emotion, and face age) were not correlated with each other. General cognitive decline did not appear to explain our results as increasing age predicted emotion performance even after fluid IQ was controlled for within the older adult group. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  11. Immediacy Bias in Emotion Perception: Current Emotions Seem More Intense than Previous Emotions

    Science.gov (United States)

    Van Boven, Leaf; White, Katherine; Huber, Michaela

    2009-01-01

    People tend to perceive immediate emotions as more intense than previous emotions. This "immediacy bias" in emotion perception occurred for exposure to emotional but not neutral stimuli (Study 1), when emotional stimuli were separated by both shorter (2 s; Studies 1 and 2) and longer (20 min; Studies 3, 4, and 5) delays, and for emotional…

  12. Expression of emotion in the kinematics of locomotion.

    Science.gov (United States)

    Barliya, Avi; Omlor, Lars; Giese, Martin A; Berthoz, Alain; Flash, Tamar

    2013-03-01

    Here, we examine how different emotions (happiness, fear, sadness and anger) affect the kinematics of locomotion. We focus on a compact representation of locomotion properties using the intersegmental law of coordination (Borghese et al. in J Physiol 494(3):863-879, 1996), which states that, during the gait cycle of human locomotion, the elevation angles of the thigh, shank and foot do not evolve independently of each other but form a planar pattern of co-variation. This phenomenon is highly robust and has been extensively studied. The orientation of the plane has been correlated with changes in the speed of locomotion and with reduction in energy expenditure as speed increases. An analytical model explaining the conditions underlying the emergence of this plane and predicting its orientation reveals that it suffices to examine the amplitudes of the elevation angles of the different segments along with the phase shifts between them (Barliya et al. in Exp Brain Res 193:371-385, 2009). We thus investigated the influence of different emotions on the parameters directly determining the orientation of the intersegmental plane and on the angular rotation profiles of the leg segments, examining both the effect of changes in walking speed and effects independent of speed. Subjects were professional actors and naïve subjects with no training in acting. As expected, emotions were found to strongly affect the kinematics of locomotion, particularly walking speed. The intersegmental coordination patterns revealed that emotional expression caused additional modifications to the locomotion patterns that could not be explained solely by a change in speed. For all emotions except sadness, the amplitude of thigh elevation angles changed from those in neutral locomotion. The intersegmental plane was also differently oriented, especially during anger. We suggest that, while speed is the dominant variable allowing discrimination between different emotional gaits, emotion can be
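
    The planar covariation described above is usually quantified by a principal-component analysis of the three elevation-angle time series; the sketch below shows that computation in outline (it is not the authors' code, and the array names are placeholders).

```python
# Hedged sketch: planarity and orientation of the intersegmental covariation plane.
import numpy as np

def covariation_plane(thigh, shank, foot):
    """Each argument: 1-D array of elevation angles over one gait cycle (degrees)."""
    angles = np.column_stack([thigh, shank, foot])
    angles = angles - angles.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(angles, rowvar=False))  # ascending order
    planarity = 1.0 - eigvals[0] / eigvals.sum()   # share of variance lying in the plane
    plane_normal = eigvecs[:, 0]                   # unit vector orthogonal to the plane
    return planarity, plane_normal
```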

  13. Pure apraxia of speech due to infarct in premotor cortex.

    Science.gov (United States)

    Patira, Riddhi; Ciniglia, Lauren; Calvert, Timothy; Altschuler, Eric L

    Apraxia of speech (AOS) is now recognized as an articulation disorder distinct from dysarthria and aphasia. Various lesions have been associated with AOS in studies that are limited in precise localization due to variability in size and type of pathology. We present a case of pure AOS in the setting of an acute stroke, allowing more precise localization than previously possible of the brain area responsible for AOS: the dorsal premotor cortex (dPMC). The dPMC is in a unique position to plan and coordinate speech production by virtue of its connections with the nearby motor cortex harboring the corticobulbar tract, the supplementary motor area, the inferior frontal operculum, and the temporo-parietal area via the dorsal stream of the dual-stream model of speech processing. The role of the dPMC is further supported by its place in the dorsal stream of the dual-stream model of speech processing, as well as by its role as a controller in the hierarchical state feedback control model. Copyright © 2017 Polish Neurological Society. Published by Elsevier Urban & Partner Sp. z o.o. All rights reserved.

  14. Impact of civil war on emotion recognition: the denial of sadness in Sierra Leone.

    Science.gov (United States)

    Umiltà, Maria Allessandra; Wood, Rachel; Loffredo, Francesca; Ravera, Roberto; Gallese, Vittorio

    2013-01-01

    Studies of children with atypical emotional experience demonstrate that childhood exposure to high levels of hostility and threat biases emotion perception. This study investigates emotion processing in former child soldiers and non-combatant civilians. All participants have experienced prolonged violence exposure during childhood. The study, carried out in Sierra Leone, aimed to examine the effects of exposure to and forced participation in acts of extreme violence on the emotion processing of young adult war survivors. A total of 76 young, male adults (38 former child soldier survivors and 38 civilian survivors) were tested in order to assess participants' ability to identify four different facial emotion expressions from photographs and movies. Both groups were able to recognize facial expressions of emotion. However, despite their general ability to correctly identify facial emotions, participants showed a significant response bias in their recognition of sadness. Both former soldiers and civilians made more errors in identifying expressions of sadness than in the other three emotions, and when mislabeling sadness participants most often described it as anger. Conversely, when making erroneous identifications of other emotions, participants were most likely to label the expressed emotion as sadness. In addition, while for three of the four emotions participants were better able to make a correct identification the greater the intensity of the expression, this pattern was not observed for sadness. During the presentation of movies, the recognition of sadness was significantly worse for the soldiers. While both former child soldiers and civilians were found to be able to identify facial emotions, a significant response bias in their attribution of negative emotions was observed. Such bias was particularly pronounced in former child soldiers. These findings point to a pervasive long-lasting effect of childhood exposure to violence on emotion processing in later life.

  15. Impact of civil war on emotion recognition: the denial of sadness in Sierra Leone.

    Directory of Open Access Journals (Sweden)

    Maria Alessandra Umiltà

    2013-09-01

    Full Text Available Studies of children with atypical emotional experience demonstrate that childhood exposure to high levels of hostility and threat biases emotion perception. This study investigates emotion processing in former child soldiers and non-combatant civilians. All participants have experienced prolonged violence exposure during childhood. The study, carried out in Sierra Leone, aimed to examine the effects of exposure to and forced participation in acts of extreme violence on the emotion processing of young adult war survivors. A total of 76 young, male adults (38 former child soldier survivors and 38 civilian survivors) were tested in order to assess participants’ ability to identify four different facial emotion expressions from photographs and movies. Both groups were able to recognize facial expressions of emotion. However, despite their general ability to correctly identify facial emotions, participants showed a significant response bias in their recognition of sadness. Both former soldiers and civilians made more errors in identifying expressions of sadness than in the other three emotions, and when mislabeling sadness participants most often described it as anger. Conversely, when making erroneous identifications of other emotions, participants were most likely to label the expressed emotion as sadness. In addition, while for three of the four emotions participants were better able to make a correct identification the greater the intensity of the expression, this pattern was not observed for sadness. During the presentation of movies, the recognition of sadness was significantly worse for the soldiers. While both former child soldiers and civilians were found to be able to identify facial emotions, a significant response bias in their attribution of negative emotions was observed. Such bias was particularly pronounced in former child soldiers. These findings point to a pervasive long-lasting effect of childhood exposure to violence on emotion processing

  16. Fictional Emotions within Emotion Driven Design

    DEFF Research Database (Denmark)

    Knutz, Eva

    2012-01-01

    The aim of this paper is to address imaginative experiences of emotions by drawing on Kendall Walton’s theory of make-believe. Moreover, we use a design case as a means of investigating how a child’s felt emotions towards a hospital situation relate to his or her imaginative experiences of emotions... towards a fictive character in a computer game simulating the real-world situation. In so doing, we contribute with new insights to existing theories of emotions in design, which tend to focus narrowly on felt and measurable emotions....

  17. Using King's interacting systems theory to link emotional intelligence and nursing practice.

    Science.gov (United States)

    Shanta, Linda L; Connolly, Maria

    2013-01-01

    King's theory is a broad theory designed to provide a framework for nursing (I.M. King, 1981), whereas emotional intelligence (EI; J.D. Mayer & P. Salovey, 2004) is a theory that is specific for addressing potential competency in dealing with emotions and emotional information. J.D. Mayer, P. Salovey, D.R. Caruso, and G. Sitarenios (2001) defined EI as the "ability to recognize the meaning of emotions and their relationships and to use them as a basis for reasoning and problem solving" (p. 234). These researchers believed that EI is related to cognitive intellect through the ability to use reasoning by way of information to find meaning. J.D. Mayer and P. Salovey (2004) argued that the skills that comprise EI were likely enhanced through obtaining a liberal education infused with values exploration. J.D. Mayer, P. Salovey, D.R. Caruso, and G. Sitarenios (2001) contended that there are 4 branches of abilities that create EI: (a) the skill of perceiving emotion within oneself and others, (b) assimilation of an emotion to facilitate thinking, (c) understanding and knowledge of emotion, and (d) conscious regulation of emotion. Each level or branch builds upon the previous one, and awareness of what each branch offers the individual in enhancing relationships with others is a key component of healthy emotional interactions. This article will provide a theoretic foundation based upon King's interacting systems theory (IST; 1981) that embraces EI as a crucial component in the nurse's ability to provide holistic care for patients, peers, and themselves. King's IST underscores the necessity of nurses possessing abilities of EI as they care for others but does not fully describe a mechanism to understand and incorporate emotions within the complex nurse-patient interactions and communications that are part of the nursing process. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Sex differences in effective fronto-limbic connectivity during negative emotion processing.

    Science.gov (United States)

    Lungu, Ovidiu; Potvin, Stéphane; Tikàsz, Andràs; Mendrek, Adrianna

    2015-12-01

    In view of the greater prevalence of depression and anxiety disorders in women than in men, functional magnetic resonance imaging (fMRI) studies have examined sex-differences in brain activations during emotion processing. Comparatively, sex-differences in brain connectivity have received little attention, despite evidence for important fronto-limbic connections during emotion processing across sexes. Here, we investigated sex-differences in fronto-limbic connectivity during negative emotion processing. Forty-six healthy individuals (25 women, 21 men) viewed negative, positive and neutral images during an fMRI session. Effective connectivity between significantly activated regions was examined using Granger causality and psychophysical interaction analyses. Sex steroid hormones and feminine-masculine traits were also measured. Subjective ratings of negative emotional images were higher in women than in men. Across sexes, significant activations were observed in the dorso-medial prefrontal cortex (dmPFC) and the right amygdala. Granger connectivity from right amygdala was significantly greater than that from dmPFC during the 'high negative' condition, an effect driven by men. The magnitude of this effect correlated negatively with highly negative image ratings and feminine traits and positively with testosterone levels. These results highlight critical sex differences in brain connectivity during negative emotion processing and point to the fact that both biological (sex steroid hormones) and psychosocial (gender role and identity) variables contribute to them. As the dmPFC is involved in social cognition and action planning, and the amygdala in threat detection, the connectivity results suggest that compared to women, men have a more evaluative, rather than purely affective, brain response during negative emotion processing. Copyright © 2015 Elsevier Ltd. All rights reserved.
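
    For orientation, the directional (Granger) part of such an analysis can be sketched as below with two region-of-interest time series; the signals here are synthetic placeholders and the snippet is not the study's fMRI pipeline.

```python
# Hedged sketch of directional Granger tests between two ROI time series.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
amygdala = rng.standard_normal(200)                             # placeholder ROI signal
dmpfc = np.roll(amygdala, 2) + 0.5 * rng.standard_normal(200)   # dmPFC lags amygdala here

# grangercausalitytests checks whether the 2nd column Granger-causes the 1st.
amy_to_dmpfc = grangercausalitytests(np.column_stack([dmpfc, amygdala]), maxlag=3)
dmpfc_to_amy = grangercausalitytests(np.column_stack([amygdala, dmpfc]), maxlag=3)

f_stat, p_val, _, _ = amy_to_dmpfc[1][0]["ssr_ftest"]           # lag-1 F test, one direction
print(f"amygdala -> dmPFC (lag 1): F = {f_stat:.2f}, p = {p_val:.4g}")
```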

  19. Recognition of Facial Expressions of Mixed Emotions in School-Age Children Exposed to Terrorism

    Science.gov (United States)

    Scrimin, Sara; Moscardino, Ughetta; Capello, Fabia; Altoe, Gianmarco; Axia, Giovanna

    2009-01-01

    This exploratory study aims at investigating the effects of terrorism on children's ability to recognize emotions. A sample of 101 exposed and 102 nonexposed children (mean age = 11 years), balanced for age and gender, were assessed 20 months after a terrorist attack in Beslan, Russia. Two trials controlled for children's ability to match a facial…

  20. Difficulty identifying feelings and automatic activation in the fusiform gyrus in response to facial emotion.

    Science.gov (United States)

    Eichmann, Mischa; Kugel, Harald; Suslow, Thomas

    2008-12-01

    Difficulties in identifying and differentiating one's emotions are a central characteristic of alexithymia. In the present study, automatic activation of the fusiform gyrus to facial emotion was investigated as a function of alexithymia as assessed by the 20-item Toronto Alexithymia Scale. During 3 Tesla fMRI scanning, pictures of faces bearing sad, happy, and neutral expressions masked by neutral faces were presented to 22 healthy adults who also responded to the Toronto Alexithymia Scale. The fusiform gyrus was selected as the region of interest, and voxel values of this region were extracted, summarized as means, and tested among the different conditions (sad, happy, and neutral faces). Masked sad facial emotions were associated with greater bilateral activation of the fusiform gyrus than masked neutral faces. The subscale, Difficulty Identifying Feelings, was negatively correlated with the neural response of the fusiform gyrus to masked sad faces. The correlation results suggest that automatic hyporesponsiveness of the fusiform gyrus to negative emotion stimuli may reflect problems in recognizing one's emotions in everyday life.
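
    The region-of-interest step described above amounts to averaging each subject's contrast map within a fusiform mask and correlating those means with the TAS-20 subscale; a minimal sketch follows (the array names and shapes are assumptions, not the original analysis pipeline).

```python
# Hedged sketch: ROI mean extraction and correlation with a questionnaire subscale.
import numpy as np
from scipy.stats import pearsonr

def roi_means(contrast_maps, roi_mask):
    """contrast_maps: (n_subjects, x, y, z) masked-sad-vs-neutral maps;
    roi_mask: boolean (x, y, z) mask of the fusiform gyrus."""
    return np.array([subject_map[roi_mask].mean() for subject_map in contrast_maps])

# r, p = pearsonr(roi_means(maps, fusiform_mask), tas20_dif_scores)  # expected negative r
```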

  1. Recognizing Facial Expressions Automatically from Video

    Science.gov (United States)

    Shan, Caifeng; Braspenning, Ralph

    Facial expressions, resulting from movements of the facial muscles, are changes in the face in response to a person's internal emotional states, intentions, or social communications. There is a considerable history associated with the study of facial expressions. Darwin [22], who argued that all mammals show emotions reliably in their faces, was the first to describe in detail the specific facial expressions associated with emotions in animals and humans. Since then, facial expression analysis has been an area of great research interest for behavioral scientists [27]. Psychological studies [48, 3] suggest that facial expressions, as the main mode for nonverbal communication, play a vital role in human face-to-face communication. For illustration, we show some examples of facial expressions in Fig. 1.

  2. Self-control in Online Discussions: Disinhibited Online Behavior as a Failure to Recognize Social Cues.

    Science.gov (United States)

    Voggeser, Birgit J; Singh, Ranjit K; Göritz, Anja S

    2017-01-01

    In an online experiment we examined the role of self-control in recognizing social cues in the context of disinhibited online behavior (e.g., flaming and trolling). We temporarily lowered participants' self-control capacity with an ego depletion paradigm (i.e., color Stroop task). Next, we measured participants' sensitivity to social cues with an emotional Stroop task containing neutral, negative, and taboo words. Sensitivity to social cues is represented by the increase in reaction time to negative and especially taboo words compared to neutral words. As expected, undepleted participants were slower to process the color of negative and taboo words. By contrast, depleted participants (i.e., those with lowered self-control capacity) did not react differently to taboo or negative words than they did to neutral words. The experiment illustrates that self-control failure may manifest itself in a failure to recognize social cues. The finding underlines the importance of self-control in understanding disinhibited online behavior: Many instances of disinhibited online behavior may occur not because people are unable to control themselves, but because they do not realize that a situation calls for self-control in the first place.

  3. Self-control in Online Discussions: Disinhibited Online Behavior as a Failure to Recognize Social Cues

    Directory of Open Access Journals (Sweden)

    Birgit J. Voggeser

    2018-01-01

    Full Text Available In an online experiment we examined the role of self-control in recognizing social cues in the context of disinhibited online behavior (e.g., flaming and trolling). We temporarily lowered participants' self-control capacity with an ego depletion paradigm (i.e., color Stroop task). Next, we measured participants' sensitivity to social cues with an emotional Stroop task containing neutral, negative, and taboo words. Sensitivity to social cues is represented by the increase in reaction time to negative and especially taboo words compared to neutral words. As expected, undepleted participants were slower to process the color of negative and taboo words. By contrast, depleted participants (i.e., those with lowered self-control capacity) did not react differently to taboo or negative words than they did to neutral words. The experiment illustrates that self-control failure may manifest itself in a failure to recognize social cues. The finding underlines the importance of self-control in understanding disinhibited online behavior: Many instances of disinhibited online behavior may occur not because people are unable to control themselves, but because they do not realize that a situation calls for self-control in the first place.

  4. Show me how you walk and I tell you how you feel - a functional near-infrared spectroscopy study on emotion perception based on human gait.

    Science.gov (United States)

    Schneider, Sabrina; Christensen, Andrea; Häußinger, Florian B; Fallgatter, Andreas J; Giese, Martin A; Ehlis, Ann-Christine

    2014-01-15

    The ability to recognize and adequately interpret emotional states in others plays a fundamental role in regulating social interaction. Body language presents an essential element of nonverbal communication which is often perceived prior to mimic expression. However, the neural networks that underlie the processing of emotionally expressive body movement and body posture are poorly understood. 33 healthy subjects have been investigated using the optically based imaging method functional near-infrared spectroscopy (fNIRS) during the performance of a newly developed emotion discrimination paradigm consisting of faceless avatars expressing fearful, angry, sad, happy or neutral gait patterns. Participants were instructed to judge (a) the presented emotional state (emotion task) and (b) the observed walking speed of the respective avatar (speed task). We measured increases in cortical oxygenated haemoglobin (O2HB) in response to visual stimulation during emotion discrimination. These O2HB concentration changes were enhanced for negative emotions in contrast to neutral gait sequences in right occipito-temporal and left temporal and temporo-parietal brain regions. Moreover, fearful and angry bodies elicited higher activation increases during the emotion task compared to the speed task. Haemodynamic responses were correlated with a number of behavioural measures, whereby a positive relationship between emotion regulation strategy preference and O2HB concentration increases after sad walks was mediated by the ability to accurately categorize sad walks. Our results support the idea of a distributed brain network involved in the recognition of bodily emotion expression that comprises visual association areas as well as body/movement perception specific cortical regions that are also sensitive to emotion. This network is activated less when the emotion is not intentionally processed (i.e. during the speed task). Furthermore, activity of this perceptive network is, mediated by

  5. Theory of mind as a mediator of reasoning and facial emotion recognition: findings from 200 healthy people.

    Science.gov (United States)

    Lee, Seul Bee; Koo, Se Jun; Song, Yun Young; Lee, Mi Kyung; Jeong, Yu-Jin; Kwon, Catherine; Park, Kyoung Ri; Park, Jin Young; Kang, Jee In; Lee, Eun; An, Suk Kyoon

    2014-04-01

    It was proposed that the ability to recognize facial emotions is closely related to complex neurocognitive processes and/or skills related to theory of mind (ToM). This study examines whether ToM skills mediate the relationship between higher neurocognitive functions, such as reasoning ability, and facial emotion recognition. A total of 200 healthy subjects (101 males, 99 females) were recruited. Facial emotion recognition was measured through the use of 64 facial emotional stimuli that were selected from photographs from the Korean Facial Expressions of Emotion (KOFEE). Participants were requested to complete the Theory of Mind Picture Stories task and Standard Progressive Matrices (SPM). Multiple regression analysis showed that the SPM score (t=3.19, p=0.002, β=0.22) and the overall ToM score (t=2.56, p=0.011, β=0.18) were primarily associated with a total hit rate (%) of the emotion recognition task. Hierarchical regression analysis through a three-step mediation model showed that ToM may partially mediate the relationship between SPM and performance on facial emotion recognition. These findings imply that higher neurocognitive functioning, inclusive of reasoning, may not only directly contribute towards facial emotion recognition but also influence ToM, which in turn, influences facial emotion recognition. These findings are particularly true for healthy young people.
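
    The three-step mediation logic referred to above (reasoning -> ToM -> emotion recognition) can be outlined as below; the regressions are a generic Baron-and-Kenny-style sketch with assumed variable names, not the study's code.

```python
# Hedged sketch of a three-step mediation check with ordinary least squares.
import numpy as np
import statsmodels.api as sm

def three_step_mediation(spm, tom, emotion_hit_rate):
    X = sm.add_constant(spm)
    step1 = sm.OLS(emotion_hit_rate, X).fit()   # total effect c: SPM -> emotion recognition
    step2 = sm.OLS(tom, X).fit()                # path a: SPM -> ToM
    X2 = sm.add_constant(np.column_stack([spm, tom]))
    step3 = sm.OLS(emotion_hit_rate, X2).fit()  # paths c' (SPM) and b (ToM), jointly
    return step1.params[1], step2.params[1], step3.params[1], step3.params[2]

# Partial mediation is suggested when c' shrinks relative to c while b stays reliable.
```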

  6. Darwin revisited: The vagus nerve is a causal element in controlling recognition of other's emotions.

    Science.gov (United States)

    Colzato, Lorenza S; Sellaro, Roberta; Beste, Christian

    2017-07-01

    Charles Darwin proposed that via the vagus nerve, the tenth cranial nerve, emotional facial expressions are evolved, adaptive and serve a crucial communicative function. In line with this idea, the later-developed polyvagal theory assumes that the vagus nerve is the key phylogenetic substrate that regulates emotional and social behavior. The polyvagal theory assumes that optimal social interaction, which includes the recognition of emotion in faces, is modulated by the vagus nerve. So far, in humans, it has not yet been demonstrated that the vagus plays a causal role in emotion recognition. To investigate this we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that modulates brain activity via bottom-up mechanisms. A sham/placebo-controlled, randomized cross-over within-subjects design was used to infer a causal relation between the stimulated vagus nerve and the related ability to recognize emotions as indexed by the Reading the Mind in the Eyes Test in 38 healthy young volunteers. Active tVNS, compared to sham stimulation, enhanced emotion recognition for easy items, suggesting that it promoted the ability to decode salient social cues. Our results confirm that the vagus nerve is causally involved in emotion recognition, supporting Darwin's argumentation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. COMPUTER-AIDED PSYCHOTHERAPY BASED ON MULTIMODAL ELICITATION, ESTIMATION AND REGULATION OF EMOTION

    OpenAIRE

    Ćosić, Krešimir; Popović, Siniša; Horvat, Marko; Kukolja, Davor; Dropuljić, Branimir; Kovač, Bernard; Jakovljević, Miro

    2013-01-01

    Contemporary psychiatry is looking at affective sciences to understand human behavior, cognition and the mind in health and disease. Since it has been recognized that emotions have a pivotal role for the human mind, an ever increasing number of laboratories and research centers are interested in affective sciences, affective neuroscience, affective psychology and affective psychopathology. Therefore, this paper presents multidisciplinary research results of Laboratory for Interactive...

  8. Recognition of emotional facial expressions and broad autism phenotype in parents of children diagnosed with autistic spectrum disorder.

    Science.gov (United States)

    Kadak, Muhammed Tayyib; Demirel, Omer Faruk; Yavuz, Mesut; Demir, Türkay

    2014-07-01

    Research findings are mixed regarding the features of the broad autism phenotype. In this study, we tested whether parents of children with autism have problems recognizing emotional facial expressions, and the contribution of such an impairment to the broad phenotype of autism. Seventy-two parents of children with autistic spectrum disorder and 38 control parents participated in the study. Broad autism features were measured with the Autism Quotient (AQ). Recognition of emotional facial expressions was assessed with the Emotion Recognition Test, consisting of a set of photographs from Ekman & Friesen's series. In a two-tailed analysis of variance of AQ, there was a significant difference for social skills (F(1, 106)=6.095; p<.05). Analyses of variance revealed significant differences in the recognition of happy, surprised and neutral expressions (F(1, 106)=4.068, p=.046; F(1, 106)=4.068, p=.046; F(1, 106)=6.064, p=.016). According to our findings, social impairment could be considered a characteristic feature of BAP. ASD parents had difficulty recognizing neutral expressions, suggesting that ASD parents may have impaired recognition of ambiguous expressions, as do autistic children. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Positive emotion impedes emotional but not cognitive conflict processing.

    Science.gov (United States)

    Zinchenko, Artyom; Obermeier, Christian; Kanske, Philipp; Schröger, Erich; Kotz, Sonja A

    2017-06-01

    Cognitive control enables successful goal-directed behavior by resolving a conflict between opposing action tendencies, while emotional control arises as a consequence of emotional conflict processing such as in irony. While negative emotion facilitates both cognitive and emotional conflict processing, it is unclear how emotional conflict processing is affected by positive emotion (e.g., humor). In 2 EEG experiments, we investigated the role of positive audiovisual target stimuli in cognitive and emotional conflict processing. Participants categorized either spoken vowels (cognitive task) or their emotional valence (emotional task) and ignored the visual stimulus dimension. Behaviorally, a positive target showed no influence on cognitive conflict processing, but impeded emotional conflict processing. In the emotional task, response time conflict costs were higher for positive than for neutral targets. In the EEG, we observed an interaction of emotion by congruence in the P200 and N200 ERP components in emotional but not in cognitive conflict processing. In the emotional conflict task, the P200 and N200 conflict effect was larger for emotional than neutral targets. Thus, our results show that emotion affects conflict processing differently as a function of conflict type and emotional valence. This suggests that there are conflict- and valence-specific mechanisms modulating executive control.

  10. THE COGNITIVE MATRIX OF EMOTIONAL-COMMUNICATIVE PERSONALITY

    Directory of Open Access Journals (Sweden)

    В И Шаховский

    2018-12-01

    Full Text Available The purpose of the article is to show the development of scientific thought that leads to the origin and definition of the concept of “language personality”. Attention is drawn to the fact that up to the 1970s emotions had been completely excluded from the scope of linguistic attention. With the advent of anthropocentric linguistics, emotions were recognized as a focal point of the human being, but linguists’ attention was still attracted merely to the language of homo loquens / sentiens; the emotional component was missing. Therefore, the objectives of the article are as follows: (1) to present and discuss the development of the Language Personality Theory; (2) to prove the necessity of including the emotive component in the concept of the language personality structure; (3) to substantiate the introduction of the new term, “emotionally-communicative personality”, which logically fits into the terminological system of modern communicology and emphasizes its communicative significance. The theoretical material includes numerous works devoted to the problem of language personality, beginning with V.V. Vinogradov (the 1930s) and G.I. Bogin and Yu.N. Karaulov (the 1980s), and, from the 1990s to the present, O.A. Dmitrieva, I.A. Murzinova (2015), Shakhovsky (2000), V.I. Shakhovskiy (2008 a&b), A.A. Shteba (2014) and many others. To my knowledge, the notion of language personality has not been discussed by foreign linguists. Another block of theoretical material is dedicated to the problem of the language and emotion correlation. Russian linguistics has been researching this problem since 1969. The main results of these studies can be found in the works of V.I. Shakhovsky (from 1969 to the present), S.V. Ionova (1998, 2015), N.A. Krasavsky (2001), T.V. Larina (2009, 2015), and Ya.A. Volkova (2014), among others. The problem of the language and emotion correlation is varied in its formulation (the language of emotions, or language and emotions): A. Schleicher, 1869; Ch

  11. Image-based Analysis of Emotional Facial Expressions in Full Face Transplants.

    Science.gov (United States)

    Bedeloglu, Merve; Topcu, Çagdas; Akgul, Arzu; Döger, Ela Naz; Sever, Refik; Ozkan, Ozlenen; Ozkan, Omer; Uysal, Hilmi; Polat, Ovunc; Çolak, Omer Halil

    2018-01-20

    This study aims to determine, from photographs, the degree to which emotional expression has developed in full face transplant patients. A rehabilitation process can then be planned, in later work, according to these degrees. As envisaged, in full face transplant cases expressions may be confused or may not be produced as clearly as in the healthy control group. For the image-based analysis, a control group consisting of 9 healthy males and 2 full-face transplant patients participated in the study. Appearance-based Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP) methods are adopted for recognizing neutral and six emotional expressions: angry, scared, happy, hate, confused and sad. Feature extraction was carried out using both methods individually and in serial combination. Features extracted from the most distinctive zones of the face, the eye and mouth regions, were used to classify the emotions, and the combination of these region features was also used to improve classifier performance. The ability of control subjects and transplant patients to perform emotional expressions was assessed with a K-nearest neighbor (KNN) classifier with region-specific and method-specific decision stages. The results were compared with the healthy group. Transplant patients were observed not to reflect some emotional expressions, and there were confusions among expressions.
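
    As a rough illustration of the LBP-plus-KNN branch of the method (the GWT branch and the exact region coordinates are omitted, and nothing below is the study's code), uniform LBP histograms from eye and mouth crops can be concatenated and classified as follows.

```python
# Hedged sketch: uniform LBP histograms from two facial regions + KNN classification.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.neighbors import KNeighborsClassifier

def lbp_histogram(region, points=8, radius=1):
    lbp = local_binary_pattern(region, points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2), density=True)
    return hist

def face_features(face, eye_box, mouth_box):
    """face: 2-D grayscale array; each box: (row0, row1, col0, col1) crop."""
    crops = [face[r0:r1, c0:c1] for (r0, r1, c0, c1) in (eye_box, mouth_box)]
    return np.concatenate([lbp_histogram(crop) for crop in crops])

# knn = KNeighborsClassifier(n_neighbors=3).fit(train_features, train_labels)
# predictions = knn.predict(test_features)   # per-expression confusions can then be tabulated
```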

  12. Pure robotic retrocaval ureter repair

    Directory of Open Access Journals (Sweden)

    Ashok K. Hemal

    2008-12-01

    Full Text Available PURPOSE: To demonstrate the feasibility of pure robotic retrocaval ureter repair. MATERIALS AND METHODS: A 33-year-old female presented with right loin pain and obstruction on intravenous urography with the classical "fish-hook" appearance. She was counseled on the various methods of repair and elected to have a robot-assisted repair. The following steps are performed during a pure robotic retrocaval ureter repair. The patient is placed in a modified flank position, pneumoperitoneum created and ports inserted. The colon is mobilized to expose the retroperitoneal structures: inferior vena cava, right gonadal vein, right ureter, and duodenum. The renal pelvis and ureter are mobilized and the renal pelvis transected. The ureter is transposed anterior to the inferior vena cava and a pyelopyelostomy is performed over a JJ stent. RESULTS: This patient was discharged on postoperative day 3. The catheter and drain tube were removed on day 1. Her JJ stent was removed at 6 weeks postoperatively. The postoperative intravenous urography at 3 months confirmed normal drainage of contrast medium. CONCLUSION: Pure robotic retrocaval ureter repair is a feasible procedure; however, there does not appear to be any great advantage over pure laparoscopy, apart from the ergonomic ease for the surgeon as well as the simpler intracorporeal suturing.

  13. Managing emotions - an ability of emotional intelligence.

    OpenAIRE

    Correia, Ana Almeida; Veiga-Branco, Augusta

    2011-01-01

    This study focuses on the concept Managing Emotions from Emotional Intelligence (I.E.), (Mayer-Salovey, 1990, 1997, Goleman, 1995), also identified as Emotional Regulation (Bisquerra, 2000), to obtain recognition and practical use of this concept, through the use of Emotional Fitness charts (Bimbela-Pedrola, 2008), to develop these abilities and manage emotions in contexts of practical life. Objective: To train preschool teachers, as well as primary and lower secondary sc...

  14. Emotion regulation mediates age differences in emotions.

    Science.gov (United States)

    Yeung, Dannii Y; Wong, Carmen K M; Lok, David P P

    2011-04-01

    This study aimed at testing the proposition of socioemotional selectivity theory whether older people would use more antecedent-focused emotion regulatory strategies like cognitive reappraisal but fewer response-focused strategies like suppression. It also aimed at investigating the mediating role of emotion regulation on the relationship between age and emotions. The sample consisted of 654 younger and older adults aged between 18 and 64. Results showed that age was significantly associated with positive emotions and cognitive reappraisal. No difference was found in negative emotions and suppression between younger and older adults. Cognitive reappraisal partially mediated the effect of age on positive emotions. Findings of this study contribute to our understanding of the underlying mechanism of age variations in emotional experiences.

  15. Affective orientation influences memory for emotional and neutral words.

    Science.gov (United States)

    Greenberg, Seth N; Tokarev, Julian; Estes, Zachary

    2012-01-01

    Memory is better for emotional words than for neutral words, but the conditions contributing to emotional memory improvement are not entirely understood. Elsewhere, it has been observed that retrieval of a word is easier when its attributes are congruent with a property assessed during an earlier judgment task. The present study examined whether affective assessment of a word matters to its remembrance. Two experiments were run, one in which only valence assessment was performed, and another in which valence assessment was combined with a running recognition for list words. In both experiments, some participants judged whether each word in a randomized list was negative (negative monitoring), and others judged whether each was positive (positive monitoring). We then tested their explicit memory for the words via both free recall and delayed recognition. Both experiments revealed an affective congruence effect, such that negative words were more likely to be recalled and recognized after negative monitoring, whereas positive words likewise benefited from positive monitoring. Memory for neutral words was better after negative monitoring than positive monitoring. Thus, memory for both emotional and neutral words is contingent on one's affective orientation during encoding.

  16. Protection of children of divorced parents who are victims of emotional abuse

    Directory of Open Access Journals (Sweden)

    Batić Dragana

    2011-01-01

    Full Text Available Very little is said and written about the problem of emotional abuse of children, as a result of parental divorce and separation, probably because it is a very sophisticated type of emotional abuse, which unfortunately sometimes experts do not recognize. This phenomenon is rarely explored and researched in general and especially in the Republic of Macedonia. It is not disputed that there is a solid legal framework for a government response to this type of child abuse in Republic of Macedonia. Given the impact on children, this problem requires much more attention, education and cooperation between the competent institutions. This paper tries to explore the concept of emotional abuse of children, as a result of divorce and separation of the parents, as a very specific form of domestic violence from a psychological point of view, as well as to analyze the legal norm of this form of domestic violence in the Republic of Macedonia.

  17. Characterizing commercial pureed foods: sensory, nutritional, and textural analysis.

    Science.gov (United States)

    Ettinger, Laurel; Keller, Heather H; Duizer, Lisa M

    2014-01-01

    Dysphagia (swallowing impairment) is a common consequence of stroke and degenerative diseases such as Parkinson's and Alzheimer's. Limited research is available on pureed foods, specifically the qualities of commercial products. Because research has linked pureed foods, specifically in-house pureed products, to malnutrition due to inferior sensory and nutritional qualities, commercial purees also need to be investigated. Proprietary research on sensory attributes of commercial foods is available; however direct comparisons of commercial pureed foods have never been reported. Descriptive sensory analysis as well as nutritional and texture analysis of commercially pureed prepared products was performed using a trained descriptive analysis panel. The pureed foods tested included four brands of carrots, of turkey, and two of bread. Each commercial puree was analyzed for fat (Soxhlet), protein (Dumas), carbohydrate (proximate analysis), fiber (total fiber), and sodium content (Quantab titrator strips). The purees were also texturally compared with a line spread test and a back extrusion test. Differences were found in the purees for sensory attributes as well as nutritional and textural properties. Findings suggest that implementation of standards is required to reduce variability between products, specifically regarding the textural components of the products. This would ensure all commercial products available in Canada meet standards established as being considered safe for swallowing.

  18. Vacuum evaporation of pure metals

    OpenAIRE

    Safarian, Jafar; Engh, Thorvald Abel

    2013-01-01

    Theories on the evaporation of pure substances are reviewed and applied to study vacuum evaporation of pure metals. It is shown that there is good agreement between different theories for weak evaporation, whereas there are differences under intensive evaporation conditions. For weak evaporation, the evaporation coefficient in Hertz-Knudsen equation is 1.66. Vapor velocity as a function of the pressure is calculated applying several theories. If a condensing surface is less than one collision...
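
    For reference, a textbook statement of the Hertz-Knudsen relation mentioned above (quoted from general kinetic theory, not taken from this record) expresses the net molecular flux across the liquid-vapor interface as

\[
J_{\mathrm{net}} = \frac{\alpha_e\, p_{\mathrm{sat}}(T_s) - \alpha_c\, p_v}{\sqrt{2\pi m k_B T_s}},
\]

    where \(\alpha_e\) and \(\alpha_c\) are the evaporation and condensation coefficients, \(p_{\mathrm{sat}}(T_s)\) is the saturation pressure at the surface temperature \(T_s\), \(p_v\) is the ambient vapor pressure, \(m\) the molecular mass and \(k_B\) Boltzmann's constant; the theories compared in the record differ mainly in how this expression is corrected under intensive evaporation.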

  19. The Interplay between Emotion and Cognition in Autism Spectrum Disorder: Implications for Developmental Theory

    Science.gov (United States)

    Gaigg, Sebastian B.

    2012-01-01

    Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder that is clinically defined by abnormalities in reciprocal social and communicative behaviors and an inflexible adherence to routinised patterns of thought and behavior. Laboratory studies repeatedly demonstrate that autistic individuals experience difficulties in recognizing and understanding the emotional expressions of others and naturalistic observations show that they use such expressions infrequently and inappropriately to regulate social exchanges. Dominant theories attribute this facet of the ASD phenotype to abnormalities in a social brain network that mediates social-motivational and social-cognitive processes such as face processing, mental state understanding, and empathy. Such theories imply that only emotion related processes relevant to social cognition are compromised in ASD but accumulating evidence suggests that the disorder may be characterized by more widespread anomalies in the domain of emotions. In this review I summarize the relevant literature and argue that the social-emotional characteristics of ASD may be better understood in terms of a disruption in the domain-general interplay between emotion and cognition. More specifically I will suggest that ASD is the developmental consequence of early emerging anomalies in how emotional responses to the environment modulate a wide range of cognitive processes including those that are relevant to navigating the social world. PMID:23316143

  20. Microstructural Correlates of Emotional Attribution Impairment in Non-Demented Patients with Amyotrophic Lateral Sclerosis.

    Science.gov (United States)

    Crespi, Chiara; Cerami, Chiara; Dodich, Alessandra; Canessa, Nicola; Iannaccone, Sandro; Corbo, Massimo; Lunetta, Christian; Falini, Andrea; Cappa, Stefano F

    2016-01-01

    Impairments in the ability to recognize and attribute emotional states to others have been described in amyotrophic lateral sclerosis patients and linked to the dysfunction of key nodes of the emotional empathy network. Microstructural correlates of such disorders are still unexplored. We investigated the white-matter substrates of emotional attribution deficits in a sample of amyotrophic lateral sclerosis patients without cognitive decline. Thirteen individuals with either probable or definite amyotrophic lateral sclerosis and 14 healthy controls were enrolled in a Diffusion Tensor Imaging study and administered the Story-based Empathy Task, assessing the ability to attribute mental states to others (i.e., Intention and Emotion attribution conditions). As already reported, a significant global reduction of empathic skills, mainly driven by a failure in Emotion Attribution condition, was found in amyotrophic lateral sclerosis patients compared to healthy subjects. The severity of this deficit was significantly correlated with fractional anisotropy along the forceps minor, genu of corpus callosum, right uncinate and inferior fronto-occipital fasciculi. The involvement of frontal commissural fiber tracts and right ventral associative fronto-limbic pathways is the microstructural hallmark of the impairment of high-order processing of socio-emotional stimuli in amyotrophic lateral sclerosis. These results support the notion of the neurofunctional and neuroanatomical continuum between amyotrophic lateral sclerosis and frontotemporal dementia.