WorldWideScience

Sample records for emotional faces state

  1. State anxiety and emotional face recognition in healthy volunteers

    OpenAIRE

    Attwood, Angela S.; Easey, Kayleigh E.; Dalili, Michael N.; Skinner, Andrew L.; Woods, Andy; Crick, Lana; Ilett, Elizabeth; Penton-Voak, Ian S.; Munafò, Marcus R.

    2017-01-01

    High trait anxiety has been associated with detriments in emotional face processing. By contrast, relatively little is known about the effects of state anxiety on emotional face processing. We investigated the effects of state anxiety on recognition of emotional expressions (anger, sadness, surprise, disgust, fear and happiness) experimentally, using the 7.5% carbon dioxide (CO2) model to induce state anxiety, and in a large observational study. The experimental studies indicated reduced glob...

  2. State-dependent alteration in face emotion recognition in depression.

    Science.gov (United States)

    Anderson, Ian M; Shippen, Clare; Juhasz, Gabriella; Chase, Diana; Thomas, Emma; Downey, Darragh; Toth, Zoltan G; Lloyd-Williams, Kathryn; Elliott, Rebecca; Deakin, J F William

    2011-04-01

    Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data after receiving a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.

  3. From specificity to sensitivity: affective states modulate visual working memory for emotional expressive faces.

    Science.gov (United States)

    Maran, Thomas; Sachse, Pierre; Furtner, Marco

    2015-01-01

    Previous findings suggest that visual working memory (VWM) preferentially remembers angry looking faces. However, the meaning of facial actions is construed in relation to context. To date, there are no studies investigating the role of perceiver-based context when processing emotional cues in VWM. To explore the influence of affective context on VWM for faces, we conducted two experiments using both a VWM task for emotionally expressive faces and a mood induction procedure. Affective context was manipulated by unpleasant (Experiment 1) and pleasant (Experiment 2) IAPS pictures in order to induce an affect high in motivational intensity (defensive or appetitive, respectively) compared to a low arousal control condition. Results indicated specifically increased sensitivity of VWM for angry looking faces in the neutral condition. Enhanced VWM for angry faces was prevented by inducing affects of high motivational intensity. In both experiments, affective states led to a switch from specific enhancement of angry expressions in VWM to an equally sensitive representation of all emotional expressions. Our findings demonstrate that emotional expressions are of different behavioral relevance for the receiver depending on the affective context, supporting a functional organization of VWM along with flexible resource allocation. In VWM, stimulus processing adjusts to situational requirements and transitions from a specifically prioritizing default mode in predictable environments to a sensitive, hypervigilant mode in exposure to emotional events.

  4. From Specificity to Sensitivity: Affective states modulate visual working memory for emotional expressive faces

    Directory of Open Access Journals (Sweden)

    Thomas Maran

    2015-08-01

    Previous findings suggest that visual working memory preferentially remembers angry looking faces. However, the meaning of facial actions is construed in relation to context. To date, there are no studies investigating the role of perceiver-based context when processing emotional cues in visual working memory. To explore the influence of affective context on visual working memory for faces, we conducted two experiments using both a visual working memory task for emotionally expressive faces and a mood induction procedure. Affective context was manipulated by unpleasant (Experiment 1) and pleasant (Experiment 2) IAPS pictures in order to induce an affect high in motivational intensity (defensive or appetitive, respectively) compared to a low arousal control condition. Results indicated specifically increased sensitivity of visual working memory for angry looking faces in the neutral condition. Enhanced visual working memory for angry faces was prevented by inducing affects of high motivational intensity. In both experiments, affective states led to a switch from specific enhancement of angry expressions in visual working memory to an equally sensitive representation of all emotional expressions. Our findings demonstrate that emotional expressions are of different behavioral relevance for the receiver depending on the affective context, supporting a functional organization of visual working memory along with flexible resource allocation. In visual working memory, stimulus processing adjusts to situational requirements and transitions from a specifically prioritizing default mode in predictable environments to a sensitive, hypervigilant mode in exposure to emotional events.

  5. Emotion Words: Adding Face Value.

    Science.gov (United States)

    Fugate, Jennifer M B; Gendron, Maria; Nakashima, Satoshi F; Barrett, Lisa Feldman

    2017-06-12

    Despite a growing number of studies suggesting that emotion words affect perceptual judgments of emotional stimuli, little is known about how emotion words affect perceptual memory for emotional faces. In Experiments 1 and 2 we tested how emotion words (compared with control words) affected participants' abilities to select a target emotional face from among distractor faces. Participants were generally more likely to false alarm to distractor emotional faces when primed with an emotion word congruent with the face (compared with a control word). Moreover, participants showed both decreased sensitivity (d') to discriminate between target and distractor faces, as well as altered response biases (c; more likely to answer "yes") when primed with an emotion word (compared with a control word). In Experiment 3 we showed that emotion words had more of an effect on perceptual memory judgments when the structural information in the target face was limited, as well as when participants were only able to categorize the face with a partially congruent emotion word. The overall results are consistent with the idea that emotion words affect the encoding of emotional faces in perceptual memory. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
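
    The sensitivity (d') and response bias (c) measures reported in these experiments come from signal detection theory and are computed from hit and false-alarm rates. A minimal sketch of the standard computation (the function name and counts below are illustrative, not data from the study):

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Compute signal detection theory sensitivity (d') and criterion (c).

    Adding 0.5 to each cell (a log-linear correction) avoids infinite
    z-scores when a hit or false-alarm rate is exactly 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)               # sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))    # bias; c < 0 = liberal ("yes"-prone)
    return d_prime, criterion

# Illustrative counts: many hits, a fair number of false alarms
d, c = sdt_measures(hits=40, misses=10, false_alarms=15, correct_rejections=35)
```

    In these terms, the priming effect described above is a drop in d_prime together with a shift of criterion toward negative (more "yes" responses).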

  6. Steroids facing emotions

    NARCIS (Netherlands)

    Putman, P.L.J.

    2006-01-01

    The studies reported in this thesis have been performed to gain a better understanding about motivational mediators of selective attention and memory for emotionally relevant stimuli, and about the roles that some steroid hormones play in regulation of human motivation and emotion. The stimuli used

  7. Emotional faces and the default mode network.

    Science.gov (United States)

    Sreenivas, S; Boehm, S G; Linden, D E J

    2012-01-11

    The default-mode network (DMN) of the human brain has become a central topic of cognitive neuroscience research. Although alterations in its resting state activity and in its recruitment during tasks have been reported for several mental and neurodegenerative disorders, its role in emotion processing has received relatively little attention. We investigated brain responses to different categories of emotional faces with functional magnetic resonance imaging (fMRI) and found deactivation in ventromedial prefrontal cortex (VMPFC), posterior cingulate gyrus (PC) and cuneus. This deactivation was modulated by emotional category and was less prominent for happy than for sad faces. These deactivated areas along the midline conformed to areas of the DMN. We also observed emotion-dependent deactivation of the left middle frontal gyrus, which is not a classical component of the DMN. Conversely, several areas in a fronto-parietal network commonly linked with attention were differentially activated by emotion categories. Functional connectivity patterns, as obtained by correlation of activation levels, also varied between emotions. VMPFC, PC or cuneus served as hubs between the DMN-type areas and the fronto-parietal network. These data support recent suggestions that the DMN is not a unitary system but differentiates according to task and even type of stimulus. The emotion-specific differential pattern of DMN deactivation may be explored further in patients with mood disorder, where the quest for biological markers of emotional biases is still ongoing. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  8. State-Dependent Alterations in Inhibitory Control and Emotional Face Identification in Seasonal Affective Disorder

    DEFF Research Database (Denmark)

    Hjordt, Liv V.; Stenbæk, Dea S.; Madsen, Kathrine Skak

    2017-01-01

    Background: Depressed individuals often exhibit impaired inhibition to negative input and identification of positive stimuli, but it is unclear whether this is a state or trait feature. We here exploited a naturalistic model, namely individuals with seasonal affective disorder (SAD), to study thi...... of life for these patients. (PsycINFO Database Record (c) 2017 APA, all rights reserved)...

  9. Emotionally anesthetized: media violence induces neural changes during emotional face processing

    OpenAIRE

    Stockdale, Laura A.; Morrison, Robert G.; Kmiecik, Matthew J.; Garbarino, James; Silton, Rebecca L.

    2015-01-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others’ emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five particip...

  10. Emotional Labor, Face and Guan xi

    Institute of Scientific and Technical Information of China (English)

    Tian Wenling

    2017-01-01

    Emotional Labor, Face and Guan xi are all relevant to performance, appearance, and emotional feelings, which are essential elements in the workplace. In other words, not only front-line workers but all employees in an organization are faced with the three

  11. Emotional facial expressions reduce neural adaptation to face identity.

    Science.gov (United States)

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

    In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems sustained with emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record the evoked steady-state visual potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to the same when compared with different face identities. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.
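
    The ssVEP approach described here tags the stimulus stream at a fixed rate (3 Hz) and reads out the EEG amplitude at exactly that frequency. A rough illustration of the readout step only, using a synthetic signal rather than real EEG (a full analysis would also epoch, filter, and average):

```python
import numpy as np

def ssvep_amplitude(eeg, sampling_rate, tag_freq=3.0):
    """Single-sided spectral amplitude at the tagging frequency.

    Takes the FFT of the signal and picks the frequency bin closest
    to tag_freq.
    """
    n = len(eeg)
    spectrum = np.abs(np.fft.rfft(eeg)) * 2 / n          # single-sided amplitude
    freqs = np.fft.rfftfreq(n, d=1.0 / sampling_rate)
    return spectrum[np.argmin(np.abs(freqs - tag_freq))]

# Synthetic 10 s "EEG": a 3 Hz component (the tagged response) plus noise
fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
signal = 2.0 * np.sin(2 * np.pi * 3.0 * t) + rng.normal(0, 1, t.size)

amp3 = ssvep_amplitude(signal, fs, tag_freq=3.0)   # large: tagged frequency
amp7 = ssvep_amplitude(signal, fs, tag_freq=7.0)   # small: control frequency
```

    Adaptation in this paradigm shows up as a reduction of the amplitude at the tagging frequency over repeated presentations of the same identity.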

  12. The Moderating Effect of Self-Reported State and Trait Anxiety on the Late Positive Potential to Emotional Faces in 6–11-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Georgia Chronaki

    2018-02-01

    Introduction: The emergence of anxiety during childhood is accompanied by the development of attentional biases to threat. However, the neural mechanisms underlying these biases are poorly understood. In addition, previous research has not examined whether state and trait anxiety are independently associated with threat-related biases. Methods: We compared ERP waveforms during the processing of emotional faces in a population sample of 58 6–11-year-olds who completed self-reported measures of trait and state anxiety and depression. Results: The results showed that the P1 was larger to angry than neutral faces in the left hemisphere, though early components (P1, N170) were not strongly associated with child anxiety or depression. In contrast, Late Positive Potential (LPP) amplitudes to angry (vs. neutral) faces were significantly and positively associated with symptoms of anxiety/depression. In addition, the difference between LPPs for angry (vs. neutral) faces was independently associated with state and trait anxiety symptoms. Discussion: The results showed that neural responses to facial emotion in children with elevated symptoms of anxiety and depression were most evident at later processing stages characterized as evaluative and effortful. The findings support cognitive models of threat perception in anxiety and indicate that trait elements of anxiety and more transitory fluctuations in anxious affect are important in understanding individual variation in the neural response to threat in late childhood.

  13. Detecting and categorizing fleeting emotions in faces.

    Science.gov (United States)

    Sweeny, Timothy D; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A

    2013-02-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d' analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  14. Detecting and Categorizing Fleeting Emotions in Faces

    Science.gov (United States)

    Sweeny, Timothy D.; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A.

    2013-01-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d′ analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PMID:22866885

  15. Matching faces with emotional expressions

    Directory of Open Access Journals (Sweden)

    Wenfeng Chen

    2011-08-01

    There is some evidence that faces with a happy expression are recognized better than faces with other expressions. However, little is known about whether this happy face advantage also applies to perceptual face matching, and whether similar differences exist among other expressions. Using a sequential matching paradigm, we systematically compared the effects of seven basic facial expressions on identity recognition. Identity matching was quickest when a pair of faces had an identical happy/sad/neutral expression, poorer when they had a fearful/surprise/angry expression, and poorest when they had a disgust expression. Faces with a happy/sad/fear/surprise expression were matched faster than those with an anger/disgust expression when the second face in a pair had a neutral expression. These results demonstrate that effects of facial expression on identity recognition are not limited to happy faces when a learned face is immediately tested. The results suggest different influences of expression in perceptual matching and long-term recognition memory.

  16. Processing of emotional faces in social phobia

    Directory of Open Access Journals (Sweden)

    Nicole Kristjansen Rosenberg

    2011-02-01

    Previous research has found that individuals with social phobia differ from controls in their processing of emotional faces. For instance, people with social phobia show increased attention to briefly presented threatening faces. However, when exposure times are increased, the direction of this attentional bias is more unclear. Studies investigating eye movements have found both increased as well as decreased attention to threatening faces in socially anxious participants. The current study investigated eye movements to emotional faces in eight patients with social phobia and 34 controls. Three different tasks with different exposure durations were used, which allowed for an investigation of the time course of attention. At the early time interval, patients showed a complex pattern of both vigilance and avoidance of threatening faces. At the longest time interval, patients avoided the eyes of sad, disgust, and neutral faces more than controls, whereas there were no group differences for angry faces.

  17. Abnormal left and right amygdala-orbitofrontal cortical functional connectivity to emotional faces: state versus trait vulnerability markers of depression in bipolar disorder.

    Science.gov (United States)

    Versace, Amelia; Thompson, Wesley K; Zhou, Donli; Almeida, Jorge R C; Hassel, Stefanie; Klein, Crystal R; Kupfer, David J; Phillips, Mary L

    2010-03-01

    Amygdala-orbitofrontal cortical (OFC) functional connectivity (FC) to emotional stimuli and relationships with white matter remain little examined in bipolar disorder individuals (BD). Thirty-one BD (type I; n = 17 remitted; n = 14 depressed) and 24 age- and gender-ratio-matched healthy individuals (HC) viewed neutral, mild, and intense happy or sad emotional faces in two experiments. The FC was computed as linear and nonlinear dependence measures between amygdala and OFC time series. Effects of group, laterality, and emotion intensity upon amygdala-OFC FC and amygdala-OFC FC white matter fractional anisotropy (FA) relationships were examined. The BD versus HC showed significantly greater right amygdala-OFC FC (p relationship (p = .001) between left amygdala-OFC FC to sad faces and FA in HC. In BD, antidepressants were associated with significantly reduced left amygdala-OFC FC to mild sad faces (p = .001). In BD, abnormally elevated right amygdala-OFC FC to sad stimuli might represent a trait vulnerability for depression, whereas abnormally elevated left amygdala-OFC FC to sad stimuli and abnormally reduced amygdala-OFC FC to intense happy stimuli might represent a depression state marker. Abnormal FC measures might normalize with antidepressant medications in BD. Nonlinear amygdala-OFC FC-FA relationships in BD and HC require further study. Copyright 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
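
    Functional connectivity of the kind computed here treats each region's signal as a time series and quantifies their statistical dependence; the simplest linear measure is the Pearson correlation between the two series. A toy sketch with synthetic data (the region names and signals are illustrative only, and the study additionally used nonlinear dependence measures not shown here):

```python
import numpy as np

def linear_fc(ts_a, ts_b):
    """Linear functional connectivity: Pearson correlation of two time series."""
    return float(np.corrcoef(ts_a, ts_b)[0, 1])

# Two "regions" driven by a shared signal plus independent noise
rng = np.random.default_rng(1)
shared = rng.normal(size=200)                        # common driving input
amygdala = shared + rng.normal(scale=0.5, size=200)
ofc = shared + rng.normal(scale=0.5, size=200)

fc = linear_fc(amygdala, ofc)                        # strong positive coupling
```

    Group differences such as "greater right amygdala-OFC FC" then amount to comparing such coupling values between BD and HC samples.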

  18. Serotonergic modulation of face-emotion recognition

    Directory of Open Access Journals (Sweden)

    C.M. Del-Ben

    2008-04-01

    Facial expressions of basic emotions have been widely used to investigate the neural substrates of emotion processing, but little is known about the exact meaning of subjective changes provoked by perceiving facial expressions. Our assumption was that fearful faces would be related to the processing of potential threats, whereas angry faces would be related to the processing of proximal threats. Experimental studies have suggested that serotonin modulates the brain processes underlying defensive responses to environmental threats, facilitating risk assessment behavior elicited by potential threats and inhibiting fight or flight responses to proximal threats. In order to test these predictions about the relationship between fearful and angry faces and defensive behaviors, we carried out a review of the literature about the effects of pharmacological probes that affect 5-HT-mediated neurotransmission on the perception of emotional faces. The hypothesis that angry faces would be processed as a proximal threat and that, as a consequence, their recognition would be impaired by an increase in 5-HT function was not supported by the results reviewed. In contrast, most of the studies that evaluated the behavioral effects of serotonin challenges showed that increased 5-HT neurotransmission facilitates the recognition of fearful faces, whereas its decrease impairs the same performance. These results agree with the hypothesis that fearful faces are processed as potential threats and that 5-HT enhances this brain processing.

  19. Emotionally anesthetized: media violence induces neural changes during emotional face processing.

    Science.gov (United States)

    Stockdale, Laura A; Morrison, Robert G; Kmiecik, Matthew J; Garbarino, James; Silton, Rebecca L

    2015-10-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others' emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby require fewer cognitive resources to inhibit behavior. © The Author (2015). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  20. Weather and emotional state

    Science.gov (United States)

    Spasova, Z.

    2010-09-01

    Introduction: Given the proven effects of weather on the human organism, an attempt has been made to examine its effects at a psychological and emotional level. Emotions affect the bio-tonus, working ability and concentration, hence their significance in various domains of economic life, such as health care, education, transportation, tourism, etc. Data and methods: The research was conducted in Sofia over a period of 8 months, using five psychological instruments: the Eysenck Personality Questionnaire (EPQ), the State-Trait Anxiety Inventory (STAI), the Test for Self-assessment of the emotional state (developed by Wessman and Ricks), a test for evaluation of moods, and the "Self-confidence - Activity - Mood" test (developed by specialists from the Military Academy in Saint Petersburg). The Fiodorov-Chubukov complex-climatic method was used to characterize meteorological conditions, since it allows a maximal number of meteorological elements to be included in the analysis. According to this method, 16 weather types are defined depending on the values of the meteorological elements. Abrupt weather changes from one day to the next, defined by the same method, were considered as well. Results and discussion: The results obtained by t-test show that the different categories of weather lead to changes in emotional status that are either positive or negative for the organism. The abrupt weather changes, as expected, have a negative effect on human emotions, but only when there is a transition to cloudy weather or to a weather type classified as "unfavourable". The relationship between weather and human emotions is complicated, since it depends on the individual characteristics of people. One of these characteristics, captured by the dimension "neuroticism", has a strong effect on emotional reactions under different weather conditions. Emotionally stable individuals are more "protected" against the influence of weather on their emotions.

  1. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    Directory of Open Access Journals (Sweden)

    Martin Wegrzyn

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.

  2. Mapping the emotional face. How individual face parts contribute to successful emotion recognition

    Science.gov (United States)

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921

  3. Modulation of the composite face effect by unintended emotion cues.

    Science.gov (United States)

    Gray, Katie L H; Murphy, Jennifer; Marsh, Jade E; Cook, Richard

    2017-04-01

When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers 'fuse' the two halves together, perceptually. The illusory distortion induced by task-irrelevant ('distractor') halves hinders participants' judgements about task-relevant ('target') halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, observers frequently perceive emotion in ostensibly neutral faces, contrary to the intentions of experimenters. This study sought to determine whether this 'perceived emotion' influences the composite-face effect. In our first experiment, we confirmed that the composite effect grew stronger as the strength of distractor emotion increased. Critically, effects of distractor emotion were induced by weak emotion intensities, and were incidental insofar as emotion cues hindered image matching, not emotion labelling per se. In Experiment 2, we found a correlation between the presence of perceived emotion in a set of ostensibly neutral distractor regions sourced from commonly used face databases, and the strength of illusory distortion they induced. In Experiment 3, participants completed a sequential matching composite task in which half of the distractor regions had been rated high for perceived emotion and half rated low. Significantly stronger composite effects were induced by the high-emotion distractor halves. These convergent results suggest that perceived emotion increases the strength of the composite-face effect induced by supposedly emotionless faces. These findings have important implications for the study of holistic face processing in typical and atypical populations.

  4. Recognition of Face and Emotional Facial Expressions in Autism

    Directory of Open Access Journals (Sweden)

    Muhammed Tayyib Kadak

    2013-03-01

Autism is a genetically transmitted neurodevelopmental disorder characterized by severe and permanent deficits in many areas of interpersonal relations, such as communication, social interaction and emotional responsiveness. Patients with autism show deficits in face recognition, eye contact and recognition of emotional expressions. Both face recognition and the recognition of emotional facial expressions depend on face processing. Structural and functional impairments in the fusiform gyrus, amygdala, superior temporal sulcus and other brain regions lead to deficits in recognizing faces and facial emotion; studies therefore suggest that face-processing deficits underlie the problems with social interaction and emotion seen in autism. Studies have revealed that children with autism have problems recognizing facial expressions and rely on the mouth region more than the eye region. It has also been shown that autistic patients interpret ambiguous expressions as negative emotions. Deficits at various stages of face processing, such as gaze detection, face identification and recognition of emotional expressions, have been described in autism. Social interaction impairments in autistic spectrum disorders may originate from face-processing deficits during infancy, childhood and adolescence. Face recognition and the recognition of emotional facial expressions may be shaped both by an automatic orienting towards faces after birth and by 'learning' processes during development, such as identity and emotion processing. This article reviews the neurobiological basis of face processing and the recognition of emotional facial expressions in normal development and in autism.

  5. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    OpenAIRE

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People are able to simultaneously process multiple dimensions of facial properties. Facial processing models are based on the processing of facial properties. This paper examined the processing of facial emotion, face race and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interfered with face race in all the tasks. The interaction of face race and face gend...

  6. Emotion elicitor or emotion messenger? Subliminal priming reveals two faces of facial expressions.

    Science.gov (United States)

    Ruys, Kirsten I; Stapel, Diederik A

    2008-06-01

    Facial emotional expressions can serve both as emotional stimuli and as communicative signals. The research reported here was conducted to illustrate how responses to both roles of facial emotional expressions unfold over time. As an emotion elicitor, a facial emotional expression (e.g., a disgusted face) activates a response that is similar to responses to other emotional stimuli of the same valence (e.g., a dirty, nonflushed toilet). As an emotion messenger, the same facial expression (e.g., a disgusted face) serves as a communicative signal by also activating the knowledge that the sender is experiencing a specific emotion (e.g., the sender feels disgusted). By varying the duration of exposure to disgusted, fearful, angry, and neutral faces in two subliminal-priming studies, we demonstrated that responses to faces as emotion elicitors occur prior to responses to faces as emotion messengers, and that both types of responses may unfold unconsciously.

  7. 5-HTTLPR differentially predicts brain network responses to emotional faces

    DEFF Research Database (Denmark)

    Fisher, Patrick M; Grady, Cheryl L; Madsen, Martin K

    2015-01-01

The effects of the 5-HTTLPR polymorphism on neural responses to emotionally salient faces have been studied extensively, focusing on amygdala reactivity and amygdala-prefrontal interactions. Despite compelling evidence that emotional face paradigms engage a distributed network of brain regions involved in emotion, cognitive and visual processing, less is known about 5-HTTLPR effects on broader network responses. To address this, we evaluated 5-HTTLPR differences in the whole-brain response to an emotional faces paradigm including neutral, angry and fearful faces using functional magnetic resonance imaging. ... to fearful faces was significantly greater in S' carriers compared to LA LA individuals. These findings provide novel evidence for emotion-specific 5-HTTLPR effects on the response of a distributed set of brain regions, including areas responsive to emotionally salient stimuli and critical components ...

  8. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    Science.gov (United States)

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621

  10. Emotion categorization does not depend on explicit face categorization

    NARCIS (Netherlands)

    Seirafi, M.; de Weerd, P.; de Gelder, B.

    2013-01-01

Face perception and emotion recognition have been extensively studied in the past decade; however, the relation between them is still poorly understood. A traditional view is that successful emotional categorization requires categorization of the stimulus as a 'face', at least at the basic level.

  11. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    Science.gov (United States)

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences.

  12. Behavioural and neurophysiological evidence for face identity and face emotion processing in animals

    Science.gov (United States)

    Tate, Andrew J; Fischer, Hanno; Leigh, Andrea E; Kendrick, Keith M

    2006-01-01

    Visual cues from faces provide important social information relating to individual identity, sexual attraction and emotional state. Behavioural and neurophysiological studies on both monkeys and sheep have shown that specialized skills and neural systems for processing these complex cues to guide behaviour have evolved in a number of mammals and are not present exclusively in humans. Indeed, there are remarkable similarities in the ways that faces are processed by the brain in humans and other mammalian species. While human studies with brain imaging and gross neurophysiological recording approaches have revealed global aspects of the face-processing network, they cannot investigate how information is encoded by specific neural networks. Single neuron electrophysiological recording approaches in both monkeys and sheep have, however, provided some insights into the neural encoding principles involved and, particularly, the presence of a remarkable degree of high-level encoding even at the level of a specific face. Recent developments that allow simultaneous recordings to be made from many hundreds of individual neurons are also beginning to reveal evidence for global aspects of a population-based code. This review will summarize what we have learned so far from these animal-based studies about the way the mammalian brain processes the faces and the emotions they can communicate, as well as associated capacities such as how identity and emotion cues are dissociated and how face imagery might be generated. It will also try to highlight what questions and advances in knowledge still challenge us in order to provide a complete understanding of just how brain networks perform this complex and important social recognition task. PMID:17118930

  13. Method for Face-Emotion Retrieval Using A Cartoon Emotional Expression Approach

    Science.gov (United States)

    Kostov, Vlaho; Yanagisawa, Hideyoshi; Johansson, Martin; Fukuda, Shuichi

A simple method for extracting emotion from a human face, as a form of non-verbal communication, was developed to cope with and optimize mobile communication in a globalized and diversified society. A cartoon-face-based model was developed and used to evaluate the emotional content of real faces. After a pilot survey, basic rules were defined and student subjects were asked to express emotion using the cartoon face. Their face samples were then analyzed using principal component analysis and the Mahalanobis distance method. Feature parameters considered to be related to emotions were extracted, and new cartoon faces based on these parameters were generated. The subjects evaluated the emotion of these cartoon faces again, confirming that the parameters were suitable. To test how these parameters could be applied to real faces, we asked subjects to express the same emotions, which were then captured electronically. Simple image processing techniques were also developed to extract these features from real faces, which we then compared with the cartoon face parameters. The cartoon face demonstrates that emotions can be expressed from very small amounts of information, and real and cartoon faces were found to correspond to each other. It is also shown that emotion could be extracted from still and dynamic real face images using these cartoon-based features.

  14. Alcoholism and dampened temporal limbic activation to emotional faces.

    Science.gov (United States)

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O'Reilly, Cara E; Howard, Julie A; Sawyer, Kayle; Harris, Gordon J

    2009-11-01

    Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be a part of the wide array of behavioral problems including disinhibition, concurring with previously documented interpersonal difficulties in this population. Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally or socially challenging situations.

  15. Task-irrelevant emotion facilitates face discrimination learning.

    Science.gov (United States)

    Lorenzino, Martina; Caudek, Corrado

    2015-03-01

We understand poorly how the ability to discriminate faces from one another is shaped by visual experience. The purpose of the present study is to determine whether face discrimination learning can be facilitated by facial emotions. To answer this question, we used a task-irrelevant perceptual learning paradigm because it closely mimics the learning processes that, in daily life, occur without a conscious intention to learn and without an attentional focus on specific facial features. We measured face discrimination thresholds before and after training. During the training phase (4 days), participants performed a contrast discrimination task on face images. They were not informed that we introduced (task-irrelevant) subtle variations in the face images from trial to trial. For the Identity group, the task-irrelevant features were variations along a morphing continuum of facial identity. For the Emotion group, the task-irrelevant features were variations along an emotional expression morphing continuum. The Control group did not undergo contrast discrimination learning and only performed the pre-training and post-training tests, with the same temporal gap between them as the other two groups. Results indicate that face discrimination improved, but only for the Emotion group. Participants in the Emotion group, moreover, showed face discrimination improvements also for stimulus variations along the facial identity dimension, even if these (task-irrelevant) stimulus features had not been presented during training. The present results highlight the importance of emotions for face discrimination learning.

  16. The complex duration perception of emotional faces: Effects of face direction

    Directory of Open Access Journals (Sweden)

    Katrin Martina Kliegl

    2015-03-01

The perceived duration of emotional face stimuli strongly depends on the expressed emotion. But emotional faces also differ with regard to a number of other features, such as gaze, face direction, or sex. Usually, these features have been controlled by only using pictures of female models with straight gaze and face direction. Doi and Shinohara (2009) reported that an overestimation of angry faces could only be found when the model's gaze was oriented towards the observer. We aimed at replicating this effect for face direction. Moreover, we explored the effect of face direction on the duration perception of sad faces. Controlling for the sex of the face model and of the participant, female and male participants rated the duration of neutral, angry and sad face stimuli of both sexes, photographed from different perspectives, in a bisection task. In line with current findings, we report a significant overestimation of angry compared to neutral face stimuli that was modulated by face direction. The perceived duration of sad face stimuli, by contrast, did not differ from that of neutral faces and was not influenced by face direction. Furthermore, we found that faces of the opposite sex appeared to last longer than those of the same sex. This outcome is discussed with regard to stimulus parameters such as induced arousal, social relevance and an evolutionary context.

  17. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.

    Science.gov (United States)

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 are more affected by the emotional value of the music administered in the emotional go/no-go task, and that this bias is also apparent in responses to the non-target emotional face. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.

  19. The contribution of emotional empathy to approachability judgements assigned to emotional faces is context specific

    Directory of Open Access Journals (Sweden)

    Megan L Willis

    2015-08-01

Previous research on approachability judgements has indicated that facial expressions modulate how these judgements are made, but the relationship between emotional empathy and context in this decision-making process has not yet been examined. This study examined the contribution of emotional empathy to approachability judgements assigned to emotional faces in different contexts. One hundred and twenty female participants completed the Questionnaire Measure of Emotional Empathy. Participants provided approachability judgements to faces displaying angry, disgusted, fearful, happy, neutral and sad expressions in three different contexts: when evaluating whether they would approach another individual (1) to receive help or (2) to give help, or (3) when no contextual information was provided. In addition, participants were also required to rate perceived threat and emotional intensity, and to label the facial expressions. Emotional empathy significantly predicted approachability ratings for specific emotions in each context, over and above the contribution of perceived threat and intensity, which were associated with emotional empathy. Higher emotional empathy predicted less willingness to approach people with angry and disgusted faces to receive help, and greater willingness to approach people with happy faces to receive help. Higher emotional empathy also predicted greater willingness to approach people with sad faces to offer help, and more willingness to approach people with happy faces when no contextual information was provided. These results highlight the important contribution of individual differences in emotional empathy in predicting how approachability judgements are assigned to facial expressions in context.

  20. Men appear more lateralized when noticing emotion in male faces.

    Science.gov (United States)

    Rahman, Qazi; Anchassi, Tarek

    2012-02-01

Empirical tests of the "right hemisphere dominance" versus "valence" theories of emotion processing are confounded by known sex differences in lateralization. Moreover, information about the sex of the person posing an emotion might be processed differently by men and women because of an adaptive male bias to notice expressions of threat and vigilance in other male faces. The purpose of this study was to investigate whether sex of poser and emotion displayed influenced lateralization in men and women by analyzing "laterality quotient" scores on a test which depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression. We found that men (N = 50) were significantly more lateralized for emotions indicative of vigilance and threat (happy, sad, angry, and surprised) in male faces relative to female faces and compared to women (N = 44). These data indicate that sex differences in functional cerebral lateralization for facial emotion may be specific to the emotion presented and the sex of the face presenting it.

  1. Visual Afterimages of Emotional Faces in High Functioning Autism

    Science.gov (United States)

    Rutherford, M. D.; Troubridge, Erin K.; Walsh, Jennifer

    2012-01-01

    Fixating an emotional facial expression can create afterimages, such that subsequent faces are seen as having the opposite expression of that fixated. Visual afterimages have been used to map the relationships among emotion categories, and this method was used here to compare ASD and matched control participants. Participants adapted to a facial…

  2. Facial emotion recognition, face scan paths, and face perception in children with neurofibromatosis type 1.

    Science.gov (United States)

    Lewis, Amelia K; Porter, Melanie A; Williams, Tracey A; Bzishvili, Samantha; North, Kathryn N; Payne, Jonathan M

    2017-05-01

This study aimed to investigate face scan paths and face perception abilities in children with Neurofibromatosis Type 1 (NF1) and how these might relate to emotion recognition abilities in this population. The authors investigated facial emotion recognition, face scan paths, and face perception in 29 children with NF1 compared to 29 chronological age-matched typically developing controls. Correlations between facial emotion recognition, face scan paths, and face perception in children with NF1 were examined. Children with NF1 displayed significantly poorer recognition of fearful expressions compared to controls, as well as a nonsignificant trend toward poorer recognition of anger. Although there was no significant difference between groups in time spent viewing individual core facial features (eyes, nose, mouth, and nonfeature regions), children with NF1 spent significantly less time than controls viewing the face as a whole. Children with NF1 also displayed significantly poorer face perception abilities than typically developing controls. Facial emotion recognition deficits were not significantly associated with aberrant face scan paths or face perception abilities in the NF1 group. These results suggest that impairments in the perception, identification, and interpretation of information from faces are important aspects of the social-cognitive phenotype of NF1.

  4. Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Science.gov (United States)

    Rigoulot, Simon; Pell, Marc D.

    2012-01-01

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions. PMID:22303454

  5. Emotion recognition training using composite faces generalises across identities but not all emotions.

    Science.gov (United States)

    Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S

    2017-08-01

    Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where the same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities but showed only partial generalisation across emotions. These findings suggest effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.

  6. Short-term memory for emotional faces in dysphoria.

    Science.gov (United States)

    Noreen, Saima; Ridout, Nathan

    2010-07-01

    The study aimed to determine if the memory bias for negative faces previously demonstrated in depression and dysphoria generalises from long- to short-term memory. A total of 29 dysphoric (DP) and 22 non-dysphoric (ND) participants were presented with a series of faces and asked to identify the emotion portrayed (happiness, sadness, anger, or neutral affect). Following a delay, four faces were presented (the original plus three distractors) and participants were asked to identify the target face. Half of the trials assessed memory for facial emotion, and the remaining trials examined memory for facial identity. At encoding, no group differences were apparent. At memory testing, relative to ND participants, DP participants exhibited impaired memory for all types of facial emotion and for facial identity when the faces featured happiness, anger, or neutral affect, but not sadness. DP participants exhibited impaired identity memory for happy faces relative to angry, sad, and neutral, whereas ND participants exhibited enhanced facial identity memory when faces were angry. In general, memory for faces was not related to performance at encoding. However, in DP participants only, memory for sad faces was related to sadness recognition at encoding. The results suggest that the negative memory bias for faces in dysphoria does not generalise from long- to short-term memory.

  7. Poignancy: Mixed Emotional Experience in the Face of Meaningful Endings

    Science.gov (United States)

    Ersner-Hershfield, Hal; Mikels, Joseph A.; Sullivan, Sarah J.; Carstensen, Laura L.

    2009-01-01

    The experience of mixed emotions increases with age. Socioemotional selectivity theory suggests that mixed emotions are associated with shifting time horizons. Theoretically, perceived constraints on future time increase appreciation for life, which, in turn, elicits positive emotions such as happiness. Yet, the very same temporal constraints heighten awareness that these positive experiences come to an end, thus yielding mixed emotional states. In 2 studies, the authors examined the link between the awareness of anticipated endings and mixed emotional experience. In Study 1, participants repeatedly imagined being in a meaningful location. Participants in the experimental condition imagined being in the meaningful location for the final time. Only participants who imagined “last times” at meaningful locations experienced more mixed emotions. In Study 2, college seniors reported their emotions on graduation day. Mixed emotions were higher when participants were reminded of the ending that they were experiencing. Findings suggest that poignancy is an emotional experience associated with meaningful endings. PMID:18179325

  8. Transcutaneous vagus nerve stimulation (tVNS) enhances recognition of emotions in faces but not bodies.

    Science.gov (United States)

    Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S

    2018-02-01

    The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study was to verify whether the previously reported causal link between vagal activity and emotion recognition generalizes to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by applying a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Digitizing the moving face: asymmetries of emotion and gender

    Directory of Open Access Journals (Sweden)

    Ashish Desai

    2009-04-01

    Full Text Available In a previous study with dextral males, Richardson and Bowers (1999) digitized real-time video signals and found movement asymmetries over the left lower face for emotional, but not non-emotional, expressions. These findings correspond to observations, based on subjective ratings of static pictures, that the left side of the face is more intensely expressive than the right (Sackeim, 1978). From a neuropsychological perspective, one possible interpretation of these findings is that emotional priming of the right hemisphere of the brain results in more muscular activity over the contralateral left than the ipsilateral right side of the lower face. The purpose of the present study was to use computer-imaging methodology to determine whether there were gender differences in movement asymmetries across the face. We hypothesized that females would show less evidence of facial movement asymmetries during the expression of emotion. This hypothesis was based on findings of gender differences in the degree to which specific cognitive functions may be lateralized in the brain (i.e., females less lateralized than males). Forty-eight normal dextral college students (25 females, 23 males) were videotaped while they displayed voluntary emotional expressions. A quantitative measure of movement change (called entropy) was computed by subtracting the values of corresponding pixel intensities between adjacent frames and summing their differences. The upper and lower hemiface regions were examined separately due to differences in the cortical innervation of facial muscles in the upper (bilateral) versus lower (contralateral) face. Repeated measures ANOVAs were used to analyze the amount of overall facial movement and facial asymmetries. Certain emotions were associated with significantly greater overall facial movement than others (fear > (angry = sad) > neutral).
Both males and females showed this same pattern, with no gender differences in the total amount of facial movement.
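The entropy measure described above can be sketched directly from its stated definition; the function name is hypothetical, and any normalization the authors applied is not specified in the abstract:

```python
import numpy as np

def movement_entropy(frames):
    """Sum of absolute pixel-intensity differences between adjacent frames.

    frames: sequence of equally sized 2-D grayscale arrays (one per video frame).
    """
    total = 0.0
    for prev, curr in zip(frames, frames[1:]):
        # subtract corresponding pixel intensities and sum the differences
        total += np.abs(curr.astype(float) - prev.astype(float)).sum()
    return total
```

    Hemiface regions could then be compared by slicing each frame before scoring, e.g. `movement_entropy([f[:, :f.shape[1] // 2] for f in frames])` for the left half.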

  10. Neurophysiological evidence (ERPs) for hemispheric processing of facial expressions of emotions: Evidence from whole face and chimeric face stimuli.

    Science.gov (United States)

    Damaskinou, Nikoleta; Watling, Dawn

    2018-05-01

    This study was designed to investigate the patterns of electrophysiological responses of early emotional processing at frontocentral sites in adults and to explore whether adults' activation patterns show hemispheric lateralization for facial emotion processing. Thirty-five adults viewed full face and chimeric face stimuli. After viewing two faces, sequentially, participants were asked to decide which of the two faces was more emotive. The findings from the standard faces and the chimeric faces suggest that emotion processing is present during the early phases of face processing in the frontocentral sites. In particular, sad emotional faces are processed differently than neutral and happy (including happy chimeras) faces in these early phases of processing. Further, there were differences in the electrode amplitudes over the left and right hemisphere, particularly in the early temporal window. This research provides supporting evidence that the chimeric face test is a test of emotion processing that elicits right hemispheric processing.

  11. Disrupted neural processing of emotional faces in psychopathy.

    Science.gov (United States)

    Contreras-Rodríguez, Oren; Pujol, Jesus; Batalla, Iolanda; Harrison, Ben J; Bosque, Javier; Ibern-Regàs, Immaculada; Hernández-Ribas, Rosa; Soriano-Mas, Carles; Deus, Joan; López-Solà, Marina; Pifarré, Josep; Menchón, José M; Cardoner, Narcís

    2014-04-01

    Psychopaths show a reduced ability to recognize emotional facial expressions, which may disturb the development of interpersonal relationships and successful social adaptation. Behavioral hypotheses point toward an association between emotion recognition deficits in psychopathy and amygdala dysfunction. Our prediction was that amygdala dysfunction would combine deficient activation with disturbances in functional connectivity with cortical regions of the face-processing network. Twenty-two psychopaths and 22 control subjects were assessed, and functional magnetic resonance maps were generated to identify both brain activation and task-induced functional connectivity, using psychophysiological interaction analysis, during an emotional face-matching task. Results showed significant amygdala activation in control subjects only, but differences between study groups did not reach statistical significance. In contrast, psychopaths showed significantly increased activation in visual and prefrontal areas, with this latter activation being associated with psychopaths' affective-interpersonal disturbances. Psychophysiological interaction analyses revealed a reciprocal reduction in functional connectivity between the left amygdala and visual and prefrontal cortices. Our results suggest that emotional stimulation may evoke a relevant cortical response in psychopaths, but that a disruption in the processing of emotional faces exists involving the reciprocal functional interaction between the amygdala and neocortex, consistent with the notion of a failure to integrate emotion into cognition in psychopathic individuals.

  12. Similar representations of emotions across faces and voices.

    Science.gov (United States)

    Kuhn, Lisa Katharina; Wydell, Taeko; Lavan, Nadine; McGettigan, Carolyn; Garrido, Lúcia

    2017-09-01

    [Correction Notice: An Erratum for this article was reported in Vol 17(6) of Emotion (see record 2017-18585-001). In the article, the copyright attribution was incorrectly listed and the Creative Commons CC-BY license disclaimer was incorrectly omitted from the author note. The correct copyright is "© 2017 The Author(s)" and the omitted disclaimer is below. All versions of this article have been corrected. "This article has been published under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Copyright for this article is retained by the author(s). Author(s) grant(s) the American Psychological Association the exclusive right to publish the article and identify itself as the original publisher."] Emotions are a vital component of social communication, carried across a range of modalities and via different perceptual signals such as specific muscle contractions in the face and in the upper respiratory system. Previous studies have found that emotion recognition impairments after brain damage depend on the modality of presentation: recognition from faces may be impaired whereas recognition from voices remains preserved, and vice versa. On the other hand, there is also evidence for shared neural activation during emotion processing in both modalities. In a behavioral study, we investigated whether there are shared representations in the recognition of emotions from faces and voices. We used a within-subjects design in which participants rated the intensity of facial expressions and nonverbal vocalizations for each of the 6 basic emotion labels. For each participant and each modality, we then computed a representation matrix with the intensity ratings of each emotion. These matrices allowed us to examine the patterns of confusions between emotions and to characterize the representations
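As a sketch of the analysis described (illustrative names and shapes, not the authors' actual pipeline): each modality yields a matrix of intensity ratings, rows for the emotion expressed and columns for the six rated emotion labels, and the similarity of face and voice representations, including their confusion patterns, can be indexed by correlating the two matrices:

```python
import numpy as np

def representation_similarity(face_ratings, voice_ratings):
    """Pearson correlation between two rating matrices (e.g. 6 x 6).

    Each matrix holds mean intensity ratings: rows = emotion expressed,
    columns = the 6 basic emotion labels rated (illustrative layout).
    """
    f = np.asarray(face_ratings, dtype=float).ravel()
    v = np.asarray(voice_ratings, dtype=float).ravel()
    return float(np.corrcoef(f, v)[0, 1])
```

    A value near 1 would indicate that the two modalities confuse and grade emotions in similar ways; off-diagonal cells carry the confusion structure.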

  13. Are neutral faces of children really emotionally neutral?

    OpenAIRE

    小松, 佐穂子; 箱田, 裕司; Komatsu, Sahoko; Hakoda, Yuji

    2012-01-01

    In this study, we investigated whether people recognize emotions from the neutral faces of children (11 to 13 years old). We took facial images of 53 male and 54 female Japanese children who had been asked to keep a neutral facial expression. Then, we conducted an experiment in which 43 participants (19 to 34 years old) rated the strength of four emotions (happiness, surprise, sadness, and anger) for the facial images, using a 7-point scale. We found that (a) they rated both male and female face...

  14. Categorical Perception of emotional faces is not affected by aging

    Directory of Open Access Journals (Sweden)

    Mandy Rossignol

    2009-11-01

    Full Text Available Effects of normal aging on categorical perception (CP) of facial emotional expressions were investigated. One hundred healthy participants (20 to 70 years old; five age groups) had to identify morphed expressions ranging from neutrality to happiness, sadness and fear. We analysed percentages and latencies of correct recognition for non-morphed emotional expressions, percentages and latencies of emotional recognition for morphed faces, the locus of the boundaries along the different continua, and the number of intrusions. The results showed that unmorphed happy and fearful faces were better processed than unmorphed sad and neutral faces. For morphed faces, CP was confirmed, as latencies increased as a function of the distance between the displayed morph and the original unmorphed photograph. The locus of categorical boundaries was not affected by age. Aging did not alter the accuracy of recognition for original pictures, nor the emotional recognition of morphed faces or the rate of intrusions. However, latencies of responses increased with age for both unmorphed and morphed pictures. In conclusion, CP of facial expressions appears to be spared in aging.
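The boundary locus along a morph continuum is typically taken as the point where identification responses cross 50%. A minimal linear-interpolation sketch, assuming a monotonically increasing identification curve (a full analysis would usually fit a logistic psychometric function instead):

```python
def category_boundary(morph_levels, p_identified):
    """Locate the 50% identification crossover along a morph continuum.

    morph_levels: increasing morph percentages (e.g. 0..100).
    p_identified: proportion of trials identified as the target emotion at
    each level; assumed to increase along the continuum.
    Returns None if the curve never crosses 50%.
    """
    points = list(zip(morph_levels, p_identified))
    for (x0, p0), (x1, p1) in zip(points, points[1:]):
        if p0 <= 0.5 <= p1 and p1 != p0:
            # linear interpolation between the two flanking morph levels
            return x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
    return None
```

    Comparing the returned locus across age groups corresponds to the boundary analysis reported above.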

  15. Enhanced amygdala reactivity to emotional faces in adults reporting childhood emotional maltreatment

    Science.gov (United States)

    van Tol, Marie-José; Demenescu, Liliana R.; van der Wee, Nic J. A.; Veltman, Dick J.; Aleman, André; van Buchem, Mark A.; Spinhoven, Philip; Penninx, Brenda W. J. H.; Elzinga, Bernet M.

    2013-01-01

    In the context of chronic childhood emotional maltreatment (CEM; emotional abuse and/or neglect), adequately responding to facial expressions is an important skill. Over time, however, this adaptive response may lead to a persistent vigilance for emotional facial expressions. The amygdala and the medial prefrontal cortex (mPFC) are key regions in face processing. However, the neurobiological correlates of face processing in adults reporting CEM are yet unknown. We examined amygdala and mPFC reactivity to emotional faces (Angry, Fearful, Sad, Happy, Neutral) vs scrambled faces in healthy controls and unmedicated patients with depression and/or anxiety disorders reporting CEM before the age of 16 years (n = 60), and in controls and patients who report no childhood abuse (n = 75). We found that CEM was associated with enhanced bilateral amygdala reactivity to emotional faces in general, independent of psychiatric status. Furthermore, we found no support for differential mPFC functioning, suggesting that amygdala hyper-responsivity to emotional facial perception in adults reporting CEM may be independent from top–down influences of the mPFC. These findings may be key in understanding the increased emotional sensitivity and interpersonal difficulties that have been reported in individuals with a history of CEM. PMID:22258799

  16. Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa.

    Science.gov (United States)

    Vesker, Michael; Bahn, Daniela; Kauschke, Christina; Tschense, Monika; Degé, Franziska; Schwarzer, Gudrun

    2018-01-01

    In order to assess how the perception of audible speech and facial expressions influence one another for the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-years old), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar categorization task, but with faces acting as visual primes, and emotion words acting as auditory targets. The results of Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words versus positive words, and that 6-year-old children are particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Meanwhile, the results of Experiment 2 did not show any congruency effects for priming by facial expressions. Thus, audible emotion words seem to exert an influence on the emotional categorization of faces, while faces do not seem to influence the categorization of emotion words in a significant way.

  17. Assessment of incongruent emotions in face and voice

    NARCIS (Netherlands)

    Takagi, S.; Tabei, K.-I.; Huis in 't Veld, E.M.J.; de Gelder, B.

    2013-01-01

    Information derived from facial and vocal nonverbal expressions plays an important role in social communication in the real and virtual worlds. In the present study, we investigated cultural differences between Japanese and Dutch participants in the multisensory perception of emotion. We used a face

  18. Child's recognition of emotions in robot's face and body

    NARCIS (Netherlands)

    Cohen, I.; Looije, R.; Neerincx, M.A.

    2011-01-01

    Social robots can comfort and support children who have to cope with chronic diseases. In previous studies, a "facial robot", the iCat, proved to show well-recognized emotional expressions that are important in social interactions. The question is if a mobile robot without a face, the Nao, can

  19. Emotional faces influence evaluation of natural and transformed food.

    Science.gov (United States)

    Manippa, Valerio; Padulo, Caterina; Brancucci, Alfredo

    2018-07-01

    Previous evidence has shown a direct relationship between feeding behavior and emotions. Despite that, no studies have focused on the influence of emotional faces on food processing. In our study, participants were presented with 72 pairs of visual stimuli, each composed of a neutral, happy, or disgusted face (5000 ms duration in Experiment 1, adaptation; 150 ms in Experiment 2, priming) followed by a food stimulus (1500 ms). Food stimuli were grouped into pleasant foods, further divided into natural and transformed, and unpleasant rotten foods. The task consisted of judging the food valence (as 'pleasant' or 'unpleasant') by keypress. Results showed a different pattern of response based on the transformation level of the food. In general, the evaluation of natural foods was more rapid than that of transformed foods, possibly because of their simplicity and perceived healthiness. In addition, transformed foods yielded responses incongruent with the preceding emotional face, whereas natural foods yielded congruent responses. These effects were independent of the duration of the emotional face (i.e., adaptation or priming paradigm) and may depend on the salience of pleasant food stimuli.

  20. Identification of emotions in mixed disgusted-happy faces as a function of depressive symptom severity.

    Science.gov (United States)

    Sanchez, Alvaro; Romero, Nuria; Maurage, Pierre; De Raedt, Rudi

    2017-12-01

    Interpersonal difficulties are common in depression, but their underlying mechanisms are not yet fully understood. The role of depression in the identification of mixed emotional signals with a direct interpersonal value remains unclear. The present study aimed to clarify this question. A sample of 39 individuals reporting a broad range of depression levels completed an emotion identification task where they viewed faces expressing three emotional categories (100% disgusted and 100% happy faces, as well as their morphed 50% disgusted - 50% happy exemplars). Participants were asked to identify the corresponding depicted emotion as "clearly disgusted", "mixed", or "clearly happy". Higher depression levels were associated with lower identification of positive emotions in 50% disgusted - 50% happy faces. The study was conducted with an analogue sample reporting individual differences in subclinical depression levels. Further research must replicate these findings in a clinical sample and clarify whether differential emotional identification patterns emerge in depression for different mixed negative-positive emotions (sad-happy vs. disgusted-happy). Depression may account for a lower bias to perceive positive states when ambiguous states from others include subtle signals of social threat (i.e., disgust), leading to an under-perception of positive social signals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Mixed emotions: Sensitivity to facial variance in a crowd of faces.

    Science.gov (United States)

    Haberman, Jason; Lee, Pegan; Whitney, David

    2015-01-01

    The visual system automatically represents summary information from crowds of faces, such as the average expression. This is a useful heuristic insofar as it provides critical information about the state of the world, not simply information about the state of one individual. However, the average alone is not sufficient for making decisions about how to respond to a crowd. The variance or heterogeneity of the crowd--the mixture of emotions--conveys information about the reliability of the average, essential for determining whether the average can be trusted. Despite its importance, the representation of variance within a crowd of faces has yet to be examined. This is addressed here in three experiments. In the first experiment, observers viewed a sample set of faces that varied in emotion, and then adjusted a subsequent set to match the variance of the sample set. To isolate variance as the summary statistic of interest, the average emotion of both sets was random. Results suggested that observers had information regarding crowd variance. The second experiment verified that this was indeed a uniquely high-level phenomenon, as observers were unable to derive the variance of an inverted set of faces as precisely as an upright set of faces. The third experiment replicated and extended the first two experiments using method-of-constant-stimuli. Together, these results show that the visual system is sensitive to emergent information about the emotional heterogeneity, or ambivalence, in crowds of faces.
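The adjustment task above manipulates the spread of expressions around a random mean. A stimulus set with an exact target variance of emotion "morph units" can be sketched as below; the function name, unit scale, and mean range are illustrative assumptions, not the authors' stimulus code:

```python
import numpy as np

def make_crowd(n_faces, target_var, mean_range=(20.0, 80.0), rng=None):
    """Emotion morph units for a crowd with exact variance and random mean."""
    if rng is None:
        rng = np.random.default_rng()
    units = rng.normal(size=n_faces)
    units = units - units.mean()            # center on zero
    units = units / units.std()             # unit variance
    units = units * np.sqrt(target_var)     # rescale to exact target variance
    # a random average expression, so variance is the only reliable cue
    return units + rng.uniform(*mean_range)
```

    Because the mean is drawn at random per set, an observer matching the sample set can only succeed by encoding the heterogeneity itself.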

  2. How should neuroscience study emotions? by distinguishing emotion states, concepts, and experiences.

    Science.gov (United States)

    Adolphs, Ralph

    2017-01-01

    In this debate with Lisa Feldman Barrett, I defend a view of emotions as biological functional states. Affective neuroscience studies emotions in this sense, but it also studies the conscious experience of emotion ('feelings'), our ability to attribute emotions to others and to animals ('attribution', 'anthropomorphizing'), our ability to think and talk about emotion ('concepts of emotion', 'semantic knowledge of emotion') and the behaviors caused by an emotion ('expression of emotions', 'emotional reactions'). I think that the most pressing challenge facing affective neuroscience is the need to carefully distinguish between these distinct aspects of 'emotion'. I view emotion states as evolved functional states that regulate complex behavior, in both people and animals, in response to challenges that instantiate recurrent environmental themes. These functional states, in turn, can also cause conscious experiences (feelings), and their effects and our memories for those effects also contribute to our semantic knowledge of emotions (concepts). Cross-species studies, dissociations in neurological and psychiatric patients, and more ecologically valid neuroimaging designs should be used to partly separate these different phenomena. © The Author (2016). Published by Oxford University Press.

  3. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study.

    Science.gov (United States)

    Zhishuai, Jin; Hong, Liu; Daxing, Wu; Pin, Zhang; Xuejing, Lu

    2017-01-01

    Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  4. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study

    Directory of Open Access Journals (Sweden)

    Jin Zhishuai

    2017-01-01

    Full Text Available Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  5. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    Science.gov (United States)

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

    The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter was assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Children's understanding of facial expression of emotion: II. Drawing of emotion-faces.

    Science.gov (United States)

    Missaghi-Lakshman, M; Whissell, C

    1991-06-01

    67 children from Grades 2, 4, and 7 drew faces representing the emotional expressions of fear, anger, surprise, disgust, happiness, and sadness. The children themselves and 29 adults later decoded the drawings in an emotion-recognition task. Children were the more accurate decoders, and their accuracy and the accuracy of adults increased significantly for judgments of 7th-grade drawings. The emotions happy and sad were most accurately decoded. There were no significant differences associated with sex. In their drawings, children utilized a symbol system that seems to be based on a highlighting or exaggeration of features of the innately governed facial expression of emotion.

  7. Vicarious Social Touch Biases Gazing at Faces and Facial Emotions.

    Science.gov (United States)

    Schirmer, Annett; Ng, Tabitha; Ebstein, Richard P

    2018-02-01

    Research has suggested that interpersonal touch promotes social processing and other-concern, and that women may respond to it more sensitively than men. In this study, we asked whether this phenomenon would extend to third-party observers who experience touch vicariously. In an eye-tracking experiment, participants (N = 64, 32 men and 32 women) viewed prime and target images with the intention of remembering them. Primes comprised line drawings of dyadic interactions with and without touch. Targets comprised two faces shown side-by-side, with one being neutral and the other being happy or sad. Analysis of prime fixations revealed that faces in touch interactions attracted longer gazing than faces in no-touch interactions. In addition, touch enhanced gazing at the area of touch in women but not men. Analysis of target fixations revealed that touch priming increased looking at both faces immediately after target onset, and subsequently, at the emotional face in the pair. Sex differences in target processing were nonsignificant. Together, the present results imply that vicarious touch biases visual attention to faces and promotes emotion sensitivity. In addition, they suggest that, compared with men, women are more aware of tactile exchanges in their environment. As such, vicarious touch appears to share important qualities with actual physical touch. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  8. Emotion Recognition in Face and Body Motion in Bulimia Nervosa.

    Science.gov (United States)

    Dapelo, Marcela Marin; Surguladze, Simon; Morris, Robin; Tchanturia, Kate

    2017-11-01

    Social cognition has been studied extensively in anorexia nervosa (AN), but there are few studies in bulimia nervosa (BN). This study investigated the ability of people with BN to recognise emotions in ambiguous facial expressions and in body movement. Participants were 26 women with BN, who were compared with 35 with AN, and 42 healthy controls. Participants completed an emotion recognition task by using faces portraying blended emotions, along with a body emotion recognition task by using videos of point-light walkers. The results indicated that BN participants exhibited difficulties recognising disgust in less-ambiguous facial expressions, and a tendency to interpret non-angry faces as anger, compared with healthy controls. These difficulties were similar to those found in AN. There were no significant differences amongst the groups in body motion emotion recognition. The findings suggest that difficulties with disgust and anger recognition in facial expressions may be shared transdiagnostically in people with eating disorders. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  9. Emotion perception accuracy and bias in face-to-face versus cyberbullying.

    Science.gov (United States)

    Ciucci, Enrica; Baroncelli, Andrea; Nowicki, Stephen

    2014-01-01

    The authors investigated the association of traditional and cyber forms of bullying and victimization with emotion perception accuracy and emotion perception bias. Four basic emotions were considered (i.e., happiness, sadness, anger, and fear); 526 middle school students (280 females; M age = 12.58 years, SD = 1.16 years) were recruited, and emotionality was controlled. Results indicated no significant findings for girls. Boys with higher levels of traditional bullying did not show any deficit in perception accuracy of emotions, but they were prone to identify happiness and fear in faces when a different emotion was expressed; in addition, male cyberbullying was related to greater accuracy in recognizing fear. In terms of the victims, cyber victims had a global problem in recognizing emotions and a specific problem in processing anger and fear. It was concluded that emotion perception accuracy and bias were associated with bullying and victimization for boys not only in traditional settings but also in the electronic ones. Implications of these findings for possible intervention are discussed.

  10. Interactions among the effects of head orientation, emotional expression, and physical attractiveness on face preferences.

    Science.gov (United States)

    Main, Julie C; DeBruine, Lisa M; Little, Anthony C; Jones, Benedict C

    2010-01-01

    Previous studies have shown that preferences for direct versus averted gaze are modulated by emotional expressions and physical attractiveness. For example, preferences for direct gaze are stronger when judging happy or physically attractive faces than when judging disgusted or physically unattractive faces. Here we show that preferences for front versus three-quarter views of faces, in which gaze direction was always congruent with head orientation, are also modulated by emotional expressions and physical attractiveness; participants demonstrated preferences for front views of faces over three-quarter views of faces when judging the attractiveness of happy, physically attractive individuals, but not when judging the attractiveness of relatively unattractive individuals or those with disgusted expressions. Moreover, further analyses indicated that these interactions did not simply reflect differential perceptions of the intensity of the emotional expressions shown in each condition. Collectively, these findings present novel evidence that the effect of the direction of the attention of others on attractiveness judgments is modulated by cues to the physical attractiveness and emotional state of the depicted individual, potentially reflecting psychological adaptations for efficient allocation of social effort. These data also present the first behavioural evidence that the effect of the direction of the attention of others on attractiveness judgments reflects viewer-referenced, rather than face-referenced, coding and/or processing of gaze direction.

  11. How Context Influences Our Perception of Emotional Faces

    DEFF Research Database (Denmark)

    Calbi, Marta; Heimann, Katrin; Barratt, Daniel

    2017-01-01

    Facial expressions are of major importance in understanding the mental and emotional states of others. So far, most studies on the perception and comprehension of emotions have used isolated facial expressions as stimuli; for example, photographs of actors displaying facial expressions corresponding to one of the so-called ‘basic emotions.’ However, our real experience during social interactions is different: facial expressions of emotion are mostly perceived in a wider context, constituted by body language, the surrounding environment, and our beliefs and expectations. Already in the early twentieth century, the Russian filmmaker Lev Kuleshov argued that such context, established by intermediate shots of strong emotional content, could significantly change our interpretation of facial expressions in film. Prior experiments have shown behavioral effects pointing in this direction, but have…

  12. Evaluating the Emotional State of a User Using a Webcam

    Directory of Open Access Journals (Sweden)

    Martin Magdin

    2016-09-01

    Full Text Available In online learning, it is more difficult for teachers to see how individual students behave. Students' emotions, such as self-esteem, motivation, and commitment, which are believed to be determinants of student performance, cannot be ignored, as affective states and learning styles are known to greatly influence learning. The ability of a computer to evaluate the emotional state of the user is receiving growing attention. By evaluating the emotional state, there is an attempt to overcome the barrier between man and non-emotional machine. Real-time emotion recognition in e-learning using webcams has been an active research area over the last decade. Enhancing learning through webcams and microphones offers relevant feedback based upon the learner's facial expressions and verbalizations. The majority of current software does not work in real time: it scans the face and progressively evaluates its features. The software designed here uses neural networks in real time, which makes it applicable to various fields of our lives and thus able to actively influence their quality. The face emotion recognition software was validated using annotations by various experts, whose findings were compared with the software results. The overall accuracy of our software, comparing the requested emotions with the recognized emotions, is 78%. Online evaluation of emotions is an appropriate technology for enhancing the quality and efficacy of e-learning by including the learner's emotional states.

  13. Social and emotional relevance in face processing: Happy faces of future interaction partners enhance the LPP

    Directory of Open Access Journals (Sweden)

    Florian Bublatzky

    2014-07-01

    Full Text Available Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERPs) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. Social relevance was manipulated by presenting pictures of two specific face actors as future interaction partners (meet condition), whereas two other face actors remained non-relevant. As a further control condition, all stimuli were presented without specific task instructions (passive viewing condition). A within-subject design (Facial Expression x Relevance x Task) was implemented, in which randomly ordered face stimuli of four actors (2 women) from the KDEF were presented for 1 s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of instructed social relevance. Whereas the meet condition was accompanied by unspecific effects regardless of relevance (P1, EPN), viewing potential interaction partners was associated with increased LPP amplitudes. The LPP was specifically enhanced for happy facial expressions of the future interaction partners. This underscores that social relevance can impact face processing already at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories.

  14. A face a mother could love: depression-related maternal neural responses to infant emotion faces.

    Science.gov (United States)

    Laurent, Heidemarie K; Ablow, Jennifer C

    2013-01-01

    Depressed mothers show negatively biased responses to their infants' emotional bids, perhaps due to faulty processing of infant cues. This study is the first to examine depression-related differences in mothers' neural response to their own infant's emotion faces, considering both effects of perinatal depression history and current depressive symptoms. Primiparous mothers (n = 22), half of whom had a history of major depressive episodes (with one episode occurring during pregnancy and/or postpartum), were exposed to images of their own and unfamiliar infants' joy and distress faces during functional neuroimaging. Group differences (depression vs. no-depression) and continuous effects of current depressive symptoms were tested in relation to neural response to own infant emotion faces. Compared to mothers with no psychiatric diagnoses, those with depression showed blunted responses to their own infant's distress faces in the dorsal anterior cingulate cortex. Mothers with higher levels of current symptomatology showed reduced responses to their own infant's joy faces in the orbitofrontal cortex and insula. Current symptomatology also predicted lower responses to own infant joy-distress in left-sided prefrontal and insula/striatal regions. These deficits in self-regulatory and motivational response circuits may help explain parenting difficulties in depressed mothers.

  15. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    Science.gov (United States)

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top + happy bottom), incongruent composite configurations (e.g., angry top + happy bottom), and isolated features (e.g., happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part but ignored information in the other, non-target part. Results indicate that the recognition of happy and angry expressions is neither strictly holistic nor analytic. Both routes were involved, but with a different role for analytic and holistic information depending on the emotion type, and different weights of local features between happy and angry expressions. Dissociable neural pathways were engaged depending on emotional face configurations. In particular, regions within the face processing network differed in their sensitivity to holistic expression information, which predominantly activated fusiform and inferior occipital areas and the amygdala when internal features were congruent (i.e., template matching), whereas more local analysis of independent features preferentially engaged STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and pulvinar when seen in isolated parts. Collectively, these findings suggest that facial emotion recognition recruits separate, but interactive, dorsal and ventral routes within the face processing networks, whose engagement may be shaped by

  16. The Perception of Time While Perceiving Dynamic Emotional Faces

    Directory of Open Access Journals (Sweden)

    Wang On Li

    2015-08-01

    Full Text Available Emotion plays an essential role in the perception of time, such that time is perceived to fly when events are enjoyable, while unenjoyable moments are perceived to drag. Previous studies have reported a time-drag effect when participants are presented with emotional facial expressions, regardless of the emotion presented. This effect can hardly be explained by induced emotion, given the heterogeneous nature of emotional expressions. We conducted two experiments (n = 44 and n = 39) to examine the cognitive mechanism underlying this effect by presenting dynamic sequences of emotional expressions to participants. Each sequence started with a particular expression, then morphed to another. The presentation of dynamic facial expressions allows a comparison of the time-drag effect between homogeneous pairs of emotional expressions sharing similar valence and arousal and heterogeneous pairs. Sequences of seven durations (400 ms, 600 ms, 800 ms, 1,000 ms, 1,200 ms, 1,400 ms, 1,600 ms) were presented to participants, who were asked to judge whether each sequence was closer to 400 ms or 1,600 ms in a two-alternative forced-choice task. The data were then collated according to conditions and fit to cumulative Gaussian curves to estimate the point of subjective equivalence, indicating the perceived duration of 1,000 ms. Consistent with previous reports, a feeling of time dragging was induced regardless of the sequence presented, such that a 1,000 ms sequence was perceived as lasting longer than 1,000 ms. In addition, dynamic facial expressions exert a greater effect on perceived time drag than static expressions. The effect is most prominent when the dynamics involve an angry face or a change in valence. The significance of this sensitivity is discussed in terms of emotion perception and its evolutionary significance for our attention mechanism.
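    The analysis procedure this record describes (fitting a cumulative Gaussian to two-alternative forced-choice responses and reading off the point of subjective equivalence, PSE) can be sketched in a few lines. The response proportions and grid ranges below are hypothetical illustrations, not the study's data; a simple grid search stands in for whatever fitting routine the authors actually used.

    ```python
    import math

    def gauss_cdf(x, mu, sigma):
        """Cumulative Gaussian: modeled probability of judging a duration
        as 'closer to 1,600 ms'."""
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

    def fit_pse(durations, p_long):
        """Least-squares grid search over (mu, sigma); the fitted mu is the
        point of subjective equivalence (duration judged 'long' 50% of the time)."""
        best_err, best_mu, best_sigma = float("inf"), None, None
        for mu in range(400, 1601, 5):        # candidate PSE values (ms)
            for sigma in range(50, 801, 5):   # candidate slope parameters (ms)
                err = sum((gauss_cdf(d, mu, sigma) - p) ** 2
                          for d, p in zip(durations, p_long))
                if err < best_err:
                    best_err, best_mu, best_sigma = err, mu, sigma
        return best_mu, best_sigma

    # Hypothetical proportions of "closer to 1,600 ms" responses per duration.
    durations = [400, 600, 800, 1000, 1200, 1400, 1600]
    p_long = [0.05, 0.15, 0.40, 0.65, 0.85, 0.95, 0.99]

    pse, slope = fit_pse(durations, p_long)
    # A PSE below 1,000 ms means a physical 1,000 ms is already judged "long"
    # more than half the time, i.e., time is perceived to drag.
    print(pse < 1000)
    ```

    With these illustrative proportions the 50% crossing falls below 1,000 ms, mirroring the time-drag result reported above.
    
    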

  17. Grounding Context in Face Processing: Color, Emotion and Gender

    Directory of Open Access Journals (Sweden)

    Sandrine Gil

    2015-03-01

    Full Text Available In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (versus green, mixed red/green, and achromatic) background, known to be valenced, on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task on facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was complemented by collecting subjective ratings for each colored background on five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers the effect was modulated by both the nature of the ambiguous emotion and the decoder's gender. Overall, our findings offer evidence that color and gender share a common valence-based dimension.

  18. Your emotion or mine: Labeling feelings alters emotional face perception- An ERP study on automatic and intentional affect labeling

    Directory of Open Access Journals (Sweden)

    Cornelia Herbert

    2013-07-01

    Full Text Available Empirical evidence suggests that words are powerful regulators of emotion processing. Although a number of studies have used words as contextual cues for emotion processing, the role of what is being labeled by the words (i.e., one's own emotion as compared to the emotion expressed by the sender) is poorly understood. The present study reports results from two experiments which used ERP methodology to evaluate the impact of emotional faces and self- versus sender-related emotional pronoun-noun pairs (e.g., my fear vs. his fear) as cues for emotional face processing. The influence of self- and sender-related cues on the processing of fearful, angry, and happy faces was investigated in two contexts: an automatic (experiment 1) and an intentional (experiment 2) affect labeling task, along with control conditions of passive face processing. ERP patterns varied as a function of the label's reference (self vs. sender) and the intentionality of the labeling task (experiment 1 vs. experiment 2). In experiment 1, self-related labels increased the motivational relevance of the emotional faces in the time window of the EPN component. Processing of sender-related labels improved emotion recognition specifically for fearful faces in the N170 time window. Spontaneous processing of affective labels modulated later stages of face processing as well. Amplitudes of the late positive potential (LPP) were reduced for fearful, happy, and angry faces relative to the control condition of passive viewing. During intentional regulation (experiment 2), amplitudes of the LPP were enhanced for emotional faces when subjects used the self-related emotion labels to label their own emotion during face processing, and they rated these faces as higher in arousal than the emotional faces that had been presented in the sender-related labeling condition or the passive viewing condition. The present results argue in favor of a differentiated view of language-as-context for emotion processing.

  19. Reading emotions from faces in two indigenous societies.

    Science.gov (United States)

    Crivelli, Carlos; Jarillo, Sergio; Russell, James A; Fernández-Dols, José-Miguel

    2016-07-01

    That all humans recognize certain specific emotions from their facial expression-the Universality Thesis-is a pillar of research, theory, and application in the psychology of emotion. Its most rigorous test occurs in indigenous societies with limited contact with external cultural influences, but such tests are scarce. Here we report 2 such tests. Study 1 was of children and adolescents (N = 68; aged 6-16 years) of the Trobriand Islands (Papua New Guinea, South Pacific) with a Western control group from Spain (N = 113, of similar ages). Study 2 was of children and adolescents (N = 36; same age range) of Matemo Island (Mozambique, Africa). In both studies, participants were shown an array of prototypical facial expressions and asked to point to the person feeling a specific emotion: happiness, fear, anger, disgust, or sadness. The Spanish control group matched faces to emotions as predicted by the Universality Thesis: matching was seen on 83% to 100% of trials. For the indigenous societies, in both studies, the Universality Thesis was moderately supported for happiness: smiles were matched to happiness on 58% and 56% of trials, respectively. For other emotions, however, results were even more modest: 7% to 46% in the Trobriand Islands and 22% to 53% in Matemo Island. These results were robust across age, gender, static versus dynamic display of the facial expressions, and between- versus within-subjects design. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  20. Passive and motivated perception of emotional faces: qualitative and quantitative changes in the face processing network.

    Directory of Open Access Journals (Sweden)

    Laurie R Skelly

    Full Text Available Emotionally expressive faces are processed by a distributed network of interacting sub-cortical and cortical brain regions. The components of this network have been identified and described in large part by the stimulus properties to which they are sensitive, but as face processing research matures interest has broadened to also probe dynamic interactions between these regions and top-down influences such as task demand and context. While some research has tested the robustness of affective face processing by restricting available attentional resources, it is not known whether face network processing can be augmented by increased motivation to attend to affective face stimuli. Short videos of people expressing emotions were presented to healthy participants during functional magnetic resonance imaging. Motivation to attend to the videos was manipulated by providing an incentive for improved recall performance. During the motivated condition, there was greater coherence among nodes of the face processing network, more widespread correlation between signal intensity and performance, and selective signal increases in a task-relevant subset of face processing regions, including the posterior superior temporal sulcus and right amygdala. In addition, an unexpected task-related laterality effect was seen in the amygdala. These findings provide strong evidence that motivation augments co-activity among nodes of the face processing network and the impact of neural activity on performance. These within-subject effects highlight the necessity to consider motivation when interpreting neural function in special populations, and to further explore the effect of task demands on face processing in healthy brains.

  1. Differential emotion attribution to neutral faces of own and other races.

    Science.gov (United States)

    Hu, Chao S; Wang, Qiandong; Han, Tong; Weare, Ethan; Fu, Genyue

    2017-02-01

    Past research has demonstrated differential recognition of emotion on faces of different races. This paper reports the first study to explore differential emotion attribution to neutral faces of different races. Chinese and Caucasian adults viewed a series of Chinese and Caucasian neutral faces and judged their outward facial expression: neutral, positive, or negative. The results showed that both Chinese and Caucasian viewers perceived more Chinese faces than Caucasian faces as neutral. Nevertheless, Chinese viewers attributed positive emotion to Caucasian faces more than to Chinese faces, whereas Caucasian viewers attributed negative emotion to Caucasian faces more than to Chinese faces. Moreover, Chinese viewers attributed negative and neutral emotion to the faces of both races with no significant difference in frequency, whereas Caucasian viewers mostly attributed neutral emotion to the faces. These differences between Chinese and Caucasian viewers may be due to differential visual experience, culture, racial stereotypes, or expectations about the experiment. We also used eye tracking among the Chinese participants to explore the relationship between face-processing strategy and emotion attribution to neutral faces. The results showed a significant interaction between emotion attribution and face race on face-processing strategy, such as the proportion of fixations on the eyes and saccade amplitude. Additionally, pupil size was larger when processing Caucasian faces than when processing Chinese faces.

  2. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    Science.gov (United States)

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus, because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks, such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed under a promotion focus than under a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed, and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy of a promotion focus (i.e., striving to make hits). A prevention focus had an impact on neither perceptual processing nor facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  3. Veiled emotions: the effect of covered faces on emotion perception and attitudes

    NARCIS (Netherlands)

    Fischer, A.H.; Gillebaart, M.; Rotteveel, M.; Becker, D.; Vliek, M.

    2012-01-01

    The present study explores the relative absence of expressive cues and the effect of contextual cues on the perception of emotions and its effect on attitudes. The visibility of expressive cues was manipulated by showing films displaying female targets whose faces were either fully visible, covered

  4. Faces of the Welfare State

    DEFF Research Database (Denmark)

    Johansen, Mette-Louise

    2012-01-01

    The paper is a short presentation of a central theme in my PhD thesis, which revolves around the way the welfare State is approached as dangerous by parents struggling to keep their family from falling apart. The thesis is an excavation of the risks and dangers they navigate in relation to both the welfare State and other realms of life, such as their family and neighborhood, threatening to take away their children (to crime, to prison, to foster families, or to the divorced partner, etc.). The paper conceptualizes immobility and confinement as embedded in the contrary; a continuous spatial placement…

  5. Dysregulation in cortical reactivity to emotional faces in PTSD patients with high dissociation symptoms

    Directory of Open Access Journals (Sweden)

    Aleksandra Klimova

    2013-09-01

    Full Text Available Background: Predominant dissociation in posttraumatic stress disorder (PTSD) is characterized by restricted affective responses to positive stimuli. To date, no studies have examined neural responses to a range of emotional expressions in PTSD with high dissociative symptoms. Objective: This study tested the hypothesis that PTSD patients with high dissociative symptoms would display increased event-related potential (ERP) amplitudes in early components (N1, P1) to threatening faces (angry, fearful), and reduced later ERP amplitudes (vertex positive potential (VPP), P3) to happy faces, compared to PTSD patients with low dissociative symptoms. Methods: Thirty-nine civilians with PTSD were classified as high dissociative (n = 16) or low dissociative (n = 23) according to their responses on the Clinician Administered Dissociative States Scale. ERPs were recorded while participants viewed emotional (happy, angry, fearful) and neutral facial expressions in a passive viewing task. Results: High dissociative PTSD patients displayed significantly increased N120 amplitude to the majority of facial expressions (neutral, happy, and angry) compared to low dissociative PTSD patients under conscious and preconscious conditions. The high dissociative PTSD group had significantly reduced VPP amplitude to happy faces in the conscious condition. Conclusion: High dissociative PTSD patients displayed increased early (preconscious) cortical responses to emotional stimuli, and specific reductions to happy facial expressions in later (conscious), face-specific components, compared to low dissociative PTSD patients. Dissociation in PTSD may act to increase initial pre-attentive processing of affective stimuli, and specifically reduce cortical reactivity to happy faces when these stimuli are consciously processed.

  6. How should neuroscience study emotions? by distinguishing emotion states, concepts, and experiences

    Science.gov (United States)

    2017-01-01

    Abstract In this debate with Lisa Feldman Barrett, I defend a view of emotions as biological functional states. Affective neuroscience studies emotions in this sense, but it also studies the conscious experience of emotion (‘feelings’), our ability to attribute emotions to others and to animals (‘attribution’, ‘anthropomorphizing’), our ability to think and talk about emotion (‘concepts of emotion’, ‘semantic knowledge of emotion’) and the behaviors caused by an emotion (‘expression of emotions’, ‘emotional reactions’). I think that the most pressing challenge facing affective neuroscience is the need to carefully distinguish between these distinct aspects of ‘emotion’. I view emotion states as evolved functional states that regulate complex behavior, in both people and animals, in response to challenges that instantiate recurrent environmental themes. These functional states, in turn, can also cause conscious experiences (feelings), and their effects and our memories for those effects also contribute to our semantic knowledge of emotions (concepts). Cross-species studies, dissociations in neurological and psychiatric patients, and more ecologically valid neuroimaging designs should be used to partly separate these different phenomena. PMID:27798256

  7. Interactions between facial emotion and identity in face processing: evidence based on redundancy gains.

    Science.gov (United States)

    Yankouskaya, Alla; Booth, David A; Humphreys, Glyn

    2012-11-01

    Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.
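    The superadditivity criterion cited above (Miller, 1982) is commonly checked with the race model inequality: if, at any response time t, the cumulative distribution of redundant-target RTs exceeds the sum of the two single-target cumulative distributions, independent parallel processing cannot account for the redundancy gain. A minimal sketch of that check, using hypothetical RTs rather than the study's data:

    ```python
    def ecdf(sample, t):
        """Empirical cumulative distribution: proportion of RTs at or below t."""
        return sum(rt <= t for rt in sample) / len(sample)

    def race_model_violated(rt_single_a, rt_single_b, rt_redundant, times):
        """True if the redundant-target CDF exceeds the race-model bound
        (sum of the single-target CDFs) at any probed time point."""
        return any(
            ecdf(rt_redundant, t) > ecdf(rt_single_a, t) + ecdf(rt_single_b, t)
            for t in times
        )

    # Hypothetical RTs (ms): redundant targets (identity + emotion) are fast
    # enough that early quantiles exceed the bound, indicating coactivation.
    rt_identity = [520, 540, 560, 580, 600]
    rt_emotion  = [530, 550, 570, 590, 610]
    rt_both     = [420, 440, 460, 480, 500]

    print(race_model_violated(rt_identity, rt_emotion, rt_both,
                              range(400, 700, 10)))
    ```

    In practice the inequality is evaluated at fixed RT quantiles and tested across participants; the fixed grid of time points here is a simplification.
    
    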

  8. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression.

    Science.gov (United States)

    Miskowiak, K W; Glerup, L; Vestbo, C; Harmer, C J; Reinecke, A; Macoveanu, J; Siebner, H R; Kessing, L V; Vinberg, M

    2015-05-01

    Negative cognitive bias and aberrant neural processing of emotional faces are trait marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with a hereditary risk for depression. Thirty healthy, never-depressed monozygotic (MZ) twins with a co-twin history of depression (high-risk group: n = 13) or without a co-twin history of depression (low-risk group: n = 17) were enrolled in a functional magnetic resonance imaging (fMRI) study. During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task, and questionnaires assessing mood, personality traits, and coping strategies. High-risk twins showed increased neural response to happy and fearful faces in dorsal anterior cingulate cortex (ACC), dorsomedial prefrontal cortex (dmPFC), pre-supplementary motor area, and occipito-parietal regions compared to low-risk twins. They also displayed stronger negative coupling between the amygdala and pregenual ACC, dmPFC, and temporo-parietal regions during emotional face processing. These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attentional vigilance for fearful faces and were slower at recognizing facial expressions relative to low-risk controls. These effects occurred in the absence of differences between groups in mood, subjective state, or coping. Differential neural response and functional connectivity within fronto-limbic and occipito-parietal regions during emotional face processing and enhanced fear vigilance may be key endophenotypes for depression.

  9. Are Max-Specified Infant Facial Expressions during Face-to-Face Interaction Consistent with Differential Emotions Theory?

    Science.gov (United States)

    Matias, Reinaldo; Cohn, Jeffrey F.

    1993-01-01

    Examined infant facial expressions at two, four, and six months of age during face-to-face play and a still-face interaction with their mothers. Contrary to differential emotions theory, at no age did proportions or durations of discrete and blended negative expressions differ; they also showed different patterns of developmental change. (MM)

  10. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Science.gov (United States)

    Clayson, Peter E; Larson, Michael J

    2013-01-01

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.
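    The congruency and conflict-adaptation effects this record reports reduce to simple trial-sequence arithmetic: sort each trial by the congruency of the previous and current trial, then compare the congruency effect after congruent versus after incongruent trials. A minimal illustrative sketch (our own code, not the authors'; the function and variable names are assumptions):

```python
def conflict_adaptation(rts, congruent):
    """Compute cell means and a conflict-adaptation score from trial data.

    rts: per-trial response times; congruent: parallel list of booleans
    (True = congruent trial). Cells are labeled by previous/current
    congruency (e.g. "cI" = congruent trial followed by incongruent).
    Adaptation = (cI - cC) - (iI - iC): a positive value means the
    congruency effect shrinks after incongruent trials.
    """
    cells = {"cC": [], "cI": [], "iC": [], "iI": []}
    for i in range(1, len(rts)):  # first trial has no previous trial
        key = ("c" if congruent[i - 1] else "i") + ("C" if congruent[i] else "I")
        cells[key].append(rts[i])
    means = {k: sum(v) / len(v) for k, v in cells.items()}
    congruency_after_c = means["cI"] - means["cC"]
    congruency_after_i = means["iI"] - means["iC"]
    return means, congruency_after_c - congruency_after_i
```

    The same decomposition applies whether the dependent measure is RT, error rate, or an ERP component amplitude.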

  11. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Directory of Open Access Journals (Sweden)

    Peter E Clayson

Full Text Available The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.

  12. Facing emotions in narcolepsy with cataplexy: haemodynamic and behavioural responses during emotional stimulation.

    Science.gov (United States)

    de Zambotti, Massimiliano; Pizza, Fabio; Covassin, Naima; Vandi, Stefano; Cellini, Nicola; Stegagno, Luciano; Plazzi, Giuseppe

    2014-08-01

    Narcolepsy with cataplexy is a complex sleep disorder that affects the modulation of emotions: cataplexy, the key symptom of narcolepsy, is indeed strongly linked with emotions that usually trigger the episodes. Our study aimed to investigate haemodynamic and behavioural responses during emotional stimulation in narco-cataplexy. Twelve adult drug-naive narcoleptic patients (five males; age: 33.3 ± 9.4 years) and 12 healthy controls (five males; age: 30.9 ± 9.5 years) were exposed to emotional stimuli (pleasant, unpleasant and neutral pictures). Heart rate, arterial blood pressure and mean cerebral blood flow velocity of the middle cerebral arteries were continuously recorded using photoplethysmography and Doppler ultrasound. Ratings of valence and arousal and coping strategies were scored by the Self-Assessment Manikin and by questionnaires, respectively. Narcoleptic patients' haemodynamic responses to pictures overlapped with the data obtained from controls: decrease of heart rate and increase of mean cerebral blood flow velocity regardless of pictures' content, increase of systolic blood pressure during the pleasant condition, and relative reduction of heart rate during pleasant and unpleasant conditions. However, when compared with controls, narcoleptic patients reported lower arousal scores during the pleasant and neutral stimulation, and lower valence scores during the pleasant condition, respectively, and also a lower score at the 'focus on and venting of emotions' dimensions of coping. Our results suggested that adult narcoleptic patients, compared with healthy controls, inhibited their emotion-expressive behaviour to emotional stimulation, and that may be related to the development of adaptive cognitive strategies to face emotions avoiding cataplexy. © 2014 European Sleep Research Society.

  13. Influence of Emotional Facial Expressions on 3-5-Year-Olds' Face Recognition

    Science.gov (United States)

    Freitag, Claudia; Schwarzer, Gudrun

    2011-01-01

    Three experiments examined 3- and 5-year-olds' recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression…

  14. Daily emotional states as reported by children and adolescents.

    Science.gov (United States)

    Larson, R; Lampman-Petraitis, C

    1989-10-01

    Hour-to-hour emotional states reported by children, ages 9-15, were examined in order to evaluate the hypothesis that the onset of adolescence is associated with increased emotional variability. These youths carried electronic pagers for 1 week and filled out reports on their emotional states in response to signals received at random times. To evaluate possible age-related response sets, a subset of children was asked to use the same scales to rate the emotions shown in drawings of 6 faces. The expected relation between daily emotional variability and age was not found among the boys and was small among the girls. There was, however, a linear relation between age and average mood states, with the older participants reporting more dysphoric average states, especially more mildly negative states. An absence of age difference in the ratings of the faces indicated that this relation could not be attributed to age differences in response set. Thus, these findings provide little support for the hypothesis that the onset of adolescence is associated with increased emotionality but indicate significant alterations in everyday experience associated with this age period.

  15. The electrophysiological effects of the serotonin 1A receptor agonist buspirone in emotional face processing.

    Science.gov (United States)

    Bernasconi, Fosco; Kometer, Michael; Pokorny, Thomas; Seifritz, Erich; Vollenweider, Franz X

    2015-04-01

Emotional face processing is critically modulated by the serotonergic system, and serotonin (5-HT) receptor agonists impair emotional face processing. However, the specific contribution of the 5-HT1A receptor remains poorly understood. Here we investigated the spatiotemporal brain mechanisms underpinning the modulation of emotional face processing induced by buspirone, a partial 5-HT1A receptor agonist. In a psychophysical emotional-face discrimination task, we observed that discrimination of fearful versus neutral faces was reduced, but not of happy versus neutral faces. Electrical neuroimaging analyses were applied to visual evoked potentials elicited by emotional face images after placebo and buspirone administration. Buspirone modulated response strength (i.e., global field power) in the interval 230-248 ms after stimulus onset. Distributed source estimation over this time interval revealed that buspirone decreased the neural activity in the right dorsolateral prefrontal cortex that was evoked by fearful faces. These results indicate temporal and valence-specific effects of buspirone on the neuronal correlates of emotional face processing. Furthermore, the reduced neural activity in the dorsolateral prefrontal cortex in response to fearful faces suggests reduced attention to fearful faces. Collectively, these findings provide new insights into the role of 5-HT1A receptors in emotional face processing and have implications for affective disorders that are characterized by increased attention to negative stimuli. Copyright © 2015 Elsevier B.V. and ECNP. All rights reserved.

  16. Adult age-differences in subjective impression of emotional faces are reflected in emotion-related attention and memory tasks

    Directory of Open Access Journals (Sweden)

Joakim Svärd

    2014-05-01

Full Text Available Although younger and older adults appear to attend to and remember emotional faces differently, less is known about age-related differences in the subjective emotional impression (arousal, potency, and valence) of emotional faces and how these differences, in turn, are reflected in age differences in various emotional tasks. In the current study, we used the same facial emotional stimuli (angry and happy faces) in four tasks: emotional rating, attention, categorical perception, and visual short-term memory (VSTM). The aim of this study was to investigate effects of age on the subjective emotional impression of angry and happy faces and to examine whether any age differences were mirrored in measures of emotional behavior (attention, categorical perception, and memory). In addition, regression analyses were used to further study impression-behavior associations. Forty younger adults (range 20-30 years) and thirty-nine older adults (range 65-75 years) participated in the experiment. The emotional rating task showed that older adults perceived less arousal, potency, and valence than younger adults and that the difference was more pronounced for angry than happy faces. Similarly, the results of the attention and memory tasks demonstrated interaction effects between emotion and age, and age differences on these measures were larger for angry than for happy faces. Regression analyses confirmed that in both age groups, higher potency ratings predicted both visual search and visual short-term memory efficiency. Future studies should consider the possibility that age differences in the subjective emotional impression of facial emotional stimuli may explain age differences in attention to and memory of such stimuli.

  17. Emotional intelligence and recovering from induced negative emotional state

    Directory of Open Access Journals (Sweden)

    Joaquín T. Limonero

    2015-06-01

Full Text Available The aim of the present study was to examine the relationship between emotional intelligence and recovery from induced negative emotions, using a performance test to measure emotional intelligence (EI). Sixty-seven undergraduates participated in the procedure, which lasted 75 minutes and was divided into three stages. At Time 1, subjects answered the STAI-S and POMS-A, and EI was assessed by the MSCEIT. At Time 2, negative emotions were induced by 9 pictures taken from the International Affective Picture System (IAPS), and participants were asked to complete a second STAI-S and the POMS-B questionnaires. At Time 3, participants were allowed to rest while doing a distracting task and were then asked to complete a third STAI-S and POMS-A questionnaires. Results showed that the MSCEIT branches of emotional facilitation and emotional understanding are related to previous mood states and mood recovery, but not to mood reactivity. This finding contrasts with studies in which emotional recovery was assessed in relation to self-reported EI measures, highlighting the role of perception and emotional regulation.

  18. Neutral face classification using personalized appearance models for fast and robust emotion detection.

    Science.gov (United States)

    Chiranjeevi, Pojala; Gopalakrishnan, Viswanath; Moogi, Pratibha

    2015-09-01

Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, and so on, in the limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in typical applications such as video chat or photo album/web browsing. Detecting the neutral state at an early stage, thereby bypassing those frames from emotion classification, would save computational power. In this paper, we propose a lightweight neutral versus emotion classification engine, which acts as a pre-processor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at key emotion (KE) points using a statistical texture model, constructed by a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on a statistical texture model. Robustness to dynamic shift of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
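    The gating idea this record describes (compare per-user neutral texture around key points, and skip the heavier emotion classifier when the frame looks neutral) can be sketched very roughly as follows. The patch representation, the zero-normalized correlation similarity, and the threshold are our simplifications for illustration, not the paper's actual statistical texture model:

```python
import math

def zncc(a, b):
    """Zero-normalized cross-correlation between two equal-length patches."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sa = math.sqrt(sum((x - ma) ** 2 for x in a) / n) + 1e-8
    sb = math.sqrt(sum((x - mb) ** 2 for x in b) / n) + 1e-8
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (n * sa * sb)

def is_neutral(frame_patches, neutral_template, threshold=0.8):
    """Gate frames before a heavier emotion classifier.

    frame_patches / neutral_template: dicts mapping a key-emotion-point
    name (e.g. "mouth") to a flattened grayscale patch. A frame is
    treated as neutral when its patches correlate strongly, on average,
    with the per-user neutral template. Keys and threshold are
    illustrative assumptions.
    """
    sims = [zncc(patch, neutral_template[name])
            for name, patch in frame_patches.items()]
    return sum(sims) / len(sims) >= threshold
```

    Frames gated out as neutral never reach the supervised classifier, which is where the computational saving in the abstract comes from.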

  19. Processing emotional body expressions: state-of-the-art.

    Science.gov (United States)

    Enea, Violeta; Iancu, Sorina

    2016-10-01

Processing of emotional body expressions has recently become an important topic in affective and social neuroscience, alongside the investigation of facial expressions. The objective of this study is to review the literature on emotional body expressions in order to discuss the current state of knowledge on this topic and identify directions for future research. The following electronic databases were searched: PsychINFO, Ebsco, ERIC, ProQuest, Sagepub, and SCOPUS, using terms such as "body," "bodily expression," "body perception," "emotions," "posture," "body recognition" and combinations of them. The synthesis revealed several research questions that were addressed in neuroimaging, electrophysiological and behavioral studies. Among them, one important question targeted the neural mechanisms of emotional processing of body expressions, with specific subsections regarding the time course for the integration of emotional signals from face and body, as well as the role of context in the perception of emotional signals. Processing bodily expressions of emotion is similar to processing facial expressions, and holistic processing is extended to the whole person. The current state of the art in processing emotional body expressions may lead to a better understanding of the underlying neural mechanisms of social behavior. At the end of the review, suggestions for future research directions are presented.

  20. Congruence of happy and sad emotion in music and faces modifies cortical audiovisual activation.

    Science.gov (United States)

    Jeong, Jeong-Won; Diwadkar, Vaibhav A; Chugani, Carla D; Sinsoongsud, Piti; Muzik, Otto; Behen, Michael E; Chugani, Harry T; Chugani, Diane C

    2011-02-14

The powerful emotion inducing properties of music are well-known, yet music may convey differing emotional responses depending on environmental factors. We hypothesized that neural mechanisms involved in listening to music may differ when presented together with visual stimuli that conveyed the same emotion as the music when compared to visual stimuli with incongruent emotional content. We designed this study to determine the effect of auditory (happy and sad instrumental music) and visual stimuli (happy and sad faces) congruent or incongruent for emotional content on audiovisual processing using fMRI blood oxygenation level-dependent (BOLD) signal contrast. The experiment was conducted in the context of a conventional block-design experiment. A block consisted of three emotional ON periods, music alone (happy or sad music), face alone (happy or sad faces), and music combined with faces where the music excerpt was played while presenting either congruent emotional faces or incongruent emotional faces. We found activity in the superior temporal gyrus (STG) and fusiform gyrus (FG) to be differentially modulated by music and faces depending on the congruence of emotional content. There was a greater BOLD response in STG when the emotion signaled by the music and faces was congruent. Furthermore, the magnitude of these changes differed for happy congruence and sad congruence, i.e., the activation of STG when happy music was presented with happy faces was greater than the activation seen when sad music was presented with sad faces. In contrast, incongruent stimuli diminished the BOLD response in STG and elicited greater signal change in bilateral FG. Behavioral testing supplemented these findings by showing that subject ratings of emotion in faces were influenced by emotion in music. When presented with happy music, happy faces were rated as more happy (p=0.051) and sad faces were rated as less sad (p=0.030). When presented with sad music, happy faces were rated as less

  1. Detection of Emotional Faces: Salient Physical Features Guide Effective Visual Search

    Science.gov (United States)

    Calvo, Manuel G.; Nummenmaa, Lauri

    2008-01-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent,…

  2. Electrophysiological correlates of emotional face processing in typically developing adults and adults with high functioning Autism

    OpenAIRE

    Barrie, Jennifer Nicole

    2012-01-01

    Emotional expressions have been found to affect various event-related potentials (ERPs). Furthermore, socio-emotional functioning is altered in individuals with autism, and a growing body of neuroimaging and electrophysiological evidence substantiates underlying neural differences for face processing in this population. However, relatively few studies have examined the time-course of emotional face processing in autism. This study examined how implicit (not the intended focus of attention) ve...

  3. Acute pharmacologically induced shifts in serotonin availability abolish emotion-selective responses to negative face emotions in distinct brain networks

    DEFF Research Database (Denmark)

    Grady, Cheryl Lynn; Siebner, Hartwig R; Hornboll, Bettina

    2013-01-01

Pharmacological manipulation of serotonin availability can alter the processing of facial expressions of emotion. Using a within-subject design, we measured the effect of serotonin on the brain's response to aversive face emotions with functional MRI while 20 participants judged the gender of neutral, fearful and angry faces. In three separate and counterbalanced sessions, participants received citalopram (CIT) to raise serotonin levels, underwent acute tryptophan depletion (ATD) to lower serotonin, or were studied without pharmacological challenge (Control). An analysis designed to identify

  4. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias

    OpenAIRE

Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emo...

  5. Emotion regulation in mothers and young children faced with trauma.

    Science.gov (United States)

    Pat-Horenczyk, Ruth; Cohen, S; Ziv, Y; Achituv, M; Asulin-Peretz, L; Blanchard, T R; Schiff, M; Brom, D

    2015-01-01

The present study investigated maternal emotion regulation as mediating the association between maternal posttraumatic stress symptoms and children's emotional dysregulation in a community sample of 431 Israeli mothers and children exposed to trauma. Little is known about the specific pathways through which maternal posttraumatic symptoms and deficits in emotion regulation contribute to emotional dysregulation. Inspired by the intergenerational process of relational posttraumatic stress disorder (PTSD), in which posttraumatic distress is transmitted from mothers to children, we suggest an analogous concept of relational emotion regulation, by which maternal emotion regulation problems may contribute to child emotion regulation deficits. Child emotion regulation problems were measured using the Child Behavior Checklist-Dysregulation Profile (CBCL-DP; T.M. Achenbach & I. Rescorla, 2000), which comprises three subscales of the CBCL: Attention, Aggression, and Anxiety/Depression. Maternal PTSD symptoms were assessed by the Posttraumatic Diagnostic Scale (E.B. Foa, L. Cashman, L. Jaycox, & K. Perry, 1997) and maternal emotion regulation by the Difficulties in Emotion Regulation Scale (K.L. Gratz & L. Roemer, 2004). Results showed that the child's emotion regulation problems were associated with both maternal posttraumatic symptoms and maternal emotion dysregulation. Further, maternal emotion regulation mediated the association between maternal posttraumatic symptoms and the child's regulation deficits. These findings highlight the central role of mothers' emotion regulation skills in the aftermath of trauma as it relates to children's emotion regulation skills. The degree of mothers' regulatory skills in the context of posttraumatic stress symptoms reflects a key process through which the intergenerational transmission of trauma may occur. Study results have critical implications for planning and developing clinical interventions geared toward the treatment of
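    The mediation claim this record makes (maternal emotion regulation mediating the path from maternal posttraumatic symptoms to child dysregulation) is conventionally decomposed into total, direct, and indirect effects via two regressions. A minimal product-of-coefficients sketch under our own naming, not the authors' actual statistical model:

```python
import numpy as np

def ols(y, *cols):
    """Least-squares coefficients for y ~ 1 + cols (intercept first)."""
    X = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(X, y, rcond=None)[0]

def mediation_effects(x, m, y):
    """Simple-mediation decomposition (product of coefficients).

    x: predictor (e.g. maternal posttraumatic symptoms),
    m: mediator (e.g. maternal emotion-regulation difficulties),
    y: outcome (e.g. child dysregulation).
    Returns (total, direct, indirect); total is approximately
    direct + indirect in the linear case.
    """
    total = ols(y, x)[1]       # c path:  y ~ x
    a = ols(m, x)[1]           # a path:  m ~ x
    coefs = ols(y, x, m)       # y ~ x + m
    direct, b = coefs[1], coefs[2]
    return total, direct, a * b
```

    In practice the indirect effect would be tested with bootstrapped confidence intervals rather than read off directly, but the decomposition itself is the same.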

  6. Time for a Change: College Students' Preference for Technology-Mediated Versus Face-to-Face Help for Emotional Distress.

    Science.gov (United States)

    Lungu, Anita; Sun, Michael

    2016-12-01

Even with recent advances in psychological treatments and mobile technology, online computerized therapy is not yet popular. College students, who have ubiquitous access to technology, experience high distress, and are often nontreatment seekers, could be an important population for online treatment dissemination. Finding ways to reach out to college students by offering psychological interventions through the technology, devices, and applications they often use might increase their engagement in treatment. This study evaluates college students' reported willingness to seek help for emotional distress through novel delivery mediums, to play computer games for learning emotional coping skills, and to disclose personal information online. We also evaluated the role of ethnicity and level of emotional distress in help-seeking patterns. A survey exploring our domains of interest and the Mental Health Inventory (MHI; used as a mental health index) were completed by 572 students (mean age 18.7 years; predominantly Asian American, female, and freshmen in college). More participants expressed preference for online versus face-to-face professional help. We found no relationship between MHI and help-seeking preference. A third of participants were likely to disclose at least as much information online as face-to-face. Ownership of mobile technology was pervasive. Asian Americans were more likely to be nontreatment seekers than Caucasians. Most participants were interested in serious games for emotional distress. Our results suggest that college students are very open to creative ways of receiving emotional help, such as playing games and seeking emotional help online, suggesting a need for online evidence-based treatments.

  7. Faces and bodies: perception and mimicry of emotionally congruent and incongruent facial and bodily expressions

    Directory of Open Access Journals (Sweden)

Mariska E. Kret

    2013-02-01

Full Text Available Traditional emotion theories stress the importance of the face in the expression of emotions, but bodily expressions are becoming increasingly important. Here we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emotionally congruent and incongruent face-body compounds. Participants' fixations were measured and their pupil size recorded with eye-tracking equipment, and their facial reactions measured with electromyography (EMG). The behavioral results support our prediction that the recognition of a facial expression is improved in the context of a matching posture and, importantly, also vice versa. From their facial expressions, it appeared that observers reacted with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus activity) to happy facial expressions. As we predicted and found, angry and fearful cues from the face or the body attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion as facial expressions and that congruency between the emotional signals from the face and body improves recognition of the emotion.

  8. Testing the effects of expression, intensity and age on emotional face processing in ASD.

    Science.gov (United States)

    Luyster, Rhiannon J; Bick, Johanna; Westerlund, Alissa; Nelson, Charles A

    2017-06-21

Individuals with autism spectrum disorder (ASD) commonly show global deficits in the processing of facial emotion, including impairments in emotion recognition and slowed processing of emotional faces. Growing evidence suggests that these challenges may increase with age, perhaps due to minimal improvement with age in individuals with ASD. In the present study, we explored the role of age, emotion type and emotion intensity in face processing for individuals with and without ASD. Twelve-year-old children and 18-22-year-old adults with and without ASD participated. No significant diagnostic group differences were observed on behavioral measures of emotion processing for younger versus older individuals with and without ASD. However, there were significant group differences in neural responses to emotional faces. Relative to TD participants, at 12 years of age and during adulthood, individuals with ASD showed a slower N170 to emotional faces. While the TD group's P1 latency was significantly shorter in adults than in 12-year-olds, there was no significant age-related difference in P1 latency among individuals with ASD. Findings point to potential differences in the maturation of cortical networks that support visual processing (whether of faces or of stimuli more broadly) among individuals with and without ASD between late childhood and adulthood. Finally, associations between ERP amplitudes and behavioral responses on emotion processing tasks suggest possible neural markers for emotional and behavioral deficits among individuals with ASD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Young Adults with Autism Spectrum Disorder Show Early Atypical Neural Activity during Emotional Face Processing

    Directory of Open Access Journals (Sweden)

    Rachel C. Leung

    2018-02-01

Full Text Available Social cognition is impaired in autism spectrum disorder (ASD). The ability to perceive and interpret affect is integral to successful social functioning and has an extended developmental course. However, the neural mechanisms underlying emotional face processing in ASD are unclear. Using magnetoencephalography (MEG), the present study explored neural activation during implicit emotional face processing in young adults with and without ASD. Twenty-six young adults with ASD and 26 healthy controls were recruited. Participants indicated the location of a scrambled pattern (target) that was presented alongside a happy or angry face. Emotion-related activation sources for each emotion were estimated using the Empirical Bayes Beamformer (pcorr ≤ 0.001) in Statistical Parametric Mapping 12 (SPM12). Emotional faces elicited elevated fusiform, amygdala and anterior insula activity and reduced anterior cingulate cortex (ACC) activity in adults with ASD relative to controls. Within-group comparisons revealed that angry vs. happy faces elicited distinct neural activity in typically developing adults; there was no such distinction in young adults with ASD. Our data suggest that difficulties in affect processing in ASD reflect atypical recruitment of traditional emotional processing areas. These early differences may contribute to difficulties in deriving social reward from faces, ascribing salience to faces, and an immature threat processing system, which collectively could result in deficits in emotional face processing.

  10. Embodied Appraisals and Non-emotional States

    Czech Academy of Sciences Publication Activity Database

    Hvorecký, Juraj

    2010-01-01

    Vol. 20, No. 3 (2010), pp. 215-223 ISSN 1210-3055 R&D Projects: GA AV ČR(CZ) KJB900090802 Institutional research plan: CEZ:AV0Z90090514 Keywords: embodied appraisal * non-emotional mental states * valence * emotion Subject RIV: AA - Philosophy; Religion

  11. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    Directory of Open Access Journals (Sweden)

    Kris Evers

    2014-01-01

    Full Text Available Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  12. No differences in emotion recognition strategies in children with autism spectrum disorder: evidence from hybrid faces.

    Science.gov (United States)

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  13. Differences in neural and cognitive response to emotional faces in middle-aged dizygotic twins at familial risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Svendsen, A M B; Harmer, C J

    2017-01-01

    BACKGROUND: Negative bias and aberrant neural processing of emotional faces are trait-marks of depression but findings in healthy high-risk groups are conflicting. METHODS: Healthy middle-aged dizygotic twins (N = 42) underwent functional magnetic resonance imaging (fMRI): 22 twins had a co-twin history of depression (high-risk) and 20 were without co-twin history of depression (low-risk). During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task … the amygdala and ventral prefrontal cortex and pregenual anterior cingulate. This was accompanied by greater fear-specific fronto-temporal response and reduced fronto-occipital response to all emotional faces relative to baseline. The risk groups showed no differences in mood, subjective state or coping…

  14. An emotional Stroop task with faces and words. A comparison of young and older adults.

    Science.gov (United States)

    Agustí, Ana I; Satorres, Encarnación; Pitarque, Alfonso; Meléndez, Juan C

    2017-08-01

    Given the contradictions in previous studies of age-related changes in attentional responses, an emotional Stroop task was used to compare young and older adults' responses to words or faces with an emotional valence. The words "happy" or "sad" were superimposed on faces expressing happiness or sadness. The emotion expressed by the word and the face could agree or not (cued and uncued trials, respectively). 85 young and 66 healthy older adults had to identify faces and words separately, and the interference between the two types of stimuli was examined. An interference effect was observed for both types of stimuli in both groups. There was more interference for positive faces and words than for negative stimuli. Older adults had more difficulty than younger adults in focusing on positive uncued trials, whereas there was no difference between samples on negative uncued trials. Copyright © 2017 Elsevier Inc. All rights reserved.
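The interference measure described in this record is a reaction-time contrast: mean RT on uncued (incongruent) trials minus mean RT on cued (congruent) trials, computed per valence. A minimal sketch of that computation; the reaction times and trial labels below are made-up illustrations, not the study's data:

```python
# Hypothetical reaction times (ms) by valence and congruency condition.
rts_ms = {
    ("positive", "cued"): [610, 598, 605],
    ("positive", "uncued"): [672, 688, 661],
    ("negative", "cued"): [615, 603, 611],
    ("negative", "uncued"): [640, 652, 633],
}

def mean(xs):
    return sum(xs) / len(xs)

# Interference = RT cost of incongruent (uncued) relative to congruent (cued) trials.
for valence in ("positive", "negative"):
    interference = mean(rts_ms[(valence, "uncued")]) - mean(rts_ms[(valence, "cued")])
    print(f"{valence} interference: {interference:.1f} ms")
```

With these toy numbers the positive-valence interference exceeds the negative one, mirroring the pattern of results the abstract reports.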

  15. Lateralized hybrid faces: evidence of a valence-specific bias in the processing of implicit emotions.

    Science.gov (United States)

    Prete, Giulia; Laeng, Bruno; Tommasi, Luca

    2014-01-01

    It is well known that hemispheric asymmetries exist for both the analysis of low-level visual information (such as spatial frequency) and high-level visual information (such as emotional expressions). In this study, we assessed which of the above factors underlies perceptual laterality effects with "hybrid faces": a type of stimulus that allows testing for unaware processing of emotional expressions, in which the emotion is displayed in the low-frequency information while an image of the same face with a neutral expression is superimposed on it. Despite hybrid faces being perceived as neutral, the emotional information modulates observers' social judgements. In the present study, participants were asked to assess the friendliness of hybrid faces displayed tachistoscopically, either centrally or laterally to fixation. We found a clear influence of the hidden emotions even with lateral presentations. Happy faces were rated as more friendly and angry faces as less friendly with respect to neutral faces. In general, hybrid faces were evaluated as less friendly when they were presented in the left visual field/right hemisphere than in the right visual field/left hemisphere. The results extend the validity of the valence hypothesis to the specific domain of unaware (subcortical) emotion processing.

  16. Emotional Faces Capture Spatial Attention in 5-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Kit K. Elam

    2010-10-01

    Full Text Available Emotional facial expressions are important social cues that convey salient affective information. Infants, younger children, and adults all appear to orient spatial attention to emotional faces, with a particularly strong bias to fearful faces. Yet in young children it is unclear whether both happy and fearful faces capture attention. Given that the processing of emotional faces is believed by some to serve an evolutionarily adaptive purpose, attentional biases to both fearful and happy expressions would be expected in younger children. However, the extent to which this bias is present in young children, and whether it is genetically mediated, is untested. Therefore, the aims of the current study were to assess the spatial-attentional properties of emotional faces in young children, with a preliminary test of whether this effect was influenced by genetics. Five-year-old twin pairs performed a dot-probe task. The results suggest that children preferentially direct spatial attention to emotional faces, particularly right visual field faces. The results provide support for the notion that the direction of spatial attention to emotional faces serves an evolutionarily adaptive function and may be mediated by genetic mechanisms.

  17. Modulation of the composite face effect by unintended emotion cues

    OpenAIRE

    Gray, Katie L. H.; Murphy, Jennifer; Marsh, Jade E.; Cook, Richard

    2017-01-01

    When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers ‘fuse’ the two halves together, perceptually. The illusory distortion induced by task-irrelevant (‘distractor’) halves hinders participants’ judgments about task-relevant (‘target’) halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, obser...

  18. Effects of acute psychosocial stress on neural activity to emotional and neutral faces in a face recognition memory paradigm.

    Science.gov (United States)

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2014-12-01

    Previous studies have shown that acute psychosocial stress impairs recognition memory for declarative material and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala, which modulates memory processes in the hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoking male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner, which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants, but did not impact recognition memory. BOLD data during recognition revealed a stress-condition-by-emotion interaction in the left inferior frontal gyrus and right hippocampus, which was due to a stress-induced increase of neural activity to fearful faces and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with stress-induced privileged processing of emotional stimuli.

  19. Social anhedonia is associated with neural abnormalities during face emotion processing.

    Science.gov (United States)

    Germine, Laura T; Garrido, Lucia; Bruce, Lori; Hooker, Christine

    2011-10-01

    Human beings are social organisms with an intrinsic desire to seek and participate in social interactions. Social anhedonia is a personality trait characterized by a reduced desire for social affiliation and reduced pleasure derived from interpersonal interactions. Abnormally high levels of social anhedonia prospectively predict the development of schizophrenia and contribute to poorer outcomes for schizophrenia patients. Despite the strong association between social anhedonia and schizophrenia, the neural mechanisms that underlie individual differences in social anhedonia have not been studied and are thus poorly understood. Deficits in face emotion recognition are related to poorer social outcomes in schizophrenia, and it has been suggested that face emotion recognition deficits may be a behavioral marker for schizophrenia liability. In the current study, we used functional magnetic resonance imaging (fMRI) to see whether there are differences in the brain networks underlying basic face emotion processing in a community sample of individuals low vs. high in social anhedonia. We isolated the neural mechanisms related to face emotion processing by comparing face emotion discrimination with four other baseline conditions (identity discrimination of emotional faces, identity discrimination of neutral faces, object discrimination, and pattern discrimination). Results showed a group (high/low social anhedonia) × condition (emotion discrimination/control condition) interaction in the anterior portion of the rostral medial prefrontal cortex, right superior temporal gyrus, and left somatosensory cortex. As predicted, high (relative to low) social anhedonia participants showed less neural activity in face emotion processing regions during emotion discrimination as compared to each control condition. The findings suggest that social anhedonia is associated with abnormalities in networks responsible for basic processes associated with social cognition, and provide a

  20. Processing Distracting Non-face Emotional Images: No Evidence of an Age-Related Positivity Effect.

    Science.gov (United States)

    Madill, Mark; Murray, Janice E

    2017-01-01

    Cognitive aging may be accompanied by increased prioritization of social and emotional goals that enhance positive experiences and emotional states. The socioemotional selectivity theory suggests this may be achieved by giving preference to positive information and avoiding or suppressing negative information. Although there is some evidence of a positivity bias in controlled attention tasks, it remains unclear whether a positivity bias extends to the processing of affective stimuli presented outside focused attention. In two experiments, we investigated age-related differences in the effects of to-be-ignored non-face affective images on target processing. In Experiment 1, 27 older (64-90 years) and 25 young adults (19-29 years) made speeded valence judgments about centrally presented positive or negative target images taken from the International Affective Picture System. To-be-ignored distractor images were presented above and below the target image and were either positive, negative, or neutral in valence. The distractors were considered task relevant because they shared emotional characteristics with the target stimuli. Both older and young adults responded slower to targets when distractor valence was incongruent with target valence relative to when distractors were neutral. Older adults responded faster to positive than to negative targets but did not show increased interference effects from positive distractors. In Experiment 2, affective distractors were task irrelevant as the target was a three-digit array and did not share emotional characteristics with the distractors. Twenty-six older (63-84 years) and 30 young adults (18-30 years) gave speeded responses on a digit disparity task while ignoring the affective distractors positioned in the periphery. Task performance in either age group was not influenced by the task-irrelevant affective images. In keeping with the socioemotional selectivity theory, these findings suggest that older adults preferentially

  1. How Context Influences Our Perception of Emotional Faces

    DEFF Research Database (Denmark)

    Calbi, Marta; Heimann, Katrin; Barratt, Daniel

    2017-01-01

    …corresponding to one of the so-called 'basic emotions.' However, our real experience during social interactions is different: facial expressions of emotion are mostly perceived in a wider context, constituted by body language, the surrounding environment, and our beliefs and expectations. Already in the early twentieth century, the Russian filmmaker Lev Kuleshov argued that such context, established by intermediate shots of strong emotional content, could significantly change our interpretation of facial expressions in film. Prior experiments have shown behavioral effects pointing in this direction, but have…

  2. Cyber Victimization in High School: Measurement, Overlap with Face-to-Face Victimization, and Associations with Social-Emotional Outcomes

    Science.gov (United States)

    Brown, Christina Flynn; Demaray, Michelle Kilpatrick; Tennant, Jaclyn E.; Jenkins, Lyndsay N.

    2017-01-01

    Cyber victimization is a contemporary problem facing youth and adolescents (Diamanduros, Downs, & Jenkins, 2008; Kowalski & Limber, 2007). It is imperative for researchers and school personnel to understand the associations between cyber victimization and student social-emotional outcomes. This article explores (a) gender differences in…

  3. Memory for faces and voices varies as a function of sex and expressed emotion.

    Science.gov (United States)

    S Cortes, Diana; Laukka, Petri; Lindahl, Christina; Fischer, Håkan

    2017-01-01

    We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.
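The accuracy measure used in this record, hits minus false alarms, corrects raw recognition rates for a participant's tendency to respond "old" regardless of whether the item was studied. A minimal sketch of the computation (the response counts are hypothetical examples, not the study's data):

```python
def recognition_accuracy(old_responses, new_responses):
    """Corrected recognition accuracy: hit rate minus false-alarm rate.

    old_responses: booleans -- "old" judgments for studied items (True = hit)
    new_responses: booleans -- "old" judgments for unstudied lures (True = false alarm)
    """
    hit_rate = sum(old_responses) / len(old_responses)
    false_alarm_rate = sum(new_responses) / len(new_responses)
    return hit_rate - false_alarm_rate

# Hypothetical: 40 of 50 studied faces recognized, 10 of 50 lures falsely "recognized".
acc = recognition_accuracy([True] * 40 + [False] * 10, [True] * 10 + [False] * 40)
print(round(acc, 2))  # prints 0.6
```

A participant who says "old" to everything scores 0.0 on this measure, which is why it is preferred over the raw hit rate when response bias differs across conditions.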

  4. The Effect of Self-Referential Expectation on Emotional Face Processing.

    Directory of Open Access Journals (Sweden)

    Mel McKendrick

    Full Text Available The role of self-relevance has been somewhat neglected in static face processing paradigms but may be important in understanding how emotional faces impact on attention, cognition and affect. The aim of the current study was to investigate the effect of self-relevant primes on processing emotional composite faces. Sentence primes created an expectation of the emotion of the face before sad, happy, neutral or composite face photos were viewed. Eye movements were recorded and subsequent responses measured the cognitive and affective impact of the emotion expressed. Results indicated that primes did not guide attention, but impacted on judgments of valence intensity and self-esteem ratings. Negative self-relevant primes led to the most negative self-esteem ratings, although the effect of the prime was qualified by salient facial features. Self-relevant expectations about the emotion of a face and subsequent attention to a face that is congruent with these expectations strengthened the affective impact of viewing the face.

  5. Memory for faces and voices varies as a function of sex and expressed emotion.

    Directory of Open Access Journals (Sweden)

    Diana S Cortes

    Full Text Available We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.

  6. Brain and behavioral inhibitory control of kindergartners facing negative emotions.

    Science.gov (United States)

    Farbiash, Tali; Berger, Andrea

    2016-09-01

    Inhibitory control (IC) - one of the most critical functions underlying a child's ability to self-regulate - develops significantly throughout the kindergarten years. Experiencing negative emotions imposes challenges on executive functioning and may specifically affect IC. In this study, we examined kindergartners' IC and its related brain activity during a negative emotional situation: 58 children (aged 5.5-6.5 years) performed an emotion-induction Go/NoGo task. During this task, we recorded children's performance and brain activity, focusing on the fronto-central N2 component in the event-related potential (ERP) and the power of its underlying theta frequency. Compared to Go trials, inhibition of NoGo trials was associated with larger N2 amplitudes and theta power. The negative emotional experience resulted in better IC performance and, at the brain level, in larger theta power. Source localization of this effect showed that the brain activity related to IC during the negative emotional experience was principally generated in the posterior frontal regions. Furthermore, the band power measure was found to be a more sensitive index for children's inhibitory processes than N2 amplitudes. This is the first study to focus on kindergartners' IC while manipulating their emotional experience to induce negative emotions. Our findings suggest that a kindergartner's experience of negative emotion can result in improved IC and increases in associated aspects of brain activity. Our results also suggest the utility of time-frequency analyses in the study of brain processes associated with response inhibition in young children. © 2015 John Wiley & Sons Ltd.

  7. Age-Group Differences in Interference from Young and Older Emotional Faces.

    Science.gov (United States)

    Ebner, Natalie C; Johnson, Marcia K

    2010-11-01

    Human attention is selective, focusing on some aspects of events at the expense of others. In particular, angry faces engage attention. Most studies have used pictures of young faces, even when comparing young and older age groups. Two experiments asked (1) whether task-irrelevant faces of young and older individuals with happy, angry, and neutral expressions disrupt performance on a face-unrelated task, (2) whether interference varies for faces of different ages and different facial expressions, and (3) whether young and older adults differ in this regard. Participants gave speeded responses on a number task while irrelevant faces appeared in the background. Both age groups were more distracted by own than other-age faces. In addition, young participants' responses were slower for angry than happy faces, whereas older participants' responses were slower for happy than angry faces. Factors underlying age-group differences in interference from emotional faces of different ages are discussed.

  8. Effects of facial emotion recognition remediation on visual scanning of novel face stimuli.

    Science.gov (United States)

    Marsh, Pamela J; Luckett, Gemma; Russell, Tamara; Coltheart, Max; Green, Melissa J

    2012-11-01

    Previous research shows that emotion recognition in schizophrenia can be improved with targeted remediation that draws attention to important facial features (eyes, nose, mouth). Moreover, the effects of training have been shown to last for up to one month after training. The aim of this study was to investigate whether improved emotion recognition of novel faces is associated with concomitant changes in visual scanning of these same novel facial expressions. Thirty-nine participants with schizophrenia received emotion recognition training using Ekman's Micro-Expression Training Tool (METT), with emotion recognition and visual scanpath (VSP) recordings to face stimuli collected simultaneously. Baseline ratings of interpersonal and cognitive functioning were also collected from all participants. Post-METT training, participants showed changes in foveal attention to the features of facial expressions of emotion not used in METT training, which were generally consistent with the information about important features from the METT. In particular, there were changes in how participants looked at the features of facial expressions of emotion surprise, disgust, fear, happiness, and neutral, demonstrating that improved emotion recognition is paralleled by changes in the way participants with schizophrenia viewed novel facial expressions of emotion. However, there were overall decreases in foveal attention to sad and neutral faces that indicate more intensive instruction might be needed for these faces during training. Most importantly, the evidence shows that participant gender may affect training outcomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Asymmetric Engagement of Amygdala and Its Gamma Connectivity in Early Emotional Face Processing

    Science.gov (United States)

    Liu, Tai-Ying; Chen, Yong-Sheng; Hsieh, Jen-Chuen; Chen, Li-Fen

    2015-01-01

    The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (with duration around 20–30 ms), whereas activity in the left amygdala (with duration around 50–60 ms) sustained longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions and the left amygdala might be involved in decoding or evaluating expressive faces in the early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, representing its evolutional significance (survival). These findings demonstrate the asymmetric engagement of bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry. PMID:25629899

  10. Investigating emotional top down modulation of ambiguous faces by single pulse TMS on early visual cortices

    Directory of Open Access Journals (Sweden)

    Zachary Adam Yaple

    2016-06-01

    Full Text Available Top-down processing is a mechanism in which memory, context and expectation are used to perceive stimuli. In this study we investigated how emotional content, induced by music mood, influences the perception of happy and sad emoticons. Using single-pulse TMS we stimulated the right occipital face area (rOFA), primary visual cortex (V1) and the vertex while subjects performed a face-detection task and listened to happy and sad music. At baseline, incongruent audio-visual pairings decreased performance, demonstrating a dependence on emotion while perceiving ambiguous faces. However, performance on face identification decreased during rOFA stimulation regardless of emotional content. No effects were found for vertex (Cz) or V1 stimulation. These results suggest that rOFA is important for processing faces regardless of emotion, whereas V1 stimulation had no effect. Our findings suggest that early visual cortex activity may not integrate emotional auditory information with visual information during top-down emotional modulation of faces.

  11. Neuropsychology of facial expressions. The role of consciousness in processing emotional faces

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2012-04-01

    Full Text Available Neuropsychological studies have underlined the significant presence of distinct brain correlates deputed to the analysis of facial expressions of emotion. It was observed that some cerebral circuits can be considered specific for emotional face comprehension as a function of conscious vs. unconscious processing of emotional information. Moreover, the emotional content of faces (i.e., positive vs. negative; more or less arousing) may have an effect in activating specific cortical networks. Among others, recent studies have explained the contribution of the hemispheres in comprehending faces, as a function of the type of emotion (mainly related to the positive vs. negative distinction) and of specific tasks (comprehending vs. producing facial expressions). Specifically, an overview of ERP (event-related potential) analyses is proposed in order to understand how a face may be processed by an observer and how he can make the face a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces when conscious vs. unconscious processing is activated.

  12. The Impact of Top-Down Prediction on Emotional Face Processing in Social Anxiety

    Directory of Open Access Journals (Sweden)

    Guangming Ran

    2017-07-01

    Full Text Available There is evidence that people with social anxiety show abnormal processing of emotional faces. To investigate the impact of top-down prediction on emotional face processing in social anxiety, brain responses of participants with high and low social anxiety (LSA) were recorded while they performed a variation of the emotional task, using high-temporal-resolution event-related potential techniques. Behaviorally, we found an effect of prediction, with higher accuracy for predictable than unpredictable faces. Furthermore, we found that participants with high social anxiety (HSA), but not those with LSA, recognized angry faces more accurately than happy faces. For the P100 and P200 components, HSA participants showed enhanced brain activity for angry faces compared to happy faces, suggesting hypervigilance to angry faces. Importantly, HSA participants exhibited larger N170 amplitudes at right-hemisphere electrodes than LSA participants when they observed unpredictable angry faces, but not when the angry faces were predictable. This probably reflects top-down prediction compensating for a deficiency in building a holistic face representation in HSA participants.

  13. Age-related differences in event-related potentials for early visual processing of emotional faces.

    Science.gov (United States)

    Hilimire, Matthew R; Mienaltowski, Andrew; Blanchard-Fields, Fredda; Corballis, Paul M

    2014-07-01

    With advancing age, processing resources are shifted away from negative emotional stimuli and toward positive ones. Here, we explored this 'positivity effect' using event-related potentials (ERPs). Participants identified the presence or absence of a visual probe that appeared over photographs of emotional faces. The ERPs elicited by the onsets of angry, sad, happy and neutral faces were recorded. We examined the frontocentral emotional positivity (FcEP), which is defined as a positive deflection in the waveforms elicited by emotional expressions relative to neutral faces early on in the time course of the ERP. The FcEP is thought to reflect enhanced early processing of emotional expressions. The results show that within the first 130 ms young adults show an FcEP to negative emotional expressions, whereas older adults show an FcEP to positive emotional expressions. These findings provide additional evidence that the age-related positivity effect in emotion processing can be traced to automatic processes that are evident very early in the processing of emotional facial expressions. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
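
    The FcEP described above is a difference measure: the mean amplitude in an early time window for emotional minus neutral faces. The computation can be sketched as follows (a minimal illustration, not the authors' code; the array shapes, the 0-130 ms default window, and all example values are assumptions):

```python
import numpy as np

def mean_amplitude(erp, times, window):
    """Mean ERP amplitude (e.g. in microvolts) inside a time window.

    erp    : 1-D array of voltages, one value per sample.
    times  : 1-D array of sample times in ms (same length as erp).
    window : (start_ms, end_ms), inclusive.
    """
    erp, times = np.asarray(erp, float), np.asarray(times, float)
    sel = (times >= window[0]) & (times <= window[1])
    return float(erp[sel].mean())

def fcep(emotional_erp, neutral_erp, times, window=(0, 130)):
    """Emotional-minus-neutral mean amplitude in an early window.

    A positive value indicates an FcEP-like effect; the 0-130 ms
    default merely follows the figure quoted in the abstract.
    """
    return (mean_amplitude(emotional_erp, times, window)
            - mean_amplitude(neutral_erp, times, window))
```

    In practice the two input waveforms would be condition averages over trials and participants; the sketch only shows the windowed difference itself.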

  14. A leftward bias however you look at it: Revisiting the emotional chimeric face task as a tool for measuring emotion lateralization.

    Science.gov (United States)

    R Innes, Bobby; Burt, D Michael; Birch, Yan K; Hausmann, Markus

    2015-12-28

    Left hemiface biases observed within the Emotional Chimeric Face Task (ECFT) support emotional face perception models whereby all expressions are preferentially processed by the right hemisphere. However, previous research using this task has not considered that the visible midline between hemifaces might engage atypical facial emotion processing strategies in upright or inverted conditions, nor controlled for left visual field (thus right hemispheric) visuospatial attention biases. This study used novel emotional chimeric faces (blended at the midline) to examine laterality biases for all basic emotions. Left hemiface biases were demonstrated across all emotional expressions and were reduced, but not reversed, for inverted faces. The ECFT bias in upright faces was significantly increased in participants with a large attention bias. These results support the theory that left hemiface biases reflect a genuine bias in emotional face processing, and this bias can interact with attention processes similarly localized in the right hemisphere.

  15. Music-Elicited Emotion Identification Using Optical Flow Analysis of Human Face

    Science.gov (United States)

    Kniaz, V. V.; Smirnova, Z. N.

    2015-05-01

    Human emotion identification from image sequences is in high demand nowadays. Possible applications range from the automatic smile-shutter function of consumer-grade digital cameras to Biofied Building technologies, which enable communication between a building space and its residents. The highly perceptual nature of human emotions makes their classification and identification complex; the main difficulty arises from the subjective quality of the emotional classification of the events that elicit those emotions. A variety of methods for the formal classification of emotions have been developed in music psychology. This work focuses on identifying human emotions evoked by musical pieces using human face tracking and optical flow analysis. A facial feature tracking algorithm used for estimating facial feature speed and position is presented. Facial features were extracted from each image sequence using human face tracking with local binary pattern (LBP) features. Accurate relative speeds of facial features were estimated using optical flow analysis. The obtained relative positions and speeds were used as the output facial emotion vector. The algorithm was tested using original software and recorded image sequences. The proposed technique proves to give robust identification of human emotions elicited by musical pieces. The estimated models could be used for human emotion identification from image sequences in fields such as emotion-based musical backgrounds or mood-dependent radio.
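
    The position-plus-speed emotion vector described above can be sketched in a few lines, assuming the facial features have already been tracked across frames (the LBP tracker, the optical flow estimation itself, and the downstream classifier are omitted; the descriptor layout below is an assumption for illustration, not the authors' exact method):

```python
import numpy as np

def emotion_feature_vector(tracks, fps):
    """Build a simple descriptor from tracked facial-feature coordinates.

    tracks : array of shape (n_frames, n_features, 2) holding the (x, y)
             position of each facial feature in each frame.
    fps    : frames per second of the source video.

    Returns the concatenation of mean feature positions and mean
    per-feature speeds (pixels/second), analogous to the paper's
    position-plus-speed emotion vector.
    """
    tracks = np.asarray(tracks, dtype=float)
    disp = np.diff(tracks, axis=0)              # frame-to-frame displacement
    speed = np.linalg.norm(disp, axis=2) * fps  # shape (n_frames-1, n_features)
    mean_pos = tracks.mean(axis=0).ravel()
    mean_speed = speed.mean(axis=0)
    return np.concatenate([mean_pos, mean_speed])
```

    A real pipeline would obtain `tracks` from an optical-flow tracker and feed the resulting vector to a classifier trained on emotion labels.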

  16. Cognitive Biases for Emotional Faces in High- and Low-Trait Depressive Participants

    Directory of Open Access Journals (Sweden)

    Yi-Hsing Hsieh

    2004-10-01

    Full Text Available This study examined the association between trait depression and information-processing biases. Thirty participants were divided into high- and low-trait depressive groups based on the median of their depressive subscale scores on the Basic Personality Inventory. Information-processing biases were measured using a deployment-of-attention task (DOAT) and a recognition memory task (RMT). For the DOAT, participants saw an emotional face paired with a neutral face of the same person, and then had to choose on which face a color patch had first appeared. The percentage of participants' choices favoring the happy, angry, or sad face represented the selective attentional bias score for each emotion, respectively. For the RMT, participants rated different types of emotional faces and subsequently discriminated old faces from new faces. The memory strength for each type of face was calculated from hit and false-positive rates, based on signal detection theory. Compared with the low-trait depressive group, the high-trait depressive group showed a negative cognitive style: enhanced recognition memory for sad faces and weakened inhibition of attending to sad faces, suggesting that those with a high depressive trait may be vulnerable to interpersonal withdrawal.
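
    Deriving memory strength from hit and false-positive rates follows standard signal detection theory, where sensitivity d' is the difference between the normal quantiles of the two rates. A minimal sketch (the clamping of extreme rates and the example values are common conventions assumed here, not the study's exact procedure):

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate, n=None):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    Rates of exactly 0 or 1 are clamped (by 1/(2n) when the number
    of trials n is given, else by 1e-4) so that the inverse normal
    CDF stays finite -- a common, but here assumed, correction.
    """
    eps = 1.0 / (2 * n) if n else 1e-4
    clamp = lambda p: min(max(p, eps), 1 - eps)
    z = NormalDist().inv_cdf
    return z(clamp(hit_rate)) - z(clamp(fa_rate))

# Hypothetical rates: more hits than false alarms gives positive d'.
print(round(d_prime(0.80, 0.20), 3))  # prints 1.683
```

    Higher d' for sad faces than for other expressions would correspond to the enhanced recognition memory for sad faces reported above.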

  17. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    Science.gov (United States)

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  18. Face-body integration of intense emotional expressions of victory and defeat.

    Directory of Open Access Journals (Sweden)

    Lili Wang

    Full Text Available Human facial expressions can be recognized rapidly and effortlessly. However, for intense emotions from real life, positive and negative facial expressions are difficult to discriminate, and the judgment of facial expressions is biased towards simultaneously perceived body expressions. This study employed event-related potentials (ERPs) to investigate the neural dynamics involved in the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds, with either matched or mismatched emotional expressions in faces and bodies. Behavioral results showed that congruent emotional information from face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent compared to congruent stimuli. A main effect of body valence on the P1 was also observed, with enhanced amplitudes for stimuli with losing compared to winning bodies. The main effect of body expression was also observed for the N170 and N2, with winning bodies producing larger N170/N2 amplitudes. In the later stage, a significant interaction of congruence by body valence was found on the P3 component: winning bodies elicited larger P3 amplitudes than losing bodies when face and body conveyed congruent emotional signals. Going beyond knowledge based on prototypical facial and body expressions, the results of this study help us understand the complexity of emotion evaluation and categorization outside the laboratory.

  19. The impact of emotional faces on social motivation in schizophrenia.

    Science.gov (United States)

    Radke, Sina; Pfersmann, Vera; Derntl, Birgit

    2015-10-01

    Impairments in emotion recognition and psychosocial functioning are a robust phenomenon in schizophrenia and may affect motivational behavior, particularly during socio-emotional interactions. To characterize potential deficits and their interplay, we assessed social motivation covering various facets, such as implicit and explicit approach-avoidance tendencies to facial expressions, in 27 patients with schizophrenia (SZP) and 27 matched healthy controls (HC). Moreover, emotion recognition abilities as well as self-reported behavioral activation and inhibition were evaluated. Compared to HC, SZP exhibited less pronounced approach-avoidance ratings to happy and angry expressions along with prolonged reactions during automatic approach-avoidance. Although deficits in emotion recognition were replicated, these were not associated with alterations in social motivation. Together with additional connections between psychopathology and several approach-avoidance processes, these results identify motivational impairments in SZP and suggest a complex relationship between different aspects of social motivation. In the context of specialized interventions aimed at improving social cognitive abilities in SZP, the link between such dynamic measures, motivational profiles and functional outcomes warrants further investigations, which can provide important leverage points for treatment. Crucially, our findings present first insights into the assessment and identification of target features of social motivation.

  20. Amygdala hypersensitivity in response to emotional faces in Tourette's patients.

    Science.gov (United States)

    Neuner, Irene; Kellermann, Thilo; Stöcker, Tony; Kircher, Tilo; Habel, Ute; Shah, Jon N; Schneider, Frank

    2010-10-01

    Tourette's syndrome is characterised by motor and vocal tics as well as a high level of impulsivity and emotional dysregulation. Neuroimaging studies point to structural changes in the basal ganglia, prefrontal cortex and parts of the limbic system. However, there is no established link between behavioural symptoms and structural changes in the amygdala. One aspect of daily social interaction is the perception of emotional facial expressions, which is closely linked to amygdala function. We therefore investigated via fMRI the implicit discrimination of six emotional facial expressions in 19 adult Tourette's patients. In comparison to a healthy control group, Tourette's patients showed significantly higher amygdala activation, especially pronounced for fearful, angry and neutral expressions. The BOLD activity of the left amygdala correlated negatively with the personality trait extraversion. We discuss these findings as a result of either deficient frontal inhibition due to structural changes or a desynchronization of the cortico-striato-thalamo-cortical network within structures of the limbic system. Our data show an altered pattern of implicit emotion discrimination and emphasize the need to consider both motor and non-motor symptoms of Tourette's syndrome in the choice of behavioural and pharmacological treatment.

  1. A new method for face detection in colour images for emotional bio-robots

    Institute of Scientific and Technical Information of China (English)

    Hapeshi, Kevin

    2010-01-01

    Emotional bio-robots have become a hot research topic in the last two decades. Although some progress has been made in the research, design and development of various emotional bio-robots, few of them can be used in practical applications. The study of emotional bio-robots demands multi-disciplinary co-operation: it involves computer science, artificial intelligence, 3D computation, engineering system modelling, analysis and simulation, bionics engineering, automatic control, image processing and pattern recognition, etc. Among these, face detection belongs to image processing and pattern recognition. An emotional robot must have the ability to recognize various objects; in particular, it is very important for a bio-robot to be able to recognize human faces in an image. In this paper, a face detection method is proposed for identifying any human faces in colour images using a human skin model and an eye detection method. First, the method detects skin regions in the input colour image after normalizing its luminance. Then, all face candidates are identified using an eye detection method. Compared with existing algorithms, this method relies only on the colour and geometrical data of the human face rather than on training datasets. Experimental results show that the method is effective and fast, and that it can be applied to the development of an emotional bio-robot with further improvements in speed and accuracy.
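
    The skin-region step of such a pipeline is commonly implemented with fixed chrominance thresholds in YCbCr space. A minimal sketch under that assumption (the paper does not give its exact skin model, so the widely used Cb/Cr ranges below are an assumption, not the authors' thresholds):

```python
import numpy as np

def skin_mask(rgb):
    """Boolean mask of likely skin pixels for an RGB uint8 image.

    Converts to YCbCr with the standard BT.601 coefficients and keeps
    pixels with 77 <= Cb <= 127 and 133 <= Cr <= 173 -- a common skin
    model, assumed here in place of the paper's unspecified one.
    """
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)
```

    The resulting mask would then be cleaned up morphologically and passed to the eye-detection stage to confirm face candidates.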

  2. Emotional facial expressions differentially influence predictions and performance for face recognition.

    Science.gov (United States)

    Nomi, Jason S; Rhodes, Matthew G; Cleary, Anne M

    2013-01-01

    This study examined how participants' predictions of future memory performance are influenced by emotional facial expressions. Participants made judgements of learning (JOLs) predicting the likelihood that they would correctly identify a face displaying a happy, angry, or neutral emotional expression in a future two-alternative forced-choice recognition test of identity (i.e., recognition that a person's face was seen before). JOLs were higher for studied faces with happy and angry emotional expressions than for neutral faces. However, neutral test faces with studied neutral expressions had significantly higher identity recognition rates than neutral test faces studied with happy or angry expressions. Thus, these data are the first to demonstrate that people believe happy and angry emotional expressions will lead to better identity recognition in the future relative to neutral expressions. This occurred despite the fact that neutral expressions elicited better identity recognition than happy and angry expressions. These findings contribute to the growing literature examining the interaction of cognition and emotion.

  3. 3D Face Model Dataset: Automatic Detection of Facial Expressions and Emotions for Educational Environments

    Science.gov (United States)

    Chickerur, Satyadhyan; Joshi, Kartik

    2015-01-01

    Emotion detection using facial images is a technique that researchers have been using for the last two decades to try to analyze a person's emotional state given his/her image. Detection of various kinds of emotion using facial expressions of students in educational environment is useful in providing insight into the effectiveness of tutoring…

  4. Infants' Temperament and Mothers', and Fathers' Depression Predict Infants' Attention to Objects Paired with Emotional Faces.

    Science.gov (United States)

    Aktar, Evin; Mandell, Dorothy J; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E J; Bögels, Susan M

    2016-07-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others' emotional reactions to those stimuli, so-called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested the effects of emotion and head/gaze direction on infants' attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations of unfamiliar objects alone, before and after being paired with emotional (happy, sad, fearful vs. neutral) faces gazing towards (vs. away from) the objects. Additionally, the associations of infants' temperament and parents' negative affect/depression/anxiety with infants' pupil responses were explored. Both mothers and fathers of participating infants completed questionnaires about their negative affect, depression and anxiety symptoms and their infants' negative temperament. Infants allocated more attention (larger pupils) to negative vs. neutral faces when the faces were presented alone, while they allocated less attention to objects paired with emotional vs. neutral faces, independent of head/gaze direction. Sad (but not fearful) temperament predicted more attention to emotional faces. Infants' sad temperament moderated the associations of mothers' depression (but not anxiety) with infants' attention to objects: maternal depression predicted more attention to objects paired with emotional expressions in infants low in sad temperament, but less attention in infants high in sad temperament. Fathers' depression (but not anxiety) predicted more attention to objects paired with emotional expressions, independent of infants' temperament. We conclude that infants' own temperamental disposition for sadness and their exposure to mothers' and fathers' depressed moods may influence infants' attention to emotion-object associations in social learning contexts.
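
    Pupillometric attention measures like those above are typically computed after subtractive baseline correction of each trial. A minimal sketch, assuming a conventional pre-stimulus baseline window (the study's exact preprocessing is not described here):

```python
import numpy as np

def baseline_corrected_pupil(trial, times, baseline=(-200, 0)):
    """Subtractive baseline correction for one pupil-diameter trace.

    trial    : 1-D array of pupil-diameter samples.
    times    : sample times in ms relative to stimulus onset.
    baseline : window whose mean is subtracted; the -200..0 ms
               window is a conventional choice, assumed here.

    Larger post-onset values then index greater dilation, i.e. the
    attention measure used in the abstract.
    """
    trial, times = np.asarray(trial, float), np.asarray(times, float)
    sel = (times >= baseline[0]) & (times <= baseline[1])
    return trial - trial[sel].mean()
```

    Condition means of such corrected traces (e.g. emotional vs. neutral faces) are then compared across infants.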

  5. Cognitive emotion regulation in children: Reappraisal of emotional faces modulates neural source activity in a frontoparietal network.

    Science.gov (United States)

    Wessing, Ida; Rehbein, Maimu A; Romer, Georg; Achtergarde, Sandra; Dobel, Christian; Zwitserlood, Pienie; Fürniss, Tilman; Junghöfer, Markus

    2015-06-01

    Emotion regulation has an important role in child development and psychopathology. Reappraisal as a cognitive regulation technique can be used effectively by children. Moreover, an ERP component known to reflect emotional processing, the late positive potential (LPP), can be modulated by children using reappraisal, and this modulation is also related to children's emotional adjustment. The present study seeks to elucidate the neural generators of such LPP effects. To this end, children aged 8-14 years reappraised emotional faces, while neural activity in an LPP time window was estimated using magnetoencephalography-based source localization. Additionally, neural activity was correlated with two indexes of emotional adjustment and age. Reappraisal reduced activity in the left dorsolateral prefrontal cortex during down-regulation and enhanced activity in the right parietal cortex during up-regulation. Activity in the visual cortex decreased with increasing age, more adaptive emotion regulation and less anxiety. Results demonstrate that reappraisal changed activity within a frontoparietal network in children. The decrease in visual cortex activity with increasing age is suggested to reflect neural maturation. A similar decrease with more adaptive emotion regulation and less anxiety implies that better emotional adjustment may be associated with more advanced neural maturation. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Interdependent mechanisms for processing gender and emotion:The special status of angry male faces

    Directory of Open Access Journals (Sweden)

    Daniel A Harris

    2016-07-01

    Full Text Available While some models of how the various attributes of a face are processed have posited that face features, invariant physical cues such as gender or ethnicity as well as variant social cues such as emotion, may be processed independently (e.g., Bruce & Young, 1986), other models suggest a more distributed representation and interdependent processing (e.g., Haxby, Hoffman, & Gobbini, 2000). Here we use a contingent adaptation paradigm to investigate whether the mechanisms for processing the gender and emotion of a face are interdependent and symmetric across the happy-angry emotional continuum, regardless of the gender of the face. We simultaneously adapted participants to angry female faces and happy male faces (Experiment 1) or to happy female faces and angry male faces (Experiment 2). In Experiment 1 we found evidence for contingent adaptation, with simultaneous aftereffects in opposite directions: male faces were biased towards angry while female faces were biased towards happy. Interestingly, in the complementary Experiment 2 we did not find evidence for contingent adaptation; both male and female faces were biased towards angry. Our results highlight that evidence for contingent adaptation, and the underlying interdependent face-processing mechanisms that would allow for it, may only emerge for certain combinations of face features. Such limits may be especially important in the case of social cues, given how maladaptive it may be to stop responding to threatening information, with angry male faces considered the most threatening. The underlying neuronal mechanisms that could account for such asymmetric contingent adaptation effects remain to be elucidated.

  7. Human sex differences in emotional processing of own-race and other-race faces.

    Science.gov (United States)

    Ran, Guangming; Chen, Xu; Pan, Yangu

    2014-06-18

    There is evidence that women and men show differences in the perception of affective facial expressions. However, none of the previous studies directly investigated sex differences in the emotional processing of own-race and other-race faces. The current study addressed this issue using high-time-resolution event-related potential techniques. In total, data from 25 participants (13 women and 12 men) were analyzed. Women showed increased N170 amplitudes to negative White faces compared with negative Chinese faces over right-hemisphere electrodes. This result suggests that women show enhanced sensitivity to other-race faces displaying negative emotions (fear or disgust), which may have evolutionary significance. The current data also showed that men had increased N170 amplitudes to happy Chinese versus happy White faces over left-hemisphere electrodes, indicating that men show enhanced sensitivity to own-race faces displaying positive emotions (happiness). In this respect, men might use past pleasant emotional experiences to boost recognition of own-race faces.

  8. ERP Correlates of Target-Distracter Differentiation in Repeated Runs of a Continuous Recognition Task with Emotional and Neutral Faces

    Science.gov (United States)

    Treese, Anne-Cecile; Johansson, Mikael; Lindgren, Magnus

    2010-01-01

    The emotional salience of faces has previously been shown to induce memory distortions in recognition memory tasks. This event-related potential (ERP) study used repeated runs of a continuous recognition task with emotional and neutral faces to investigate emotion-induced memory distortions. In the second and third runs, participants made more…

  9. Visual attention to emotional face in schizophrenia: an eye tracking study.

    Directory of Open Access Journals (Sweden)

    Mania Asgharpour

    2015-03-01

    Full Text Available Deficits in the processing of facial emotions have been reported extensively in patients with schizophrenia. To explore whether restricted attention is the cause of impaired emotion processing in these patients, we examined visual attention by tracking eye movements in response to emotional and neutral face stimuli in a group of patients with schizophrenia and healthy individuals. We also examined the correlation between visual attention allocation and symptom severity in the patient group. Thirty adult patients with schizophrenia and 30 matched healthy controls participated in this study. Visual attention data were recorded while participants passively viewed emotional-neutral face pairs for 500 ms. The relationship between visual attention and symptom severity was assessed with the Positive and Negative Syndrome Scale (PANSS) in the schizophrenia group, and repeated-measures ANOVAs were used to compare the groups. Comparing the number of fixations made during face-pair presentation, we found that patients with schizophrenia made fewer fixations on faces, regardless of the expression of the face. Analysis of the number of fixations on negative-neutral pairs also revealed that the patients made fewer fixations on both neutral and negative faces. Analysis of the number of fixations on positive-neutral pairs showed more fixations on positive relative to neutral expressions in both groups. We found no correlations between the pattern of visual attention to faces and symptom severity in the patients. The results of this study suggest that the facial recognition deficit in schizophrenia is related to decreased attention to face stimuli. The absence of a group difference in visual attention for positive-neutral face pairs is in line with studies that have shown an increased ability to perceive positive emotion in these patients.

  10. Is emotion recognition the only problem in ADHD? effects of pharmacotherapy on face and emotion recognition in children with ADHD.

    Science.gov (United States)

    Demirci, Esra; Erdogan, Ayten

    2016-12-01

    The objectives of this study were to evaluate both face and emotion recognition, to detect differences among attention deficit and hyperactivity disorder (ADHD) subgroups, to identify effects of gender, and to assess the effects of methylphenidate and atomoxetine treatment on both face and emotion recognition in patients with ADHD. The study sample consisted of 41 male and 29 female patients, 8-15 years of age, who were diagnosed as having combined type ADHD (N = 26), hyperactive/impulsive type ADHD (N = 21) or inattentive type ADHD (N = 23) but had not previously used any medication for ADHD, and 35 male and 25 female healthy individuals. Long-acting methylphenidate (OROS-MPH) was prescribed to 38 patients, whereas atomoxetine was prescribed to 32 patients. The Reading the Mind in the Eyes Test (RMET) and the Benton Face Recognition Test (BFRT) were administered to all participants before and after treatment. The patients with ADHD had a significantly lower number of correct answers on the child and adolescent RMET and on the BFRT than the healthy controls. Among the ADHD subtypes, the hyperactive/impulsive subtype had fewer correct answers on the RMET than the inattentive subtype, and fewer correct answers on the short and long forms of the BFRT than the combined and inattentive subtypes. Male and female patients with ADHD did not differ significantly with respect to the number of correct answers on the RMET and BFRT. The patients showed significant improvement on the RMET and BFRT after treatment with OROS-MPH or atomoxetine. Patients with ADHD have difficulties in face recognition as well as emotion recognition, and both OROS-MPH and atomoxetine affect emotion recognition. However, further studies on face and emotion recognition in ADHD are needed.

  11. Face and emotion expression processing and the serotonin transporter polymorphism 5-HTTLPR/rs25531.

    Science.gov (United States)

    Hildebrandt, A; Kiy, A; Reuter, M; Sommer, W; Wilhelm, O

    2016-06-01

    Face cognition, including face identity and facial expression processing, is a crucial component of the socio-emotional abilities that characterize humans as highly developed social beings. However, for these trait domains, molecular genetic studies investigating gene-behavior associations based on well-founded phenotype definitions are still rare. We examined the relationship between the 5-HTTLPR/rs25531 polymorphisms, which are related to serotonin reuptake, and the ability to perceive and recognize faces and emotional expressions in human faces. To this end, we conducted structural equation modeling on data from 230 young adults, obtained using a comprehensive multivariate battery of maximal-effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphisms with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception: carriers of two long (L) alleles outperformed carriers of one or two short (S) alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating the discriminant validity of the relationships. We discuss the implications and possible neural mechanisms underlying these novel findings. © 2016 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  12. Putting the face in context: Body expressions impact facial emotion processing in human infants

    Directory of Open Access Journals (Sweden)

    Purva Rajhans

    2016-06-01

    Full Text Available Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs. We primed infants with body postures (fearful, happy that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.


  14. Reaction times and face discrimination with emotional content

    Directory of Open Access Journals (Sweden)

    Ana María Martínez

    2002-07-01

    Full Text Available Sixty-two university students, in two groups with a mean age of 21.6 years for the women and 22 for the men, took part in a study of visual reaction times (RTs) to faces with emotional content, taking into account the position of the stimulus (start, middle, end), the emotional content (neutral, friendly, threatening), and the combinations of stimuli. The women showed longer RTs than the men in all experimental conditions. It was also observed, for both men and women, that RTs were longer when the stimulus to be discriminated was located in the middle position.

  15. Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions.

    Science.gov (United States)

    Oberman, Lindsay M; Winkielman, Piotr; Ramachandran, Vilayanur S

    2007-01-01

    People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations: (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated the assessed muscles, whereas the chew manipulation activated muscles only intermittently. Further, expressing happiness generated the most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.

  16. Psilocybin modulates functional connectivity of the amygdala during emotional face discrimination.

    Science.gov (United States)

    Grimm, O; Kraehenmann, R; Preller, K H; Seifritz, E; Vollenweider, F X

    2018-04-24

    Recent studies suggest that the antidepressant effects of the psychedelic 5-HT2A receptor agonist psilocybin are mediated through its modulatory properties on prefrontal and limbic brain regions including the amygdala. To further investigate the effects of psilocybin on emotion processing networks, we studied, for the first time, psilocybin's acute effects on amygdala seed-to-voxel connectivity in an event-related face discrimination task in 18 healthy volunteers who received psilocybin and placebo in a double-blind balanced cross-over design. The amygdala has been implicated as a salience detector especially involved in the immediate response to emotional face content. We used beta-series amygdala seed-to-voxel connectivity during an emotional face discrimination task to elucidate the connectivity pattern of the amygdala over the entire brain. When we compared psilocybin to placebo, an increase in reaction time for all three categories of affective stimuli was found. Psilocybin decreased the connectivity between amygdala and the striatum during angry face discrimination. During happy face discrimination, the connectivity between the amygdala and the frontal pole was decreased. No effect was seen during discrimination of fearful faces. Thus, we show psilocybin's effect as a modulator of major connectivity hubs of the amygdala. Psilocybin decreases the connectivity between important nodes linked to emotion processing like the frontal pole or the striatum. Future studies are needed to clarify whether connectivity changes predict therapeutic effects in psychiatric patients. Copyright © 2018 Elsevier B.V. and ECNP. All rights reserved.

  17. Psilocybin with psychological support improves emotional face recognition in treatment-resistant depression.

    Science.gov (United States)

    Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L

    2018-02-01

    Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study is to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later, after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions compared with controls; following psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.

  18. The not face: A grammaticalization of facial expressions of emotion.

    Science.gov (United States)

    Benitez-Quiroz, C Fabian; Wilbur, Ronnie B; Martinez, Aleix M

    2016-05-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3-8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    Science.gov (United States)

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on the moderate emotions; to date, few studies have been conducted to investigate the explicit and implicit processes of peak emotions. In the current study, we used transiently peak intense expression images of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and the face-body compounds, and eye-tracking movement was recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous, and the body cues influenced facial emotion recognition. Furthermore, eye movement records showed that the participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, the subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious emotion perception of peak facial expressions. The results showed that winning face prime facilitated reaction to winning body target, whereas losing face prime inhibited reaction to winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, revised subliminal affective priming task and a strict awareness test were used to examine the validity of unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction time to both winning body targets and losing body targets was influenced by the invisibly peak facial expression primes, which indicated the

  20. The recognition of emotional expression in prosopagnosia: decoding whole and part faces.

    Science.gov (United States)

    Stephan, Blossom Christa Maree; Breen, Nora; Caine, Diana

    2006-11-01

    Prosopagnosia is currently viewed within the constraints of two competing theories of face recognition, one highlighting the analysis of features, the other focusing on configural processing of the whole face. This study investigated the role of feature analysis versus whole-face configural processing in the recognition of facial expression. A prosopagnosic patient, SC, made expression decisions from whole and incomplete (eyes-only and mouth-only) faces in which the remaining features had been obscured. SC was impaired at recognizing some (e.g., anger, sadness, and fear), but not all (e.g., happiness), emotional expressions from the whole face. Analyses of his performance on incomplete faces indicated that his recognition of some expressions actually improved relative to his performance in the whole-face condition. We argue that in SC, interference from damaged configural processes seems to override an intact ability to utilize part-based or local feature cues.

  1. Implicit conditioning of faces via the social regulation of emotion: ERP evidence of early attentional biases for security conditioned faces.

    Science.gov (United States)

    Beckes, Lane; Coan, James A; Morris, James P

    2013-08-01

    Not much is known about the neural and psychological processes that promote the initial conditions necessary for positive social bonding. This study explores one method of conditioned bonding utilizing dynamics related to the social regulation of emotion and attachment theory. This form of conditioning involves repeated presentations of negative stimuli followed by images of warm, smiling faces. L. Beckes, J. Simpson, and A. Erickson (2010) found that this conditioning procedure results in positive associations with the faces measured via a lexical decision task, suggesting they are perceived as comforting. This study found that the P1 ERP was similarly modified by this conditioning procedure and the P1 amplitude predicted lexical decision times to insecure words primed by the faces. The findings have implications for understanding how the brain detects supportive people, the flexibility and modifiability of early ERP components, and social bonding more broadly. Copyright © 2013 Society for Psychophysiological Research.

  2. Cultural in-group advantage: emotion recognition in African American and European American faces and voices.

    Science.gov (United States)

    Wickline, Virginia B; Bailey, Wendy; Nowicki, Stephen

    2009-03-01

    The authors explored whether there were in-group advantages in emotion recognition of faces and voices by culture or geographic region. Participants were 72 African American students (33 men, 39 women), 102 European American students (30 men, 72 women), 30 African international students (16 men, 14 women), and 30 European international students (15 men, 15 women). The participants determined emotions in African American and European American faces and voices. Results showed an in-group advantage (sometimes by culture, less often by race) in recognizing facial and vocal emotional expressions. African international students were generally less accurate at interpreting American nonverbal stimuli than were European American, African American, and European international peers. Results suggest that, although partly universal, emotional expressions have subtle differences across cultures that persons must learn.

  3. Face puzzle—two new video-based tasks for measuring explicit and implicit aspects of facial emotion recognition

    Science.gov (United States)

    Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel

    2013-01-01

    Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures were further in favor of the tasks' external validity. Between group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive

  4. Emotional memory and perception of emotional faces in patients suffering from depersonalization disorder.

    NARCIS (Netherlands)

    Montagne, B.; Sierra, M.; Medford, N.; Hunter, E.; Baker, D.J.; Kessels, R.P.C.; Haan, E.H.F. de; David, A.S.

    2007-01-01

    Previous work has shown that patients with depersonalization disorder (DPD) have reduced physiological responses to emotional stimuli, which may be related to subjective emotional numbing. This study investigated two aspects of affective processing in 13 patients with DPD according to the DSM-IV

  5. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    OpenAIRE

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybr...

  6. Avoidant decision making in social anxiety: the interaction of angry faces and emotional responses

    Science.gov (United States)

    Pittig, Andre; Pawlikowski, Mirko; Craske, Michelle G.; Alpers, Georg W.

    2014-01-01

    Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants had to find out which choices were (dis)advantageous to maximize overall gain. To create a decision conflict between approach of reward and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision-making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety. PMID:25324792

  9. Cross-modal perception (face and voice in emotions. ERPs and behavioural measures

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2007-04-01

    Full Text Available Emotion decoding constitutes a case of multimodal processing of cues from multiple channels. Previous behavioural and neuropsychological studies indicated that, when we have to decode emotions on the basis of multiple perceptive information, cross-modal integration takes place. The present study investigates the simultaneous processing of emotional tone of voice and emotional facial expression by event-related potentials (ERPs), across a broad range of different emotions (happiness, sadness, fear, anger, surprise, and disgust). Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual patterns (emotional facial expressions) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Subjects (N=30) were required to process the stimuli and to indicate their comprehension (by stimpad). ERP variations and behavioural data (response times, RTs) were submitted to repeated-measures analysis of variance (ANOVA). We considered two time intervals (150-250 and 250-350 ms post-stimulus) in order to explore the ERP variations. ANOVA showed two different ERP effects, a negative deflection (N2), more anterior-distributed (Fz), and a positive deflection (P2), more posterior-distributed, with different cognitive functions. In the first case, N2 may be considered a marker of the emotional content (sensitive to type of emotion), whereas P2 may represent a cross-modal integration marker, varying as a function of the congruous/incongruous condition and showing a higher peak for congruous stimuli than for incongruous stimuli. Finally, an RT reduction was found for some emotion types in the congruous condition (i.e. sadness), and an inverted effect for other emotions (i.e. fear, anger, and surprise).

  10. Emotional face processing and flat affect in schizophrenia: functional and structural neural correlates.

    Science.gov (United States)

    Lepage, M; Sergerie, K; Benoit, A; Czechowska, Y; Dickie, E; Armony, J L

    2011-09-01

    There is a general consensus in the literature that schizophrenia causes difficulties with facial emotion perception and discrimination. Functional brain imaging studies have observed reduced limbic activity during facial emotion perception but few studies have examined the relation to flat affect severity. A total of 26 people with schizophrenia and 26 healthy controls took part in this event-related functional magnetic resonance imaging study. Sad, happy and neutral faces were presented in a pseudo-random order and participants indicated the gender of the face presented. Manual segmentation of the amygdala was performed on a structural T1 image. Both the schizophrenia group and the healthy control group rated the emotional valence of facial expressions similarly. Both groups exhibited increased brain activity during the perception of emotional faces relative to neutral ones in multiple brain regions, including multiple prefrontal regions bilaterally, the right amygdala, right cingulate cortex and cuneus. Group comparisons, however, revealed increased activity in the healthy group in the anterior cingulate, right parahippocampal gyrus and multiple visual areas. In schizophrenia, the severity of flat affect correlated significantly with neural activity in several brain areas including the amygdala and parahippocampal region bilaterally. These results suggest that many of the brain regions involved in emotional face perception, including the amygdala, are equally recruited in both schizophrenia and controls, but flat affect can also moderate activity in some other brain regions, notably in the left amygdala and parahippocampal gyrus bilaterally. There were no significant group differences in the volume of the amygdala.

  11. Sad benefit in face working memory: an emotional bias of melancholic depression.

    Science.gov (United States)

    Linden, Stefanie C; Jackson, Margaret C; Subramanian, Leena; Healy, David; Linden, David E J

    2011-12-01

    Emotion biases feature prominently in cognitive theories of depression and are a focus of psychological interventions. However, there is presently no stable neurocognitive marker of altered emotion-cognition interactions in depression. One reason may be the heterogeneity of major depressive disorder. Our aim in the present study was to find an emotional bias that differentiates patients with melancholic depression from controls, and patients with melancholic from those with non-melancholic depression. We used a working memory paradigm for emotional faces, where two faces with angry, happy, neutral, sad or fearful expression had to be retained over one second. Twenty patients with melancholic depression, 20 age-, education- and gender-matched control participants and 20 patients with non-melancholic depression participated in the study. We analysed performance on the working memory task using signal detection measures. We found an interaction between group and emotion on working memory performance that was driven by the higher performance for sad faces compared to other categories in the melancholic group. We computed a measure of "sad benefit", which distinguished melancholic and non-melancholic patients with good sensitivity and specificity. However, replication studies and formal discriminant analysis will be needed in order to assess whether emotion bias in working memory may become a useful diagnostic tool to distinguish these two syndromes. Copyright © 2011 Elsevier B.V. All rights reserved.
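    The signal detection analysis mentioned above separates sensitivity from response bias per emotion category. As a minimal sketch (not the authors' analysis code; the log-linear correction used here is just one common convention), the sensitivity index d' can be computed from per-category hit and false-alarm counts like this:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (adding 0.5 to each cell) keeps the
    z-transform finite when a raw rate would be 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)
```

    For example, `d_prime(18, 2, 2, 18)` gives roughly 2.4 (good discrimination), while equal hit and false-alarm counts give 0. A matching criterion measure, c = -(z(HR) + z(FAR))/2, would index response bias in the same framework.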

  12. Virtual faces expressing emotions: an initial concomitant and construct validity study.

    Science.gov (United States)

    Joyal, Christian C; Jacob, Laurence; Cigna, Marie-Hélène; Guay, Jean-Pierre; Renaud, Patrice

    2014-01-01

    Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities, both for fundamental and clinical research. For instance, virtual faces allow real-time human-computer feedback loops between physiological measures and the virtual agent. The goal of this study was to initially assess the concomitant and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomatic major and corrugator supercilii muscles), and regional gaze fixation latencies (eyes and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Emotions expressed by each set of stimuli were similarly recognized, by both men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times on eye regions in male and female participants. Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-computer interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

  13. Desired emotional states: their nature, causes, and implications for emotion regulation.

    Science.gov (United States)

    Tamir, Maya; Gutentag, Tony

    2017-10-01

    Emotion regulation is a process directed toward achieving desired emotions. People want to experience different emotions at different times and for different reasons, leading them to change emotions accordingly. Research on desired emotions has made several discoveries. First, what people want to feel varies across individuals and across situations. Second, what people want to feel depends on how much they value emotions and on the extent to which they expect emotions to yield behavioral, social, or epistemic benefits. Third, what people want to feel sets the direction of emotion regulation and can shape emotional experiences and subsequent behavior. Identifying and understanding desired emotional states can promote healthier emotion regulation and emotional experiences, and more adaptive personal and social functioning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Abnormal early gamma responses to emotional faces differentiate unipolar from bipolar disorder patients.

    Science.gov (United States)

    Liu, T Y; Chen, Y S; Su, T P; Hsieh, J C; Chen, L F

    2014-01-01

    This study investigates the cortical abnormalities of early emotion perception in patients with major depressive disorder (MDD) and bipolar disorder (BD) using gamma oscillations. Twenty-three MDD patients, twenty-five BD patients, and twenty-four normal controls were enrolled and their event-related magnetoencephalographic responses were recorded during implicit emotional tasks. Our results demonstrated abnormal gamma activity within 100 ms in the emotion-related regions (amygdala, orbitofrontal (OFC) cortex, anterior insula (AI), and superior temporal pole) in the MDD patients, suggesting that these patients may have dysfunctions or negativity biases in perceptual binding of emotional features at very early stage. Decreased left superior medial frontal cortex (smFC) responses to happy faces in the MDD patients were correlated with their serious level of depression symptoms, indicating that decreased smFC activity perhaps underlies irregular positive emotion processing in depressed patients. In the BD patients, we showed abnormal activation in visual regions (inferior/middle occipital and middle temporal cortices) which responded to emotional faces within 100 ms, supporting that the BD patients may hyperactively respond to emotional features in perceptual binding. The discriminant function of gamma activation in the left smFC, right medial OFC, right AI/inferior OFC, and the right precentral cortex accurately classified 89.6% of patients as unipolar/bipolar disorders.

  15. The Discrete Emotions Questionnaire: A New Tool for Measuring State Self-Reported Emotions.

    Science.gov (United States)

    Harmon-Jones, Cindy; Bastian, Brock; Harmon-Jones, Eddie

    2016-01-01

    Several discrete emotions have broad theoretical and empirical importance, as shown by converging evidence from diverse areas of psychology, including facial displays, developmental behaviors, and neuroscience. However, the measurement of these states has not progressed along with theory, such that when researchers measure subjectively experienced emotions, they commonly rely on scales assessing broad dimensions of affect (positivity and negativity), rather than discrete emotions. The current manuscript presents four studies that validate a new instrument, the Discrete Emotions Questionnaire (DEQ), that is sensitive to eight distinct state emotions: anger, disgust, fear, anxiety, sadness, happiness, relaxation, and desire. Emotion theory supporting the importance of distinguishing these specific emotions is reviewed.

  16. Artificial emotional model based on finite state machine

    Institute of Scientific and Technical Information of China (English)

    MENG Qing-mei; WU Wei-guo

    2008-01-01

    According to basic emotion theory, an artificial emotional model based on the finite state machine (FSM) was presented. In the FSM model of emotion, the emotional space includes the basic emotional space and multiple emotional spaces. The emotion-switching diagram was defined, and the transition function was developed using a Markov chain and a linear interpolation algorithm. The simulation model was built using the Stateflow and Simulink toolboxes on the Matlab platform, and it comprises three subsystems: input, emotion, and behavior. In the emotional subsystem, the responses of different personalities to external stimuli were described by defining a personal space. This model takes states from an emotional space and updates its state depending on its current state and the state of its input (also a state-emotion). The simulation model realizes the process of switching emotion from the neutral state to the other basic emotions. The simulation results are shown to correspond to the emotion-switching patterns of human beings.
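A minimal sketch of such a machine, updating an emotion state from the current state and a stimulus via a Markov transition row. The states, stimuli, and probabilities below are invented for illustration; the paper's actual emotional spaces and Markov parameters are not reproduced here:

```python
import random

# Toy emotion finite state machine in the spirit of the model above:
# the next state is drawn from a Markov transition row conditioned on
# the current state and the stimulus. All probabilities are illustrative.

STATES = ["neutral", "happy", "sad", "angry"]

# TRANSITION[stimulus][current_state] -> {next_state: probability}
TRANSITION = {
    "praise": {
        "neutral": {"happy": 0.8, "neutral": 0.2},
        "sad":     {"neutral": 0.6, "sad": 0.4},
    },
    "insult": {
        "neutral": {"angry": 0.7, "sad": 0.3},
        "happy":   {"neutral": 0.5, "angry": 0.5},
    },
}

class EmotionFSM:
    def __init__(self, state="neutral", rng=None):
        self.state = state
        self.rng = rng or random.Random(0)

    def step(self, stimulus):
        row = TRANSITION.get(stimulus, {}).get(self.state)
        if row is None:          # no rule defined: emotion is unchanged
            return self.state
        r, acc = self.rng.random(), 0.0
        for nxt, p in row.items():
            acc += p
            if r < acc:
                self.state = nxt
                break
        else:
            self.state = nxt     # float-rounding guard: take the last option
        return self.state

fsm = EmotionFSM()
print(fsm.step("praise"))  # stochastic next state from "neutral" under praise
```

The personality dimension of the original model could be added by making the transition rows themselves a function of personality parameters.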

  17. Are patients with schizophrenia impaired in processing non-emotional features of human faces?

    Directory of Open Access Journals (Sweden)

    Hayley Darke

    2013-08-01

    It is known that individuals with schizophrenia exhibit signs of impaired face processing; however, the exact perceptual and cognitive mechanisms underlying these deficits have yet to be elucidated. One possible source of confusion in the current literature is the methodological and conceptual inconsistencies that can arise from the varied treatment of different aspects of face processing relating to emotional and non-emotional aspects of face perception. This review aims to disentangle the literature by focusing on the performance of patients with schizophrenia in a range of tasks that require processing of non-emotional features of face stimuli (e.g. identity or gender). We also consider the performance of patients on non-face stimuli that share common elements such as familiarity (e.g. cars) and social relevance (e.g. gait). We conclude by exploring whether the observed deficits are best considered face-specific, and note that further investigation is required to properly assess the potential contribution of more generalised attentional or perceptual impairments.

  18. Emotional face recognition deficit in amnestic patients with mild cognitive impairment: behavioral and electrophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yang L

    2015-08-01

    Linlin Yang, Xiaochuan Zhao, Lan Wang, Lulu Yu, Mei Song, Xueyi Wang Department of Mental Health, The First Hospital of Hebei Medical University, Hebei Medical University Institute of Mental Health, Shijiazhuang, People’s Republic of China Abstract: Amnestic mild cognitive impairment (MCI) has been conceptualized as a transitional stage between healthy aging and Alzheimer’s disease. Thus, understanding emotional face recognition deficits in patients with amnestic MCI could be useful in determining the progression of amnestic MCI. The purpose of this study was to investigate the features of emotional face processing in amnestic MCI by using event-related potentials (ERPs). Patients with amnestic MCI and healthy controls performed a face recognition task, giving old/new responses to previously studied and novel faces with different emotional messages as the stimulus material. Using the learning-recognition paradigm, the experiments were divided into two steps, i.e., a learning phase and a test phase. ERPs were analyzed from the electroencephalographic recordings. The behavioral data indicated high emotion classification accuracy for patients with amnestic MCI and for healthy controls: the mean percentage of correct classifications was 81.19% for patients with amnestic MCI and 96.46% for controls. Our ERP data suggest that patients with amnestic MCI were still able to undertake personalizing processing for negative faces, but not for neutral or positive faces, in the early frontal processing stage. In the early time window, no differences in the frontal old/new effect were found between patients with amnestic MCI and normal controls. However, in the late time window, the three types of stimuli did not elicit any parietal old/new effects in patients with amnestic MCI, suggesting that their recollection was impaired. This impairment may be closely associated with amnestic MCI disease. We conclude from our data that face recognition processing and emotional memory is

  19. Pretreatment Differences in BOLD Response to Emotional Faces Correlate with Antidepressant Response to Scopolamine.

    Science.gov (United States)

    Furey, Maura L; Drevets, Wayne C; Szczepanik, Joanna; Khanna, Ashish; Nugent, Allison; Zarate, Carlos A

    2015-03-28

    Faster-acting antidepressants and biomarkers that predict treatment response are needed to facilitate the development of more effective treatments for patients with major depressive disorder. Here, we evaluate implicitly and explicitly processed emotional faces using neuroimaging to identify potential biomarkers of treatment response to the antimuscarinic agent scopolamine. Healthy participants (n=15) and unmedicated depressed patients with major depressive disorder (n=16) participated in a double-blind, placebo-controlled crossover infusion study using scopolamine (4 μg/kg). Before and following scopolamine, blood oxygen-level dependent (BOLD) signal was measured using functional MRI during a selective attention task. Two stimuli comprised of superimposed pictures of faces and houses were presented. Participants attended to one stimulus component and performed a matching task. Face emotion was modulated (happy/sad), creating implicit (attend-houses) and explicit (attend-faces) emotion processing conditions. The pretreatment difference in BOLD response to happy and sad faces under implicit and explicit conditions (emotion processing biases) within a priori regions of interest was correlated with subsequent treatment response in major depressive disorder. Correlations were observed exclusively during implicit emotion processing in the regions of interest, which included the subgenual anterior cingulate. Thus, differential responses to emotional faces prior to treatment reflect the potential to respond to scopolamine. These findings replicate earlier results, highlighting the potential for pretreatment neural activity in the middle occipital cortices and subgenual anterior cingulate to inform us about the potential to respond clinically to scopolamine. Published by Oxford University Press on behalf of CINP 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  20. Emotional Faces in Context: Age Differences in Recognition Accuracy and Scanning Patterns

    Science.gov (United States)

    Noh, Soo Rim; Isaacowitz, Derek M.

    2014-01-01

    While age-related declines in facial expression recognition are well documented, previous research relied mostly on isolated faces devoid of context. We investigated the effects of context on age differences in the recognition of facial emotions and in visual scanning patterns of emotional faces. While their eye movements were monitored, younger and older participants viewed facial expressions (i.e., anger, disgust) in contexts that were emotionally congruent, incongruent, or neutral with respect to the facial expression to be identified. Both age groups had the highest recognition rates for facial expressions in the congruent context, followed by the neutral context, with the worst recognition rates in the incongruent context. These context effects were more pronounced for older adults. Compared to younger adults, older adults exhibited a greater benefit from congruent contextual information, regardless of facial expression. Context also influenced the pattern of visual scanning of emotional faces in a similar manner across age groups. In addition, older adults initially attended more to context overall. Our data highlight the importance of considering the role of context in understanding emotion recognition in adulthood. PMID:23163713

  1. Image-based Analysis of Emotional Facial Expressions in Full Face Transplants.

    Science.gov (United States)

    Bedeloglu, Merve; Topcu, Çagdas; Akgul, Arzu; Döger, Ela Naz; Sever, Refik; Ozkan, Ozlenen; Ozkan, Omer; Uysal, Hilmi; Polat, Ovunc; Çolak, Omer Halil

    2018-01-20

    In this study, we aimed to determine, from photographs, the degree of development in the emotional expressions of full face transplant patients, so that a rehabilitation process can be planned accordingly in later work. As envisaged, in full face transplant cases the determination of expressions can be confused or may not be achievable as in the healthy control group. To perform the image-based analysis, a control group consisting of 9 healthy males and 2 full-face transplant patients participated in the study. The appearance-based Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP) methods were adopted for recognizing the neutral and 6 emotional expressions: angry, scared, happy, hate, confused and sad. Feature extraction was carried out using both methods and their serial combination. For the performed expressions, the extracted features of the most distinctive zones of the facial area, the eye and mouth regions, were used to classify the emotions; the combination of these region features was also used to improve classifier performance. Control subjects' and transplant patients' ability to perform emotional expressions was determined with a K-nearest neighbor (KNN) classifier with region-specific and method-specific decision stages, and the results were compared with the healthy group. It was observed that transplant patients do not reproduce some emotional expressions, and that there were confusions among expressions.
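One of the two feature extractors named above, the Local Binary Pattern, is simple enough to sketch: each interior pixel is encoded by thresholding its 8 neighbours against the centre value. The tiny image below is invented; a real pipeline would histogram these codes over the eye and mouth regions and feed the histograms to the KNN classifier:

```python
# Minimal LBP sketch. The 3x3 "image" is illustrative only.

def lbp_code(img, y, x):
    """8-bit LBP code for the pixel at (y, x), clockwise from top-left."""
    c = img[y][x]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy][x + dx] >= c:   # neighbour >= centre sets the bit
            code |= 1 << bit
    return code

image = [
    [10, 20, 30],
    [40, 50, 60],
    [70, 80, 90],
]
print(lbp_code(image, 1, 1))  # → 120
```

The GWT half of the pipeline (a bank of Gabor filter responses) is omitted here for brevity.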

  2. Emotional Face Identification in Youths with Primary Bipolar Disorder or Primary Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Seymour, Karen E.; Pescosolido, Matthew F.; Reidy, Brooke L.; Galvan, Thania; Kim, Kerri L.; Young, Matthew; Dickstein, Daniel P.

    2013-01-01

    Objective: Bipolar disorder (BD) and attention-deficit/hyperactivity disorder (ADHD) are often comorbid or confounded; therefore, we evaluated emotional face identification to better understand brain/behavior interactions in children and adolescents with either primary BD, primary ADHD, or typically developing controls (TDC). Method: Participants…

  3. Gender differences in the recognition of emotional faces: are men less efficient?

    Directory of Open Access Journals (Sweden)

    Ana Ruiz-Ibáñez

    2017-06-01

    As research on the recollection of stimuli with emotional valence indicates, emotions influence memory. Many studies of face and emotional facial expression recognition have focused on age-related (young and old people) and gender-related (men and women) differences. Nevertheless, these studies have produced contradictory results, so gender involvement needs to be studied in greater depth. The main objective of our research was to analyze differences in the recognition of images of faces with emotional facial expressions between two groups of university students aged 18-30, the first composed of men and the second of women. The results showed statistically significant differences in corrected face recognition (hit rate minus false alarm rate): women demonstrated better recognition than men. However, other variables analyzed, such as time or efficiency, did not provide conclusive results. Furthermore, a significant negative correlation between the time taken and efficiency on the task was found in the male group. This information reinforces not only the hypothesis of a gender difference in face recognition, in favor of women, but also those hypotheses that suggest different cognitive processing of facial stimuli in the two sexes. Finally, we argue the need for further research on variables such as age or sociocultural level.
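The corrected-recognition score used in this kind of study (hit rate minus false-alarm rate) is a one-line computation; the trial counts below are invented for illustration:

```python
# Corrected recognition = hit rate - false-alarm rate.
# All counts are illustrative, not the study's data.

def corrected_recognition(hits, misses, false_alarms, correct_rejections):
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return hit_rate - fa_rate

# e.g. 40 studied faces (32 hits, 8 misses), 40 new faces (6 false alarms)
print(round(corrected_recognition(32, 8, 6, 34), 2))  # → 0.65
```

Subtracting the false-alarm rate discounts a liberal response bias: a participant who calls every face "old" scores 0, not 1.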

  4. Amygdala Hyperactivation During Face Emotion Processing in Unaffected Youth at Risk for Bipolar Disorder

    Science.gov (United States)

    Olsavsky, Aviva K.; Brotman, Melissa A.; Rutenberg, Julia G.; Muhrer, Eli J.; Deveney, Christen M.; Fromm, Stephen J.; Towbin, Kenneth; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Objective: Youth at familial risk for bipolar disorder (BD) show deficits in face emotion processing, but the neural correlates of these deficits have not been examined. This preliminary study tests the hypothesis that, relative to healthy comparison (HC) subjects, both BD subjects and youth at risk for BD (i.e., those with a first-degree BD…

  5. Neural activation to emotional faces in adolescents with autism spectrum disorders.

    Science.gov (United States)

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S

    2011-03-01

    Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and striatum, three structures involved in socio-emotional processing, in adolescents with ASD. Twenty-two adolescents with ASD and 20 healthy adolescents viewed facial expressions (happy, fearful, sad and neutral) that were briefly presented (250 ms) during functional MRI acquisition. To monitor attention, subjects pressed a button to identify the gender of each face. The ASD group showed greater activation to the faces relative to the control group in the amygdala, vPFC and striatum. Follow-up analyses indicated that the ASD group, relative to the control group, showed greater activation in the amygdala, vPFC and striatum during the gender identification task. When group differences in attention to facial expressions were limited, adolescents with ASD showed greater activation in structures involved in socio-emotional processing. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  6. Ratings of Emotion in Laterally Presented Faces: Sex and handedness effects

    NARCIS (Netherlands)

    van Strien, J.W.; van Beek, S.

    2000-01-01

    Sixteen right-handed participants (8 male and 8 female students) and 16 left-handed participants (8 male and 8 female students) were presented with cartoon faces expressing emotions ranging from extremely positive to extremely negative. A forced-choice paradigm was used in which the participants

  7. The state of the heart: Emotional labor as emotion regulation reviewed and revised.

    Science.gov (United States)

    Grandey, Alicia A; Melloy, Robert C

    2017-07-01

    Emotional labor has been an area of burgeoning research interest in occupational health psychology in recent years. Emotional labor was conceptualized in the early 1980s by the sociologist Arlie Hochschild (1983) as occupational requirements that alienate workers from their emotions. Almost two decades later, a model was published in the Journal of Occupational Health Psychology (JOHP) that viewed emotional labor through a psychological lens, as emotion regulation strategies that differentially relate to performance and well-being. For this anniversary issue of JOHP, we review the emotional-labor-as-emotion-regulation model, its contributions, its limitations, and the state of the evidence for its propositions. At the heart of our article, we present a revised model of emotional labor as emotion regulation that incorporates recent findings and represents the multilevel and dynamic nature of emotional labor as emotion regulation. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Increased amygdala responses to emotional faces after psilocybin for treatment-resistant depression.

    Science.gov (United States)

    Roseman, Leor; Demetriou, Lysia; Wall, Matthew B; Nutt, David J; Carhart-Harris, Robin L

    2017-12-27

    Recent evidence indicates that psilocybin with psychological support may be effective for treating depression. Some studies have found that patients with depression show heightened amygdala responses to fearful faces, and there is reliable evidence that treatment with SSRIs attenuates amygdala responses (Ma, 2015). We hypothesised that amygdala responses to emotional faces would be altered post-treatment with psilocybin. In this open-label study, 20 individuals diagnosed with moderate to severe, treatment-resistant depression underwent two separate dosing sessions with psilocybin. Psychological support was provided before, during and after these sessions, and 19 participants completed fMRI scans one week prior to the first session and one day after the second and last session. Neutral, fearful and happy faces were presented in the scanner and analyses focused on the amygdala. Group results revealed rapid and enduring improvements in depressive symptoms post psilocybin. Increased responses to fearful and happy faces were observed in the right amygdala post-treatment, and right amygdala increases to fearful versus neutral faces were predictive of clinical improvements at 1 week. Psilocybin with psychological support was associated with increased amygdala responses to emotional stimuli, an effect opposite to previous findings with SSRIs. This suggests fundamental differences in these treatments' therapeutic actions, with SSRIs mitigating negative emotions and psilocybin allowing patients to confront and work through them. Based on the present results, we propose that psilocybin with psychological support is a treatment approach that potentially revives emotional responsiveness in depression, enabling patients to reconnect with their emotions. ISRCTN, number ISRCTN14426797. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Facial Expression Aftereffect Revealed by Adaption to Emotion-Invisible Dynamic Bubbled Faces

    Science.gov (United States)

    Luo, Chengwen; Wang, Qingyun; Schyns, Philippe G.; Kingdom, Frederick A. A.; Xu, Hong

    2015-01-01

    Visual adaptation is a powerful tool to probe the short-term plasticity of the visual system. Adapting to local features such as oriented lines can distort our judgment of subsequently presented lines, the tilt aftereffect. The tilt aftereffect is believed to be processed at a low level of the visual cortex, such as V1. Adaptation to faces, on the other hand, can produce significant aftereffects in high-level traits such as identity, expression, and ethnicity. However, whether face adaptation necessitates awareness of face features is debatable. In the current study, we investigated whether facial expression aftereffects (FEAE) can be generated by partially visible faces. We first generated partially visible faces using the bubbles technique, in which the face was seen through randomly positioned circular apertures, and selected the bubbled faces for which the subjects were unable to identify happy or sad expressions. When the subjects adapted to static displays of these partial faces, no significant FEAE was found. However, when the subjects adapted to a dynamic video display of a series of different partial faces, a significant FEAE was observed. In both conditions, subjects could not identify the facial expression in the individual adapting faces. These results suggest that our visual system is able to integrate unrecognizable partial faces over a short period of time and that the integrated percept affects our judgment of subsequently presented faces. We conclude that the FEAE can be generated by partial faces with few facial expression cues, implying that our cognitive system fills in the missing parts during adaptation, or that subcortical structures are activated by the bubbled faces without conscious recognition of emotion during adaptation. PMID:26717572

  10. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Glerup, L; Vestbo, C

    2015-01-01

    BACKGROUND: Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression. METHOD: Thirty... while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping strategies. RESULTS: High-risk twins showed increased neural response to happy and fearful faces... processing. These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attention vigilance for fearful faces and were slower at recognizing facial expressions relative to low...

  11. The right place at the right time: priming facial expressions with emotional face components in developmental visual agnosia.

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-04-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. An exploration of emotional protection and regulation in nurse-patient interactions: The role of the professional face and the emotional mirror.

    Science.gov (United States)

    Cecil, Penelope; Glass, Nel

    2015-01-01

    While the interpersonal style of nurse-patient communication has become more relaxed in recent years, nurses remain challenged by emotional engagement with patients and other health professionals. In order to preserve a professional distance in patient care delivery, however slight, nurses need to be able to regulate their emotions. This research aimed to investigate nurses' perceptions of emotional protection and regulation in patient care delivery. A qualitative approach was used, utilising in-depth semi-structured interviews and researcher reflective journaling. Participants were drawn from rural New South Wales. Following institutional ethics approval, 5 nurses were interviewed and reflective journaling commenced. The interviews and the reflective journal were transcribed verbatim. The results revealed that nurses' emotional regulation, demonstrated by a 'professional face', was an important strategy enabling the delivery of quality care, even though it resulted in emotional containment. Such regulation was a protective mechanism employed to look after the self and was critical in situations of emotional dissonance. The results also showed that nurses experience emotional dissonance in situations where they have unresolved personal emotional issues, and that the latter was an individual motivator to manage emotions in the workplace. Emotions play a pivotal role within nurse-patient relationships. The professional face can be recognised as contributing to emotional health, and therefore to maintaining the emotional health of nurses in practice. This study foregrounds the importance of regulating emotions and nurturing nurses' emotional health in contemporary practice.

  13. Facing Complaining Customer and Suppressed Emotion at Worksite Related to Sleep Disturbance in Korea.

    Science.gov (United States)

    Lim, Sung Shil; Lee, Wanhyung; Hong, Kwanyoung; Jeung, Dayee; Chang, Sei Jin; Yoon, Jin Ha

    2016-11-01

    This study aimed to investigate the effects of facing complaining customers and suppressing emotions at the worksite on sleep disturbance in the working population. We enrolled 13,066 paid workers (male = 6,839, female = 6,227, age …) from the Working Condition Survey (2011). Odds ratios (OR) and 95% confidence intervals (CI) for the occurrence of sleep disturbance were calculated using multiple logistic regression models. Workers in environments where they always engaged complaining customers had a significantly higher risk of sleep disturbance than the 'rarely' group (OR [95% CI]: 5.46 [3.43-8.68] in male and 5.59 [3.30-9.46] in female workers). The OR (95% CI) for sleep disturbance was 1.78 (1.16-2.73) and 1.63 (1.02-2.63) for the male and female groups, respectively, who always suppressed their emotions at the workplace, compared with the 'rarely' groups. Compared with those who both rarely engaged complaining customers and rarely suppressed their emotions at work, the OR (CI) for sleep disturbance was 9.66 (4.34-20.80) for men and 10.17 (4.46-22.07) for women always exposed to both factors. Sleep disturbance was affected by the interaction of both emotional demands (engaging complaining customers and suppressing emotions at the workplace). The level of emotional demand, including engaging complaining customers and suppressing emotions at the workplace, is significantly associated with sleep disturbance in the Korean working population.
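Odds ratios with Woolf (log-based) 95% confidence intervals of the kind reported above can be sketched from a 2x2 exposure/outcome table. The counts below are invented, and the study itself used multiple logistic regression with covariate adjustment rather than this simple crude calculation:

```python
import math

# Crude odds ratio from a 2x2 table with Woolf's 95% CI.
# a, b = exposed with/without outcome; c, d = unexposed with/without.
# Counts are invented for illustration.

def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(60, 140, 30, 370)
print(f"OR = {or_:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

A CI whose lower bound exceeds 1.0, as in the study's reported estimates, indicates a statistically significant elevation in risk at the 5% level.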

  14. Training Approach-Avoidance of Smiling Faces Affects Emotional Vulnerability in Socially Anxious Individuals

    Directory of Open Access Journals (Sweden)

    Mike Rinck

    2013-08-01

    Previous research revealed an automatic behavioral bias in high socially anxious individuals (HSAs): although their explicit evaluations of smiling faces are positive, they show automatic avoidance of these faces, reflected in faster pushing than pulling of smiling faces in an Approach-Avoidance Task (AAT; Heuer, Rinck, & Becker, 2007). The current study addressed the causal role of this avoidance bias in social anxiety. To this end, we used the AAT to train HSAs either to approach smiling faces or to avoid them. We examined whether such AAT training could change HSAs' automatic avoidance tendencies and, if so, whether the AAT effects would generalize to a new approach task with new facial stimuli, and to mood and anxiety in a social threat situation (a video-recorded self-presentation). We found that HSAs trained to approach smiling faces did indeed approach female faces faster after the training than HSAs trained to avoid smiling faces. Moreover, approach-faces training reduced emotional vulnerability: it led to more positive mood and lower anxiety after the self-presentation than avoid-faces training. These results suggest that automatic approach-avoidance tendencies play a causal role in social anxiety, and that they can be modified by a simple computerized training. This may open new avenues in the therapy of social phobia.
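The avoidance bias described above (faster pushing than pulling of smiling faces) is commonly summarized as a median reaction-time difference per stimulus type. The sketch below assumes that convention; the reaction times are invented for illustration:

```python
from statistics import median

# AAT bias score sketch: median push RT minus median pull RT (ms).
# Negative values mean pushing is faster, i.e. an avoidance tendency,
# as reported for HSAs with smiling faces. RTs are invented.

def aat_bias(push_rts, pull_rts):
    return median(push_rts) - median(pull_rts)

push = [612, 598, 640, 605, 620]   # pushing smiling faces away
pull = [655, 670, 648, 662, 680]   # pulling smiling faces closer
print(aat_bias(push, pull))        # → -50
```

Medians rather than means are typically used to blunt the influence of outlier reaction times.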

  15. Training approach-avoidance of smiling faces affects emotional vulnerability in socially anxious individuals

    Science.gov (United States)

    Rinck, Mike; Telli, Sibel; Kampmann, Isabel L.; Woud, Marcella L.; Kerstholt, Merel; te Velthuis, Sarai; Wittkowski, Matthias; Becker, Eni S.

    2013-01-01

    Previous research revealed an automatic behavioral bias in high socially anxious individuals (HSAs): although their explicit evaluations of smiling faces are positive, they show automatic avoidance of these faces. This is reflected by faster pushing than pulling of smiling faces in an Approach-Avoidance Task (AAT; Heuer et al., 2007). The current study addressed the causal role of this avoidance bias for social anxiety. To this end, we used the AAT to train HSAs, either to approach smiling faces or to avoid them. We examined whether such an AAT training could change HSAs' automatic avoidance tendencies, and if yes, whether AAT effects would generalize to a new approach task with new facial stimuli, and to mood and anxiety in a social threat situation (a video-recorded self-presentation). We found that HSAs trained to approach smiling faces did indeed approach female faces faster after the training than HSAs trained to avoid smiling faces. Moreover, approach-faces training reduced emotional vulnerability: it led to more positive mood and lower anxiety after the self-presentation than avoid-faces training. These results suggest that automatic approach-avoidance tendencies have a causal role in social anxiety, and that they can be modified by a simple computerized training. This may open new avenues in the therapy of social phobia. PMID:23970862

  16. Stress Exposure, Food Intake, and Emotional State

    Science.gov (United States)

    Ulrich-Lai, Yvonne M.; Fulton, Stephanie; Wilson, Mark; Petrovich, Gorica; Rinaman, Linda

    2016-01-01

    This manuscript summarizes the proceedings of the symposium entitled, “Stress, Palatable Food and Reward”, that was chaired by Drs. Linda Rinaman and Yvonne Ulrich-Lai at the 2014 Neurobiology of Stress Workshop held in Cincinnati, OH. This symposium comprised research presentations by four neuroscientists whose work focuses on the biological bases for complex interactions among stress, food intake and emotion. First, Dr. Ulrich-Lai describes her rodent research exploring mechanisms by which the rewarding properties of sweet palatable foods confer stress relief. Second, Dr. Stephanie Fulton discusses her work in which excessive, long-term intake of dietary lipids, as well as their subsequent withdrawal, promotes stress-related outcomes in mice. Third, Dr. Mark Wilson describes his group’s research examining the effects of social hierarchy-related stress on food intake and diet choice in group-housed female rhesus macaques, and compared the data from monkeys to results obtained in analogous work using rodents. Lastly, Dr. Gorica Petrovich discusses her research program that is aimed at defining cortical–amygdalar–hypothalamic circuitry responsible for curbing food intake during emotional threat (i.e., fear anticipation) in rats. Their collective results reveal the complexity of physiological and behavioral interactions that link stress, food intake and emotional state, and suggest new avenues of research to probe the impact of genetic, metabolic, social, experiential, and environmental factors. PMID:26303312

  17. Stress exposure, food intake and emotional state.

    Science.gov (United States)

    Ulrich-Lai, Yvonne M; Fulton, Stephanie; Wilson, Mark; Petrovich, Gorica; Rinaman, Linda

    2015-01-01

    This manuscript summarizes the proceedings of the symposium entitled, "Stress, Palatable Food and Reward", that was chaired by Drs. Linda Rinaman and Yvonne Ulrich-Lai at the 2014 Neurobiology of Stress Workshop held in Cincinnati, OH. This symposium comprised research presentations by four neuroscientists whose work focuses on the biological bases for complex interactions among stress, food intake and emotion. First, Dr Ulrich-Lai describes her rodent research exploring mechanisms by which the rewarding properties of sweet palatable foods confer stress relief. Second, Dr Stephanie Fulton discusses her work in which excessive, long-term intake of dietary lipids, as well as their subsequent withdrawal, promotes stress-related outcomes in mice. Third, Dr Mark Wilson describes his group's research examining the effects of social hierarchy-related stress on food intake and diet choice in group-housed female rhesus macaques, and compares the data from monkeys with results obtained in analogous work using rodents. Finally, Dr Gorica Petrovich discusses her research program that is aimed at defining cortical-amygdalar-hypothalamic circuitry responsible for curbing food intake during emotional threat (i.e. fear anticipation) in rats. Their collective results reveal the complexity of physiological and behavioral interactions that link stress, food intake and emotional state, and suggest new avenues of research to probe the impact of genetic, metabolic, social, experiential and environmental factors on these interactions.

  18. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces

    Directory of Open Access Journals (Sweden)

    Arash eJavanbakht

    2015-06-01

    Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. Fifty-two subjects from a longitudinal prospective study, recruited as children, participated in a brain imaging study at 23-25 years of age using the Emotional Faces Assessment Task (EFAT). Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and mPFC responses to threat vs. happy faces. Childhood poverty was also associated with decreased functional connectivity between the left amygdala and mPFC. This study is unique in prospectively linking childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues.

  19. A face to remember: emotional expression modulates prefrontal activity during memory formation.

    Science.gov (United States)

    Sergerie, Karine; Lepage, Martin; Armony, Jorge L

    2005-01-15

    Emotion can exert a modulatory role on episodic memory. Several studies have shown that negative stimuli (e.g., words, pictures) are better remembered than neutral ones. Although facial expressions are powerful emotional stimuli and have been shown to influence perception and attention processes, little is known about their effect on memory. We used functional magnetic resonance imaging (fMRI) in humans to investigate the effects of expression (happy, neutral, and fearful) on prefrontal cortex (PFC) activity during the encoding of faces, using a subsequent memory effect paradigm. Our results show that activity in the right PFC predicted memory for faces regardless of expression, while a homotopic region in the left hemisphere was associated with successful encoding only for faces with an emotional expression. These findings are consistent with the proposed role of the right dorsolateral PFC (DLPFC) in successful encoding of nonverbal material, but also suggest that the left DLPFC may be a site where integration of memory and emotional processes occurs. This study sheds new light on the current controversy regarding the hemispheric lateralization of the PFC in memory encoding.

  20. Detection of emotional faces: salient physical features guide effective visual search.

    Science.gov (United States)

    Calvo, Manuel G; Nummenmaa, Lauri

    2008-08-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features, especially the smiling mouth, is responsible for facilitated initial orienting, which thus shortens detection.

  1. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Directory of Open Access Journals (Sweden)

    Teresa A Victor

    Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants, unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  2. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Science.gov (United States)

    Victor, Teresa A; Furey, Maura L; Fromm, Stephen J; Bellgowan, Patrick S F; Öhman, Arne; Drevets, Wayne C

    2012-01-01

    Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. Unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  3. Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?

    Science.gov (United States)

    Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K

    2017-12-01

    Disgust is a core emotion evolved to detect and avoid the ingestion of poisonous food as well as contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or whether individual differences in trait body odor disgust may influence the processing of disgust-related information. In this preregistered study, we tested whether a classification of dynamic facial expressions as happy or disgusted, and an emotional evaluation of these facial expressions, would be affected by individual differences in body odor disgust sensitivity, and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect recognition of emotion in dynamic faces even when body odor disgust sensitivity was used as a moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs for faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect, and suggest that further studies should focus on reproducibility, specifying the experimental circumstances under which odor effects on facial expressions are present versus absent.

  4. KDEF-PT: Valence, Emotional Intensity, Familiarity and Attractiveness Ratings of Angry, Neutral, and Happy Faces.

    Science.gov (United States)

    Garrido, Margarida V; Prada, Marília

    2017-01-01

    The Karolinska Directed Emotional Faces (KDEF) is one of the most widely used human facial expression databases. Almost a decade after the original validation study (Goeleven et al., 2008), we present subjective rating norms for a sub-set of 210 pictures which depict 70 models (half female), each displaying an angry, happy, and neutral facial expression. Our main goals were to provide an additional and updated validation of this database, using a sample from a different nationality ( N = 155 Portuguese students, M = 23.73 years old, SD = 7.24), and to extend the number of subjective dimensions used to evaluate each image. Specifically, participants reported emotional labeling (forced-choice task) and evaluated the emotional intensity and valence of the expression, as well as the attractiveness and familiarity of the model (7-point rating scales). Overall, results show that happy faces obtained the highest ratings across evaluative dimensions and emotion labeling accuracy. Female (vs. male) models were perceived as more attractive, familiar and positive. The sex of the model also moderated the accuracy of emotional labeling and ratings of different facial expressions. Each picture of the set was categorized as low, moderate, or high for each dimension. Normative data for each stimulus (hit proportions, means, standard deviations, and confidence intervals per evaluative dimension) are available as supplementary material (available at https://osf.io/fvc4m/).

  5. A facial expression of pax: Assessing children's "recognition" of emotion from faces.

    Science.gov (United States)

    Nelson, Nicole L; Russell, James A

    2016-01-01

    In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks, choice from array and free labeling, to support the conclusion that children recognize basic emotions from facial expressions. Here, five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition when a novel nonsense facial expression is included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination: they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge.

  6. KDEF-PT: Valence, Emotional Intensity, Familiarity and Attractiveness Ratings of Angry, Neutral, and Happy Faces

    Directory of Open Access Journals (Sweden)

    Margarida V. Garrido

    2017-12-01

    The Karolinska Directed Emotional Faces (KDEF) is one of the most widely used human facial expression databases. Almost a decade after the original validation study (Goeleven et al., 2008), we present subjective rating norms for a sub-set of 210 pictures which depict 70 models (half female), each displaying an angry, happy, and neutral facial expression. Our main goals were to provide an additional and updated validation of this database, using a sample from a different nationality (N = 155 Portuguese students, M = 23.73 years old, SD = 7.24), and to extend the number of subjective dimensions used to evaluate each image. Specifically, participants reported emotional labeling (forced-choice task) and evaluated the emotional intensity and valence of the expression, as well as the attractiveness and familiarity of the model (7-point rating scales). Overall, results show that happy faces obtained the highest ratings across evaluative dimensions and emotion labeling accuracy. Female (vs. male) models were perceived as more attractive, familiar and positive. The sex of the model also moderated the accuracy of emotional labeling and ratings of different facial expressions. Each picture of the set was categorized as low, moderate, or high for each dimension. Normative data for each stimulus (hit proportions, means, standard deviations, and confidence intervals per evaluative dimension) are available as supplementary material (available at https://osf.io/fvc4m/).

  7. The Right Place at the Right Time: Priming Facial Expressions with Emotional Face Components in Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-01-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG’s impairment, we sought to shed light on his emotion perception by examining whether priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full face and then categorized the face’s emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In Experiment 1, when the full faces were briefly presented (150 ms), LG’s performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in Experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of diagnostic matching was diminished, as was the detrimental effect of non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. PMID:22349446

  8. Dynamic Influence of Emotional States on Novel Word Learning

    Science.gov (United States)

    Guo, Jingjing; Zou, Tiantian; Peng, Danling

    2018-01-01

    Many researchers realize that it is unrealistic to isolate language learning and processing from emotions. However, few studies on language learning have taken emotions into consideration, so the likely influence of emotions on language learning remains unclear. The current study therefore aimed to examine the effects of emotional states on novel word learning and how those effects change as learning continues and tasks vary. Positive, negative or neutral pictures were employed to induce a given emotional state, and then participants learned the novel words through association with line-drawing pictures in four successive learning phases. At the end of each learning phase, participants were instructed to fulfill a semantic category judgment task (in Experiment 1) or a word-picture semantic consistency judgment task (in Experiment 2) to explore the effects of emotional states on different depths of word learning. Converging results demonstrated that a negative emotional state led to worse performance compared with the neutral condition; however, how a positive emotional state affected learning varied with the learning task. Specifically, a facilitative role of positive emotional state was observed in semantic category learning but disappeared in word-specific meaning learning. Moreover, the emotional modulation of novel word learning changed dynamically as learning continued, and the final attainment of the learned words tended to be similar under different emotional states. The findings suggest that the impact of emotion can be offset as novel words become increasingly familiar and part of the existing lexicon. PMID:29695994

  9. When does subliminal affective image priming influence the ability of schizophrenic patients to perceive face emotions?

    Science.gov (United States)

    Vaina, Lucia Maria; Rana, Kunjan D; Cotos, Ionela; Li-Yang, Chen; Huang, Melissa A; Podea, Delia

    2014-12-24

    Deficits in face emotion perception are among the most pervasive impairments in schizophrenia and strongly affect interpersonal communication and social skills. Schizophrenic patients (PSZ) and healthy control subjects (HCS) performed two psychophysical tasks. One, the SAFFIMAP test, was designed to determine the impact of subliminally presented affective or neutral images on the accuracy of face-expression (angry or neutral) perception. In the second test, FEP, subjects saw pictures of face expressions and were asked to rate them as angry, happy, or neutral. The following clinical scales were used to determine the acute symptoms in PSZ: Positive and Negative Syndrome (PANSS), Young Mania Rating (YMRS), Hamilton Depression (HAM-D), and Hamilton Anxiety (HAM-A). On the SAFFIMAP test, unlike the HCS group, the PSZ group tended to categorize the neutral expression of test faces as angry, and their response to the test-face expression was not influenced by the affective content of the primes. In PSZ, the PANSS-positive score was significantly correlated with correct perception of angry faces for aggressive or pleasant primes. YMRS scores were strongly correlated with PSZ's tendency to recognize angry face expressions when the prime was a pleasant or a neutral image. The HAM-D score was positively correlated with categorizing the test faces as neutral, regardless of the affective content of the prime or of the test-face expression (angry or neutral). Despite its exploratory nature, this study provides the first evidence that conscious perception and categorization of facial emotions (neutral or angry) in PSZ is directly affected by the positive or negative symptoms of the disease as defined by individual scores on the clinical diagnostic scales.

  10. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces.

    Science.gov (United States)

    Guan, Lili; Zhao, Yufang; Wang, Yige; Chen, Yujie; Yang, Juan

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another's face; self-face also elicits an enhanced P3 amplitude compared to another's face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  11. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    Directory of Open Access Journals (Sweden)

    Lili Guan

    2017-08-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  12. Interaction between emotional state and learning underlies mood instability

    OpenAIRE

    Eldar, Eran; Niv, Yael

    2015-01-01

    Intuitively, good and bad outcomes affect our emotional state, but whether the emotional state feeds back onto the perception of outcomes remains unknown. Here, we use behaviour and functional neuroimaging of human participants to investigate this bidirectional interaction, by comparing the evaluation of slot machines played before and after an emotion-impacting wheel-of-fortune draw. Results indicate that self-reported mood instability is associated with a positive-feedback effect of emotion...

  13. Word wins over Face: Emotional Stroop effect activates the frontal cortical network

    Directory of Open Access Journals (Sweden)

    Shima Ovaysikia

    2011-01-01

    The prefrontal cortex (PFC) has been implicated in higher order cognitive control of behaviour. Sometimes such control is executed through suppression of an unwanted response in order to avoid conflict. Conflict occurs when two simultaneously competing processes lead to different behavioral outcomes, as seen in tasks such as the anti-saccade, go/no-go and Stroop tasks. We set out to examine whether different types of stimuli in a modified emotional Stroop task would cause interference effects similar to those of the original colour/word Stroop, and whether the required suppression mechanism(s) would recruit similar regions of the medial PFC (mPFC). By using emotional words and emotional faces in this Stroop experiment, we examined the two well-learned automatic behaviours of word reading and recognition of face expressions. In our emotional Stroop paradigm, words were processed faster than face expressions, with incongruent trials yielding longer reaction times (RTs) and a larger number of errors compared to congruent trials. This novel Stroop effect activated the anterior and inferior regions of the mPFC, namely the anterior cingulate cortex (ACC) and inferior frontal gyrus (IFG), as well as the superior frontal gyrus. Our results suggest that prepotent behaviours such as reading and recognition of face expressions are stimulus-dependent and perhaps hierarchical, hence recruiting distinct regions of the mPFC. Moreover, the faster processing of word reading compared to reporting face expressions is indicative of the formation of stronger stimulus-response (SR) associations for an over-learned behaviour compared to an instinctive one, which could alternatively be explained through the distinction between awareness and selective attention.

  14. Daily Emotional Labor, Negative Affect State, and Emotional Exhaustion: Cross-Level Moderators of Affective Commitment

    Directory of Open Access Journals (Sweden)

    Hyewon Kong

    2018-06-01

    Employees’ emotional-labor strategies, experienced affects, and emotional exhaustion in the workplace may vary over time within individuals, even within the same day. However, previous studies have not highlighted the dynamic properties of these relationships. In addition, although the effects of surface and deep acting on emotional exhaustion have been investigated in emotional-labor research, empirical studies on these relationships still report mixed results. Thus, we suggest that moderators may affect the relationship between emotional labor and emotional exhaustion. This study examines the relationship between emotional labor and emotional exhaustion within individuals using repeated measurements, verifies the mediating effect of a negative affect state, and confirms the moderating effect of affective commitment on the relationship between emotional labor and emotional exhaustion. Data were collected from tellers who had a high degree of interaction with clients at banks based in South Korea. A total of 56 tellers participated in the survey and responded for five working days, yielding 616 data entries. We used a hierarchical linear model (HLM) to examine our hypotheses. The results showed that surface-acting emotional labor increases emotional exhaustion, and that this relationship is mediated by a negative affect state within individuals. In addition, this study verified that affective commitment buffers the negative effects of surface-acting emotional labor on emotional exhaustion. These results suggest that emotional labor is a dynamic process within individuals, and that the emotional exhaustion it causes differs among individuals, depending on factors such as the individual’s level of affective commitment.

  15. A note on age differences in mood-congruent versus mood-incongruent emotion processing in faces

    Directory of Open Access Journals (Sweden)

    Manuel C. Voelkle

    2014-06-01

    This article addresses four interrelated research questions: (1) Does experienced mood affect emotion perception in faces, and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20–31 years; middle-aged: 44–55 years; older adults: 70–81 years) were asked to provide multidimensional emotion ratings of a total of 1,026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle, Ebner, Lindenberger, & Riediger, 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when being in a positive mood and less likely to do so when being in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  16. A note on age differences in mood-congruent vs. mood-incongruent emotion processing in faces.

    Science.gov (United States)

    Voelkle, Manuel C; Ebner, Natalie C; Lindenberger, Ulman; Riediger, Michaela

    2014-01-01

(1) Does experienced mood affect emotion perception in faces, and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20-31 years; middle-aged: 44-55 years; older adults: 70-81 years) were asked to provide multidimensional emotion ratings of a total of 1026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle et al., 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when being in a positive mood and less likely to do so when being in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  17. Gender differences in human single neuron responses to male emotional faces.

    Science.gov (United States)

    Newhoff, Morgan; Treiman, David M; Smith, Kris A; Steinmetz, Peter N

    2015-01-01

Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 (6 male) epileptic patients in four brain areas: amygdala (236 neurons), hippocampus (n = 270), anterior cingulate cortex (n = 256), and ventromedial prefrontal cortex (n = 174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n = 15∕66) of neurons in men were significantly affected by facial emotion, vs. 8% (n = 6∕76) of neurons in women. A Fisher's exact test comparing the two proportions found a highly significant difference, demonstrating differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.
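The comparison of responsive-neuron proportions reported above (15 of 66 neurons in men vs. 6 of 76 in women) reduces to a Fisher's exact test on a 2x2 contingency table. The sketch below reproduces that comparison with SciPy; it is an illustration of the statistic, not the authors' analysis code.

```python
from scipy.stats import fisher_exact

# 2x2 contingency table of left-amygdala neurons:
# rows = gender, columns = (emotion-responsive, non-responsive)
men = (15, 66 - 15)    # 23% of 66 neurons responsive
women = (6, 76 - 6)    # 8% of 76 neurons responsive

odds_ratio, p_value = fisher_exact([men, women], alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```

The sample odds ratio here is (15 x 70) / (51 x 6), roughly 3.4, and the two-sided p-value falls below the conventional 0.05 threshold, matching the "highly significant" difference the abstract describes.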

  18. Functional Brain Activation to Emotional and non-Emotional Faces in Healthy Children: Evidence for Developmentally Undifferentiated Amygdala Function During the School Age Period

    Science.gov (United States)

    Pagliaccio, David; Luby, Joan L.; Gaffrey, Michael S.; Belden, Andrew C.; Botteron, Kelly N.; Harms, Michael P.; Barch, Deanna M.

    2013-01-01

The amygdala is a key region in emotion processing. In particular, fMRI studies have demonstrated that the amygdala is active during the viewing of emotional faces. Previous research has consistently found greater amygdala responses to fearful faces as compared to neutral faces in adults, convergent with a focus in the animal literature on the amygdala's role in fear processing. Studies have found that the amygdala also responds differentially to other facial emotion types in adults. Yet, the literature regarding when this differential amygdala responsivity develops is limited and mixed. Thus, the goal of the current study was to examine amygdala responses to emotional and neutral faces in a relatively large sample of healthy school age children (N = 52). While the amygdala was active in response to emotional and neutral faces, the results do not support the hypothesis that the amygdala responds differentially to emotional faces in 7- to 12-year-old children. Nonetheless, amygdala activity was correlated with the severity of subclinical depression symptoms and emotional regulation skills. Additionally, sex differences were observed in frontal, temporal, and visual regions as well as effects of pubertal development in visual regions. These findings suggest important differences in amygdala reactivity in childhood. PMID:23636982

  19. The United States facing their petroleum dependence

    International Nuclear Information System (INIS)

    Noel, P.

    2002-06-01

In the framework of the "energy crisis" of 2000-2001, the Cheney report, and the question of petroleum dependence, this study presents a critical examination of the United States' petroleum situation, its perception in American political circles, and the public policies implemented during the last ten years. The first section is devoted to petroleum supply. In the second section, American petroleum policy and energy security are studied. (A.L.B.)

  20. Seeing emotions in the eyes – Inverse priming effects induced by eyes expressing mental states

    Directory of Open Access Journals (Sweden)

Caroline Wagenbreth

    2014-09-01

Objective: Automatic emotional processing of faces and facial expressions is increasingly relevant to social communication. Among a variety of different primes, targets and tasks, whole-face images and facial expressions have been used to affectively prime emotional responses. This study investigates whether emotional information provided solely in eye regions that display mental states can also trigger affective priming. Methods: Sixteen subjects performed a lexical decision task (LDT) coupled with an affective priming paradigm. Emotion-associated eye regions were extracted from photographs of faces and acted as primes, whereas targets were either words or pseudo-words. Participants had to decide whether the targets were real German words or generated pseudo-words. Primes and targets belonged to the emotional categories fear, disgust, happiness and neutral. Results: A general valence effect for positive words was observed: responses in the LDT were faster for target words of the emotional category happiness when compared to other categories. Importantly, pictures of emotional eye regions preceding the target words affected their subsequent classification. While we show a classical priming effect for neutral target words (shorter RTs for congruent compared to incongruent prime-target pairs), we observed an inverse priming effect for fearful and happy target words (shorter RTs for incongruent compared to congruent prime-target pairs). These inverse priming effects were driven exclusively by specific prime-target pairs. Conclusion: Reduced facial emotional information is sufficient to induce automatic implicit emotional processing. The emotion-associated eye regions were processed with respect to their emotional valence and affected performance on the LDT.
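The classical vs. inverse priming effects described above come down to a signed difference of mean reaction times between incongruent and congruent prime-target pairs. A minimal sketch with hypothetical trial data (the function and the numbers are illustrative, not the study's materials):

```python
from statistics import mean

# Hypothetical trials: (prime_emotion, target_emotion, reaction_time_ms)
trials = [
    ("happy", "happy", 640), ("fear", "happy", 605),       # incongruent faster
    ("happy", "happy", 655), ("disgust", "happy", 610),
    ("neutral", "neutral", 590), ("fear", "neutral", 630),  # congruent faster
    ("neutral", "neutral", 585), ("happy", "neutral", 625),
]

def priming_effect(trials, target):
    """Mean RT(incongruent) - mean RT(congruent) for one target category.
    Positive = classical priming; negative = inverse priming."""
    congruent = [rt for p, t, rt in trials if t == target and p == t]
    incongruent = [rt for p, t, rt in trials if t == target and p != t]
    return mean(incongruent) - mean(congruent)

print(priming_effect(trials, "neutral"))  # positive: classical priming
print(priming_effect(trials, "happy"))    # negative: inverse priming
```

With these toy numbers, neutral targets show a +40 ms classical effect and happy targets a -40 ms inverse effect, mirroring the pattern the abstract reports.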

  1. ‘Distracters’ do not always distract: Visual working memory for angry faces is enhanced by incidental emotional words.

    Directory of Open Access Journals (Sweden)

    Margaret Cecilia Jackson

    2012-10-01

We are often required to filter out distraction in order to focus on a primary task during which working memory (WM) is engaged. Previous research has shown that negative versus neutral distracters presented during a visual WM maintenance period significantly impair memory for neutral information. However, the contents of WM are often also emotional in nature. The question we address here is how incidental information might impact upon visual WM when both this and the memory items contain emotional information. We presented emotional versus neutral words during the maintenance interval of an emotional visual WM faces task. Participants encoded two angry or happy faces into WM, and several seconds into a 9-second maintenance period a negative, positive, or neutral word was flashed on the screen three times. A single neutral test face was presented for retrieval with a face identity that was either present or absent in the preceding study array. WM for angry face identities was significantly better when an emotional (negative or positive) versus neutral (or no) word was presented. In contrast, WM for happy face identities was not significantly affected by word valence. These findings suggest that the presence of emotion within an intervening stimulus boosts the emotional value of threat-related information maintained in visual WM and thus improves performance. In addition, we show that incidental events that are emotional in nature do not always distract from an ongoing WM task.

  2. Emotional Expression in Simple Line Drawings of a Robot's Face Leads to Higher Offers in the Ultimatum Game

    Directory of Open Access Journals (Sweden)

    Kazunori Terada

    2017-05-01

In the present study, we investigated whether expressing emotional states using a simple line drawing to represent a robot's face can serve to elicit altruistic behavior from humans. An experimental investigation was conducted in which human participants interacted with a humanoid robot whose facial expression was shown on an LCD monitor that was mounted as its head (Study 1). Participants were asked to play the ultimatum game, which is usually used to measure human altruistic behavior. All participants were assigned to be the proposer and were instructed to decide their offer within 1 min by controlling a slider bar. The corners of the robot's mouth, as indicated by the line drawing, simply moved upward or downward depending on the position of the slider bar. The results suggest that the change in the facial expression depicted by a simple line drawing of a face significantly affected the participant's final offer in the ultimatum game. Offers were 13% higher when subjects were shown contingent changes of facial expression. The results were compared with an experiment in a teleoperation setting in which participants interacted with another person through a computer display showing the same line drawings used in Study 1 (Study 2). The results showed that offers were 15% higher if participants were shown a contingent facial expression change. Together, Studies 1 and 2 indicate that emotional expression in simple line drawings of a robot's face elicits the same higher offer from humans as a human telepresence does.

  3. Emotional Expression in Simple Line Drawings of a Robot's Face Leads to Higher Offers in the Ultimatum Game.

    Science.gov (United States)

    Terada, Kazunori; Takeuchi, Chikara

    2017-01-01

In the present study, we investigated whether expressing emotional states using a simple line drawing to represent a robot's face can serve to elicit altruistic behavior from humans. An experimental investigation was conducted in which human participants interacted with a humanoid robot whose facial expression was shown on an LCD monitor that was mounted as its head (Study 1). Participants were asked to play the ultimatum game, which is usually used to measure human altruistic behavior. All participants were assigned to be the proposer and were instructed to decide their offer within 1 min by controlling a slider bar. The corners of the robot's mouth, as indicated by the line drawing, simply moved upward or downward depending on the position of the slider bar. The results suggest that the change in the facial expression depicted by a simple line drawing of a face significantly affected the participant's final offer in the ultimatum game. Offers were 13% higher when subjects were shown contingent changes of facial expression. The results were compared with an experiment in a teleoperation setting in which participants interacted with another person through a computer display showing the same line drawings used in Study 1 (Study 2). The results showed that offers were 15% higher if participants were shown a contingent facial expression change. Together, Studies 1 and 2 indicate that emotional expression in simple line drawings of a robot's face elicits the same higher offer from humans as a human telepresence does.

  4. [Abnormal processing characteristics to basic emotional faces in the early phase in children with autism spectrum disorder].

    Science.gov (United States)

    Lin, Qiong-Xi; Wu, Gui-Hua; Zhang, Ling; Wang, Zeng-Jian; Pan, Ning; Xu, Cai-Juan; Jing, Jin; Jin, Yu

    2018-02-01

To explore the recognition ability and abnormal early-phase processing characteristics for basic emotional faces in children with autism spectrum disorder (ASD), photos of Chinese static faces with four basic emotions (fearful, happy, angry and sad) were used as stimuli. Twenty-five ASD children and twenty-two age- and gender-matched typically developing children (normal controls) were asked to match the emotional faces with words. Event-related potential (ERP) data were recorded concurrently. N170 latencies for total emotion and fearful faces in the left temporal region were faster than in the right in normal controls (P<0.05), but this pattern was not observed in ASD children. Further, N170 latencies in the left temporal region of ASD children were slower than those of normal controls for total emotion, fearful and happy faces (P<0.05), and their N170 latencies in the right temporal region tended to be slower than those of normal controls for angry and fearful faces. The holistic perception speed for emotional faces in the early cognitive processing phase in ASD children is slower than in normal controls. The lateralized response in the early phase of recognizing emotional faces may be aberrant in children with ASD.

  5. Initial Orientation of Attention towards Emotional Faces in Children with Attention Deficit Hyperactivity Disorder

    Directory of Open Access Journals (Sweden)

    Mehrnoosh Ahmadi

    2011-09-01

Objective: Early recognition of negative emotions is considered to be of vital importance. It seems that children with attention deficit hyperactivity disorder (ADHD) have some difficulties recognizing facial emotional expressions, especially negative ones. This study investigated the preference of children with attention deficit hyperactivity disorder for negative (angry, sad) facial expressions compared to normal children. Method: Participants were 35 drug-naive boys with ADHD, aged 6-11 years, and 31 matched healthy children. Visual orientation data were recorded while participants viewed face pairs (negative-neutral pairs) shown for 3000 ms. The number of first fixations made to each expression was considered as an index of initial orientation. Results: Group comparisons revealed no difference between the attention deficit hyperactivity disorder group and their matched healthy counterparts in initial orientation of attention. A tendency towards negative emotions was found within the normal group, while no difference was observed between initial allocation of attention toward negative and neutral expressions in children with ADHD. Conclusion: Children with attention deficit hyperactivity disorder do not have a significant preference for negative facial expressions. In contrast, normal children have a significant preference for negative facial emotions rather than neutral faces.
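The initial-orientation index used in this paradigm is simply the proportion of trials on which the first fixation lands on the negative member of each negative-neutral face pair. A minimal sketch with hypothetical eye-tracking data (the log format and function name are illustrative assumptions):

```python
from collections import Counter

# Hypothetical log: for each 3000 ms trial, the region ("negative" or
# "neutral") that received the participant's first fixation.
first_fixations = ["negative", "neutral", "negative", "negative",
                   "neutral", "negative", "negative", "neutral"]

def initial_orientation_index(first_fixations):
    """Proportion of trials whose first fixation landed on the negative
    face of the negative-neutral pair; 0.5 = no orienting bias."""
    counts = Counter(first_fixations)
    return counts["negative"] / len(first_fixations)

bias = initial_orientation_index(first_fixations)
print(bias)  # 5/8 = 0.625, i.e. a bias toward negative faces
```

A value reliably above 0.5 would correspond to the negativity preference the abstract reports for the control group.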

  6. Priming the Secure Attachment Schema Affects the Emotional Face Processing Bias in Attachment Anxiety: An fMRI Research

    Directory of Open Access Journals (Sweden)

    Xu Chen

    2017-04-01

Our study explored how priming with a secure base schema affects the processing of emotional facial stimuli in individuals with attachment anxiety. We enrolled 42 undergraduate students between 18 and 27 years of age, and divided them into two groups: attachment-anxious and attachment-secure. All participants were primed under two conditions: secure priming using references to the partner, and neutral priming using neutral references. We performed repeated attachment security priming combined with a dual-task paradigm and functional magnetic resonance imaging. Participants' reaction times in responding to the facial stimuli were also measured. Attachment security priming can facilitate an individual's processing of positive emotional faces; for instance, the presentation of the partner's name was associated with stronger activity in a wide range of brain regions and faster reaction times for positive facial expressions. The current finding of higher activity in left-hemisphere regions for secure priming rather than neutral priming is consistent with the prediction that attachment security priming triggers the spread of activation of a positive emotional state. However, the difference in brain activity during processing of both positive and negative emotional facial stimuli between the two priming conditions appeared in the attachment anxiety group alone. This study indicates that the effect of attachment secure priming on the processing of emotional facial stimuli could be mediated by chronic attachment anxiety. In addition, it highlights the association between higher-order processes of the attachment system (secure attachment schema priming) and the early-stage information processing system (attention), given the increased attention toward the effects of secure base schema on the processing of emotion- and attachment-related information among the insecure population.

  7. Hemifacial Display of Emotion in the Resting State

    Directory of Open Access Journals (Sweden)

    M. K. Mandal

    1992-01-01

The human face at rest displays distinguishable asymmetries, with some lateralization of emotion or expression. The asymmetrical nature of the resting face was examined by preparing hemifacial composites (left–left and right–right) along with the normal facial orientation. The left-side and right-side composites were constructed by using the lateral half of one side of the face and its mirror-reversal. The left-side facial composites were found to be more emotional than the right-side or normal facial orientations of neutral expressions.
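The composite construction described here is a simple image operation: take one lateral half of a horizontally centered face image and concatenate it with its own mirror image. A minimal NumPy sketch, assuming an already-aligned image array (the function name is illustrative):

```python
import numpy as np

def hemifacial_composite(face, side="left"):
    """Build a left-left or right-right composite from a face image
    (H x W array, face assumed horizontally centered on the midline)."""
    half_width = face.shape[1] // 2
    if side == "left":
        half = face[:, :half_width]
        return np.hstack([half, half[:, ::-1]])   # left half + its mirror
    half = face[:, -half_width:]
    return np.hstack([half[:, ::-1], half])       # mirror + right half

# Toy 4x4 "face": columns differ, so the two composites differ.
face = np.arange(16).reshape(4, 4)
left_left = hemifacial_composite(face, "left")
right_right = hemifacial_composite(face, "right")
print(left_left[0])   # first row is mirror-symmetric about the midline
```

By construction each composite is perfectly bilaterally symmetric, which is exactly what lets the two sides of the face be rated independently.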

  8. A framework for investigating the use of face features to identify spontaneous emotions

    OpenAIRE

    Bezerra, Giuliana Silva

    2014-01-01

Emotion-based analysis has raised a lot of interest, particularly in areas such as forensics, medicine, music, psychology, and human-machine interfaces. Following this trend, facial analysis (either automatic or human-based) is the most commonly investigated approach, since this type of data can easily be collected and is well accepted in the literature as a metric for inference of emotional states. Despite this popularity, due to several constraints found in real world scenarios (...

  9. Oxytocin effects on emotional response to others' faces via serotonin system in autism: A pilot study.

    Science.gov (United States)

    Fukai, Mina; Hirosawa, Tetsu; Kikuchi, Mitsuru; Ouchi, Yasuomi; Takahashi, Tetsuya; Yoshimura, Yuko; Miyagishi, Yoshiaki; Kosaka, Hirotaka; Yokokura, Masamichi; Yoshikawa, Etsuji; Bunai, Tomoyasu; Minabe, Yoshio

    2017-09-30

The oxytocin (OT)-related serotonergic system is thought to play an important role in the etiology and social symptoms of autism spectrum disorder (ASD). However, no evidence exists for a relation between the prosocial effect of chronic OT administration and the brain serotonergic system. Ten male subjects with ASD were administered OT for 8-10 weeks in an open-label, single-arm, non-randomized, uncontrolled manner. Before and during the OT treatment, positron emission tomography was used with the [11C]-3-amino-4-(2-[(dimethylamino)methyl]phenylthio)benzonitrile ([11C]DASB) radiotracer, and binding of the serotonin transporter ([11C]DASB BPND) was estimated. The main outcome measures were changes in [11C]DASB BPND and changes in the emotional response to others' faces. No significant change was found in the emotional response to others' faces after the 8-10 week OT treatment. However, the increased serotonin transporter (SERT) level in the striatum after treatment was correlated significantly with increased negative emotional response to human faces. This study revealed a relation between changes in the serotonergic system and in prosociality after chronic OT administration. Additional studies must be conducted to verify the chronic OT effects on social behavior via the serotonergic system. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  10. Emotional Mood States and the Recall of Childhood Memories.

    Science.gov (United States)

    Monteiro, Kenneth P.; Haviland, Jeannette M.

    Recently some psychologists have shown a renewed interest in the relationship between cognition and emotion and have begun to examine the relationship between the representation and processing of factual and emotional information. To investigate the role of emotional state in personal memory retrieval, a study was undertaken to replicate and…

  11. Toward an implicit measure of emotions: ratings of abstract images reveal distinct emotional states.

    Science.gov (United States)

    Bartoszek, Gregory; Cervone, Daniel

    2017-11-01

    Although implicit tests of positive and negative affect exist, implicit measures of distinct emotional states are scarce. Three experiments examined whether a novel implicit emotion-assessment task, the rating of emotion expressed in abstract images, would reveal distinct emotional states. In Experiment 1, participants exposed to a sadness-inducing story inferred more sadness, and less happiness, in abstract images. In Experiment 2, an anger-provoking interaction increased anger ratings. In Experiment 3, compared to neutral images, spider images increased fear ratings in spider-fearful participants but not in controls. In each experiment, the implicit task indicated elevated levels of the target emotion and did not indicate elevated levels of non-target negative emotions; the task thus differentiated among emotional states of the same valence. Correlations also supported the convergent and discriminant validity of the implicit task. Supporting the possibility that heuristic processes underlie the ratings, group differences were stronger among those who responded relatively quickly.

  12. Evaluating the Emotional State of a User Using a Webcam

    OpenAIRE

    Martin Magdin; Milan Turcani; Lukas Hudec

    2016-01-01

In online learning, it is more difficult for teachers to see how individual students behave. Students' affective characteristics, such as self-esteem, motivation, and commitment, are believed to be determinants of performance and cannot be ignored, as affective states (and also learning styles) are known to greatly influence learning. The ability of a computer to evaluate the emotional state of the user is receiving growing attention. By evaluating the emotional state, there i...

  13. Intranasal Oxytocin Administration Dampens Amygdala Reactivity towards Emotional Faces in Male and Female PTSD Patients.

    Science.gov (United States)

    Koch, Saskia Bj; van Zuiden, Mirjam; Nawijn, Laura; Frijling, Jessie L; Veltman, Dick J; Olff, Miranda

    2016-05-01

Post-traumatic stress disorder (PTSD) is a disabling psychiatric disorder. As a substantial part of PTSD patients responds poorly to currently available psychotherapies, pharmacological interventions boosting treatment response are needed. Because of its anxiolytic and pro-social properties, the neuropeptide oxytocin (OT) has been proposed as a promising strategy for treatment augmentation in PTSD. As a first step to investigate the therapeutic potential of OT in PTSD, we conducted a double-blind, placebo-controlled, cross-over functional MRI study examining OT administration effects (40 IU) on amygdala reactivity toward emotional faces in unmedicated male and female police officers with (n=37, 21 males) and without (n=40, 20 males) PTSD. Trauma-exposed controls were matched to PTSD patients based on age, sex, years of service, and educational level. Under placebo, the expected valence-dependent amygdala reactivity (i.e., greater activity toward fearful-angry faces compared with happy-neutral faces) was absent in PTSD patients. OT administration dampened amygdala reactivity toward all emotional faces in male and female PTSD patients, but enhanced amygdala reactivity in healthy male and female trauma-exposed controls, independent of sex and stimulus valence. In PTSD patients, greater anxiety prior to scanning and greater amygdala reactivity during the placebo session were associated with greater reduction of amygdala reactivity after OT administration. Taken together, our results indicate presumably beneficial neurobiological effects of OT administration in male and female PTSD patients. Future studies should investigate OT administration in clinical settings to fully appreciate its therapeutic potential.

  14. Emotion expression of an affective state space; a humanoid robot displaying a dynamic emotional state during a soccer game

    NARCIS (Netherlands)

    van der Mey, A.; Smit, F; Droog, K.J.; Visser, A.

    2010-01-01

Following a soccer game is an example of a situation in which clear emotions are displayed. This example is worked out for a humanoid robot that can express emotions with body language. The emotions expressed by the robot are not just stimulus-response reactions, but are based on an affective state that shows dynamic behavior.

  15. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers

    Directory of Open Access Journals (Sweden)

    Laura A. Thomas

    2014-04-01

Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N = 20), SMD (N = 18), and HV (N = 22) during "Aware" and "Non-aware" priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms), and shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders.

  16. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers.

    Science.gov (United States)

    Thomas, Laura A; Brotman, Melissa A; Bones, Brian L; Chen, Gang; Rosen, Brooke H; Pine, Daniel S; Leibenluft, Ellen

    2014-04-01

Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N=20), SMD (N=18), and HV (N=22) during "Aware" and "Non-aware" priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms), and shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. More than words (and faces): evidence for a Stroop effect of prosody in emotion word processing.

    Science.gov (United States)

    Filippi, Piera; Ocklenburg, Sebastian; Bowling, Daniel L; Heege, Larissa; Güntürkün, Onur; Newen, Albert; de Boer, Bart

    2017-08-01

    Humans typically combine linguistic and nonlinguistic information to comprehend emotions. We adopted an emotion identification Stroop task to investigate how different channels interact in emotion communication. In experiment 1, synonyms of "happy" and "sad" were spoken with happy and sad prosody. Participants had more difficulty ignoring prosody than ignoring verbal content. In experiment 2, synonyms of "happy" and "sad" were spoken with happy and sad prosody, while happy or sad faces were displayed. Accuracy was lower when two channels expressed an emotion that was incongruent with the channel participants had to focus on, compared with the cross-channel congruence condition. When participants were required to focus on verbal content, accuracy was significantly lower also when prosody was incongruent with verbal content and face. This suggests that prosody biases emotional verbal content processing, even when conflicting with verbal content and face simultaneously. Implications for multimodal communication and language evolution studies are discussed.

  18. Effect of positive emotion on consolidation of memory for faces: the modulation of facial valence and facial gender.

    Science.gov (United States)

    Wang, Bo

    2013-01-01

Studies have shown that emotion elicited after learning enhances memory consolidation. However, no prior studies have used facial photos as stimuli. This study examined the effect of post-learning positive emotion on consolidation of memory for faces. During learning, participants viewed neutral, positive, or negative faces. They were then assigned to a condition in which they watched either a 9-minute positive video clip or a 9-minute neutral video. Thirty minutes after learning, participants took a surprise memory test, in which they made "remember", "know", and "new" judgements. The findings are: (1) positive emotion enhanced consolidation of recognition for negative male faces, but impaired consolidation of recognition for negative female faces; (2) for male faces, recognition of negative faces was equivalent to that of positive faces; for female faces, recognition of negative faces was better than that of positive faces. Our study provides important evidence that the effect of post-learning emotion on memory consolidation can extend to facial stimuli, and that such an effect can be modulated by facial valence and facial gender. The findings may shed light on establishing models concerning the influence of emotion on memory consolidation.

  19. Personality, Attentional Biases towards Emotional Faces and Symptoms of Mental Disorders in an Adolescent Sample.

    Science.gov (United States)

    O'Leary-Barrett, Maeve; Pihl, Robert O; Artiges, Eric; Banaschewski, Tobias; Bokde, Arun L W; Büchel, Christian; Flor, Herta; Frouin, Vincent; Garavan, Hugh; Heinz, Andreas; Ittermann, Bernd; Mann, Karl; Paillère-Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Poustka, Luise; Rietschel, Marcella; Robbins, Trevor W; Smolka, Michael N; Ströhle, Andreas; Schumann, Gunter; Conrod, Patricia J

    2015-01-01

    To investigate the role of personality factors and attentional biases towards emotional faces, in establishing concurrent and prospective risk for mental disorder diagnosis in adolescence. Data were obtained as part of the IMAGEN study, conducted across 8 European sites, with a community sample of 2257 adolescents. At 14 years, participants completed an emotional variant of the dot-probe task, as well as two personality measures, namely the Substance Use Risk Profile Scale and the revised NEO Personality Inventory. At 14 and 16 years, participants and their parents were interviewed to determine symptoms of mental disorders. Personality traits were general and specific risk indicators for mental disorders at 14 years. Increased specificity was obtained when investigating the likelihood of mental disorders over a 2-year period, with the Substance Use Risk Profile Scale showing incremental validity over the NEO Personality Inventory. Attentional biases to emotional faces did not characterise or predict mental disorders examined in the current sample. Personality traits can indicate concurrent and prospective risk for mental disorders in a community youth sample, and identify at-risk youth beyond the impact of baseline symptoms. This study does not support the hypothesis that attentional biases mediate the relationship between personality and psychopathology in a community sample. Task and sample characteristics that contribute to differing results among studies are discussed.
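The attentional-bias index in dot-probe studies is conventionally the difference between mean reaction times on incongruent trials (probe replaces the neutral face) and congruent trials (probe replaces the emotional face), with positive values indicating vigilance toward emotion. A minimal sketch with invented reaction times:

```python
from statistics import mean

# Invented reaction times (ms) for a single hypothetical participant.
congruent_rt   = [512, 498, 530, 505]   # probe at emotional-face location
incongruent_rt = [540, 525, 551, 518]   # probe at neutral-face location

# Positive bias = faster responding when the probe replaces the
# emotional face, i.e. attention was already there.
bias = mean(incongruent_rt) - mean(congruent_rt)
print(f"attentional bias = {bias:.2f} ms")
```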

  20. Scanning patterns of faces do not explain impaired emotion recognition in Huntington Disease: Evidence for a high level mechanism

    Directory of Open Access Journals (Sweden)

    Marieke van Asselen

    2012-02-01

    Previous studies in patients with amygdala lesions suggested that deficits in emotion recognition might be mediated by impaired scanning patterns of faces. Here we investigated whether scanning patterns also contribute to the selective impairment in recognition of disgust in Huntington disease (HD). To achieve this goal, we recorded eye movements during a two-alternative forced-choice emotion recognition task. HD patients in presymptomatic (n=16) and symptomatic (n=9) disease stages were tested and their performance was compared to a control group (n=22). In our emotion recognition task, participants had to indicate whether a face reflected one of six basic emotions. In addition, in order to define whether emotion recognition was altered when participants were forced to look at a specific component of the face, we used a second task in which only limited facial information was provided (eyes/mouth in partially masked faces). Behavioural results showed no differences in the ability to recognize emotions between presymptomatic gene carriers and controls. However, an emotion recognition deficit was found for all six basic emotion categories in early-stage HD. Analysis of eye movement patterns showed that patients and controls used similar scanning strategies. Patterns of deficits were similar regardless of whether parts of the faces were masked or not, confirming that selective attention to particular face parts does not underlie the deficits. These results suggest that the emotion recognition deficits in symptomatic HD patients cannot be explained by impaired scanning patterns of faces. Furthermore, no selective deficit for recognition of disgust was found in presymptomatic HD patients.

  1. Gender Differences in Human Single Neuron Responses to Male Emotional Faces

    Directory of Open Access Journals (Sweden)

    Morgan Newhoff

    2015-09-01

    Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single-neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity from 14 epileptic patients (6 male) in four brain areas: amygdala (236 neurons), hippocampus (n=270), anterior cingulate cortex (n=256), and ventromedial prefrontal cortex (n=174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative, or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n=15/66) of neurons in men were significantly affected by facial emotion, versus 8% (n=6/76) of neurons in women. A Fisher's exact test comparing the two ratios found a highly significant difference between them (p<0.01). These results show specific differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.
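The Fisher's exact test above compares two proportions of emotion-responsive neurons (15/66 in men vs 6/76 in women). As a stdlib-only sketch, the two-sided test can be computed from the hypergeometric distribution by summing the probabilities of all tables at least as extreme as the observed one:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]],
    summing hypergeometric probabilities no larger than the observed
    table's probability (the minimum-likelihood convention)."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d

    def p_table(x):
        # Probability of x successes in row 1 under the null hypothesis.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Counts from the abstract: men 15 responsive / 51 not; women 6 / 70.
p = fisher_exact_two_sided(15, 51, 6, 70)
print(f"two-sided p = {p:.4f}")
```

With SciPy available, `scipy.stats.fisher_exact([[15, 51], [6, 70]])` gives the same result; the from-scratch version is shown to make the computation explicit.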

  2. Oxytocin and social pretreatment have similar effects on processing of negative emotional faces in healthy adult males

    Directory of Open Access Journals (Sweden)

    Anna Kis

    2013-08-01

    Oxytocin has been shown to affect several aspects of human social cognition, including facial emotion processing. There is also evidence that social stimuli (such as eye contact) can effectively modulate endogenous oxytocin levels. In the present study we directly tested whether intranasal oxytocin administration and pretreatment with social stimuli had similar effects on face processing at the behavioural level. Subjects (N=52 healthy adult males) were presented with a set of faces with expressions of different valence (negative, neutral, positive) following different types of pretreatment (oxytocin (OT) or placebo (PL), and social interaction (Soc) or no social interaction (NSoc); N=13 in each group) and were asked to rate all faces for perceived emotion and trustworthiness. On the next day, subjects' recognition memory was tested on a set of neutral faces, and they additionally had to rate each face again for trustworthiness and emotion. Subjects in both the OT and the Soc pretreatment groups (as compared to the PL and NSoc groups) gave higher emotion and trustworthiness scores to faces with negative emotional expressions. Moreover, 24 h later, subjects in the OT and Soc groups (unlike those in the control groups) gave lower trustworthiness scores to previously negative faces than to faces previously seen as emotionally neutral or positive. In sum, these results provide the first direct evidence of the similar effects of intranasal oxytocin administration and social stimulation on the perception of negative facial emotions as well as on the delayed recall of negative emotional information.

  3. Neural correlates of top-down processing in emotion perception: an ERP study of emotional faces in white noise versus noise-alone stimuli.

    Science.gov (United States)

    Lee, Kyu-Yong; Lee, Tae-Ho; Yoon, So-Jeong; Cho, Yang Seok; Choi, June-Seek; Kim, Hyun Taek

    2010-06-14

    In the present study, we investigated the neural correlates underlying the perception of emotion in response to facial stimuli, in order to elucidate the extent to which emotional perception is affected by top-down processes. Subjects performed a forced two-choice emotion discrimination task on ambiguous visual stimuli consisting of emotional faces embedded in different levels of visual white noise, including white-noise-alone stimuli. ERP recordings and behavioral responses were analyzed according to four response categories: hit, miss, false alarm, and correct rejection. We observed enlarged EPN and LPP amplitudes when subjects reported seeing fearful faces, and a typical emotional EPN response in the white-noise-alone condition when fearful faces were not presented. These two ERP components, whose modulation is characteristic of emotional processing, reflected the type of emotion each subject subjectively perceived. The results suggest that top-down modulation might be indispensable for emotional perception, which consists of two distinct stages of stimulus processing in the brain. (c) 2010 Elsevier B.V. All rights reserved.
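The four response categories named above are the standard signal-detection cells. With invented counts (not the study's data), hit and false-alarm rates combine into a sensitivity index d′ via the Gaussian signal-detection formula:

```python
from statistics import NormalDist

# Invented response counts for one hypothetical subject: "signal" trials
# contain a face in noise, "noise" trials are noise alone.
counts = {"hit": 38, "miss": 12, "false_alarm": 9, "correct_rejection": 41}

hit_rate = counts["hit"] / (counts["hit"] + counts["miss"])
fa_rate = counts["false_alarm"] / (counts["false_alarm"]
                                   + counts["correct_rejection"])

# d' = z(hit rate) - z(false-alarm rate), with z the inverse normal CDF.
z = NormalDist().inv_cdf
d_prime = z(hit_rate) - z(fa_rate)
print(f"hit rate={hit_rate:.2f}, FA rate={fa_rate:.2f}, d'={d_prime:.2f}")
```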

  4. Facing the Problem: Impaired Emotion Recognition During Multimodal Social Information Processing in Borderline Personality Disorder.

    Science.gov (United States)

    Niedtfeld, Inga; Defiebre, Nadine; Regenbogen, Christina; Mier, Daniela; Fenske, Sabrina; Kirsch, Peter; Lis, Stefanie; Schmahl, Christian

    2017-04-01

    Previous research has revealed alterations and deficits in facial emotion recognition in patients with borderline personality disorder (BPD). During interpersonal communication in daily life, social signals such as speech content, variation in prosody, and facial expression need to be considered simultaneously. We hypothesized that deficits in higher level integration of social stimuli contribute to difficulties in emotion recognition in BPD, and heightened arousal might explain this effect. Thirty-one patients with BPD and thirty-one healthy controls were asked to identify emotions in short video clips, which were designed to represent different combinations of the three communication channels: facial expression, speech content, and prosody. Skin conductance was recorded as a measure of sympathetic arousal, while controlling for state dissociation. Patients with BPD showed lower mean accuracy scores than healthy control subjects in all conditions comprising emotional facial expressions. This was true for the condition with facial expression only, and for the combination of all three communication channels. Electrodermal responses were enhanced in BPD only in response to auditory stimuli. In line with the major body of facial emotion recognition studies, we conclude that deficits in the interpretation of facial expressions lead to the difficulties observed in multimodal emotion processing in BPD.

  5. Recognition memory for low- and high-frequency-filtered emotional faces: Low spatial frequencies drive emotional memory enhancement, whereas high spatial frequencies drive the emotion-induced recognition bias.

    Science.gov (United States)

    Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk

    2017-07-01

    This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement (better long-term memory for emotional than for neutral stimuli) and the emotion-induced recognition bias (a more liberal response criterion for emotional than for neutral stimuli). Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role in the influence of emotion on attention and judgment. Given this theoretical background, we investigated in a series of old/new recognition experiments whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus-features account. The double dissociation in the results favors the latter account, that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.
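The two phenomena this article separates map onto the two standard signal-detection quantities for old/new recognition: sensitivity d′ (the memory advantage) and criterion c (the recognition bias, where a more negative c means more liberal "old" responding). A sketch with invented hit and false-alarm rates shows how the criterion can shift while sensitivity stays constant:

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # inverse normal CDF

def sdt(hit_rate, fa_rate):
    # Standard equal-variance Gaussian signal-detection measures.
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Invented rates chosen so that sensitivity is identical but the
# criterion for emotional faces is more liberal (lower c).
d_neutral, c_neutral = sdt(hit_rate=0.70, fa_rate=0.20)
d_emotional, c_emotional = sdt(hit_rate=0.80, fa_rate=0.30)

print(f"neutral:   d'={d_neutral:.2f}, c={c_neutral:.2f}")
print(f"emotional: d'={d_emotional:.2f}, c={c_emotional:.2f}")
```

In this constructed example d′ is the same in both conditions while c is lower for emotional faces: a pure recognition bias with no memory enhancement, which is exactly the dissociation the article's analyses are designed to detect.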

  6. Dispositional fear, negative affectivity, and neuroimaging response to visually suppressed emotional faces.

    Science.gov (United States)

    Vizueta, Nathalie; Patrick, Christopher J; Jiang, Yi; Thomas, Kathleen M; He, Sheng

    2012-01-02

    "Invisible" stimulus paradigms provide a method for investigating basic affective processing in clinical and non-clinical populations. Neuroimaging studies utilizing continuous flash suppression (CFS) have shown increased amygdala response to invisible fearful versus neutral faces. The current study used CFS in conjunction with functional MRI to test for differences in brain reactivity to visible and invisible emotional faces in relation to two distinct trait dimensions relevant to psychopathology: negative affectivity (NA) and fearfulness. Subjects consisted of college students (N=31) assessed for fear/fearlessness along with dispositional NA. The main brain regions of interest included the fusiform face area (FFA), superior temporal sulcus (STS), and amygdala. Higher NA, but not trait fear, was associated with enhanced response to fearful versus neutral faces in STS and right amygdala (but not FFA), within the invisible condition specifically. The finding that NA rather than fearfulness predicted degree of amygdala reactivity to suppressed faces implicates the input subdivision of the amygdala in the observed effects. Given the central role of NA in anxiety and mood disorders, the current data also support use of the CFS methodology for investigating the neurobiology of these disorders. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. Spatiotemporal brain dynamics of emotional face processing modulations induced by the serotonin 1A/2A receptor agonist psilocybin.

    Science.gov (United States)

    Bernasconi, Fosco; Schmidt, André; Pokorny, Thomas; Kometer, Michael; Seifritz, Erich; Vollenweider, Franz X

    2014-12-01

    Emotional face processing is critically modulated by the serotonergic system. For instance, emotional face processing is impaired by acute psilocybin administration, a serotonin (5-HT) 1A and 2A receptor agonist. However, the spatiotemporal brain mechanisms underlying these modulations are poorly understood. Here, we investigated the spatiotemporal brain dynamics underlying psilocybin-induced modulations during emotional face processing. Electrical neuroimaging analyses were applied to visual evoked potentials in response to emotional faces, following psilocybin and placebo administration. Our results indicate a first time period of strength (i.e., Global Field Power) modulation over the 168-189 ms poststimulus interval, induced by psilocybin. A second time period of strength modulation was identified over the 211-242 ms poststimulus interval. Source estimations over these 2 time periods further revealed decreased activity in response to both neutral and fearful faces within limbic areas, including amygdala and parahippocampal gyrus, and the right temporal cortex over the 168-189 ms interval, and reduced activity in response to happy faces within limbic and right temporo-occipital brain areas over the 211-242 ms interval. Our results indicate a selective and temporally dissociable effect of psilocybin on the neuronal correlates of emotional face processing, consistent with a modulation of the top-down control. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. Prudence, Personality, Cognitive Ability and Emotional State

    NARCIS (Netherlands)

    Breaban, Adriana; van de Kuilen, Gijs; Noussair, Charles

    2016-01-01

    We report an experiment to consider the emotional correlates of prudent decision making. In the experiment, we present subjects with lotteries and measure their emotional response with facial recognition software. They then make binary choices between risky lotteries that distinguish prudent from

  10. Altered Functional Subnetwork During Emotional Face Processing: A Potential Intermediate Phenotype for Schizophrenia.

    Science.gov (United States)

    Cao, Hengyi; Bertolino, Alessandro; Walter, Henrik; Schneider, Michael; Schäfer, Axel; Taurisano, Paolo; Blasi, Giuseppe; Haddad, Leila; Grimm, Oliver; Otto, Kristina; Dixson, Luanna; Erk, Susanne; Mohnke, Sebastian; Heinz, Andreas; Romanczuk-Seiferth, Nina; Mühleisen, Thomas W; Mattheisen, Manuel; Witt, Stephanie H; Cichon, Sven; Noethen, Markus; Rietschel, Marcella; Tost, Heike; Meyer-Lindenberg, Andreas

    2016-06-01

    Although deficits in emotional processing are prominent in schizophrenia, it has been difficult to identify neural mechanisms related to the genetic risk for this highly heritable illness. Prior studies have not found consistent regional activation or connectivity alterations in first-degree relatives compared with healthy controls, suggesting that a more comprehensive search for connectomic biomarkers is warranted. To identify a potential systems-level intermediate phenotype linked to emotion processing in schizophrenia and to examine the psychological association, task specificity, test-retest reliability, and clinical validity of the identified phenotype. The study was performed in university research hospitals from June 1, 2008, through December 31, 2013. We examined 58 unaffected first-degree relatives of patients with schizophrenia and 94 healthy controls with an emotional face-matching functional magnetic resonance imaging paradigm. Test-retest reliability was analyzed with an independent sample of 26 healthy participants. A clinical association study was performed in 31 patients with schizophrenia and 45 healthy controls. Data analysis was performed from January 1 to September 30, 2014. Conventional amygdala activity and seeded connectivity measures, graph-based global and local network connectivity measures, Spearman rank correlation, intraclass correlation, and gray matter volumes. Among the 152 volunteers included in the relative-control sample, 58 were unaffected first-degree relatives of patients with schizophrenia (mean [SD] age, 33.29 [12.56]; 38 were women), and 94 were healthy controls without a first-degree relative with mental illness (mean [SD] age, 32.69 [10.09] years; 55 were women). A graph-theoretical connectivity approach identified significantly decreased connectivity in a subnetwork that primarily included the limbic cortex, visual cortex, and subcortex during emotional face processing (cluster-level P corrected for familywise error =

  11. Emotional state of “young” fathers

    Directory of Open Access Journals (Sweden)

    Hanna Liberska

    2016-10-01

    Background: The birth of the first child begins a new stage in family life, and the woman and the man must adopt new roles in society. However, adapting to the new conditions of life and the requirements of the new role can be difficult. Participants and procedure: The main tools used in the study were the SUPIN scale, the STAI inventory, and a questionnaire constructed by the authors. The participants were 90 men who had become first-time fathers in the 6 months prior to the study. Results: The results indicate that first-time fathers show a medium level of state anxiety related to the current situation and a low level of trait anxiety understood as an enduring disposition. The level of anxiety was related to the age of the child, but only in men who fathered a son: the older the son, the greater the intensity of state anxiety in the father. Conclusions: The deep conviction that the father should be a role model for his son can be a source of anxiety about the ability to cope and to meet the related responsibility. According to the tradition of our culture, the father has to prepare the son to be a man, to assume a man's roles in society, and to teach him how to live. The lower intensity of positive emotions related to the birth of a daughter can be explained from the point of view of the "true man" stereotype: a daughter does not fulfil it.

  12. Consciousness and arousal effects on emotional face processing as revealed by brain oscillations. A gamma band analysis.

    Science.gov (United States)

    Balconi, Michela; Lucchiari, Claudio

    2008-01-01

    It remains an open question whether a single brain operation or psychological function for facial emotion decoding can be assigned to a certain type of oscillatory activity. Gamma band activity (GBA) offers an adequate tool for studying cortical activation patterns during emotional face information processing. In the present study, brain oscillations were analyzed in response to facial expressions of emotion. Specifically, GBA modulation was measured while twenty subjects looked at emotional (angry, fearful, happy, and sad) or neutral faces in two different conditions: supraliminal (150 ms) vs subliminal (10 ms) stimulation (100 target-mask pairs per condition). The results showed that both consciousness and the significance of the stimulus in terms of arousal can modulate power synchronization (ERD decrease) in the 150-350 ms time range: an early oscillatory event peaked at about 200 ms post-stimulus. GBA was enhanced more by supraliminal than by subliminal elaboration, and more by high-arousal (anger and fear) than by low-arousal (happiness and sadness) emotions. Finally, a left-posterior dominance for conscious elaboration was found, whereas the right hemisphere was discriminant in emotional processing of faces compared with neutral faces.

  13. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion

    Directory of Open Access Journals (Sweden)

    Daiming Xiu

    2015-04-01

    This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data were acquired during incidental learning of positive ('happy'), neutral, and negative ('angry' or 'fearful') faces. Dynamic Causal Modeling (DCM) was applied to the fMRI data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and areas related to successful memory formation (hippocampus, superior parietal lobule, amygdala, and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feedforward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of these pathways to the orbitofrontal cortex, the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support the effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  14. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion.

    Science.gov (United States)

    Guo, Kun; Soornack, Yoshi; Settle, Rebecca

    2018-03-05

    Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution up to 48 × 64 pixels or increasing image blur up to 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity rating, increased reaction time and fixation duration, and stronger central fixation bias which was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent with less deterioration impact on happy and surprise expressions, suggesting this distortion-invariant facial expression perception might be achieved through the categorical model involving a non-linear configural combination of local facial features. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. The human body odor compound androstadienone leads to anger-dependent effects in an emotional Stroop but not dot-probe task using human faces.

    Science.gov (United States)

    Hornung, Jonas; Kogler, Lydia; Wolpert, Stephan; Freiherr, Jessica; Derntl, Birgit

    2017-01-01

    The androgen derivative androstadienone is a substance found in human sweat and thus a putative human chemosignal. Androstadienone has been studied with respect to its effects on mood states, attractiveness ratings, and physiological and neural activation. With the current experiment, we aimed to explore in which way androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether effects depend on specific emotions, the participants' sex, and individual sensitivity to the smell of androstadienone. To do so, we investigated 56 healthy individuals (including 29 females taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone and once under placebo exposure, in pseudorandomized order). With an emotional dot-probe task we measured visuo-spatial cueing, while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex-, task- and emotion-specific manner, as a reduction in interference in the emotional Stroop task was apparent only for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times between congruent and incongruent trials. At the same time, women were also slightly affected by smelling androstadienone, as they classified angry faces correctly more often under androstadienone. For the emotional dot-probe task, no modulation by androstadienone was observed. Furthermore, in both attention paradigms, individual sensitivity to androstadienone was correlated with neither reaction times nor error rates in men or women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while processes of visuo-spatial cueing remain unaffected.

  16. Reduced amygdala and ventral striatal activity to happy faces in PTSD is associated with emotional numbing.

    Directory of Open Access Journals (Sweden)

    Kim L Felmingham

    There has been growing recognition of the importance of reward processing in PTSD, yet little is known of the underlying neural networks. This study tested the predictions that (1) individuals with PTSD would display reduced responses to happy facial expressions in ventral striatal reward networks, and (2) this reduction would be associated with emotional numbing symptoms. Twenty-three treatment-seeking patients with posttraumatic stress disorder were recruited from the treatment clinic at the Centre for Traumatic Stress Studies, Westmead Hospital, and 20 trauma-exposed controls were recruited from a community sample. We examined functional magnetic resonance imaging responses during the presentation of happy and neutral facial expressions in a passive viewing task. PTSD participants rated happy facial expressions as less intense than trauma-exposed controls did. Relative to controls, PTSD participants revealed lower activation to happy (versus neutral) faces in the ventral striatum, and a trend for reduced activation in the left amygdala. A significant negative correlation was found between emotional numbing symptoms in PTSD and right ventral striatal regions after controlling for depression, anxiety, and PTSD severity. This study provides initial evidence that individuals with PTSD have lower reactivity to happy facial expressions, and that lower activation in ventral striatal-limbic reward networks may be associated with symptoms of emotional numbing.

  17. Neural markers of emotional face perception across psychotic disorders and general population.

    Science.gov (United States)

    Sabharwal, Amri; Kotov, Roman; Szekely, Akos; Leung, Hoi-Chung; Barch, Deanna M; Mohanty, Aprajita

    2017-07-01

    There is considerable variation in negative and positive symptoms of psychosis, global functioning, and emotional face perception (EFP), not only in schizophrenia but also in other psychotic disorders and healthy individuals. However, EFP impairment and its association with worse symptoms and global functioning have been examined largely in the domain of schizophrenia. The present study adopted a dimensional approach to examine the association of behavioral and neural measures of EFP with symptoms of psychosis and global functioning across individuals with schizophrenia spectrum (SZ; N = 28) and other psychotic (OP; N = 29) disorders, and never-psychotic participants (NP; N = 21). Behavioral and functional MRI data were recorded as participants matched emotional expressions of faces and geometrical shapes. Lower accuracy and increased activity in early visual regions, hippocampus, and amygdala during emotion versus shape matching were associated with higher negative, but not positive, symptoms and lower global functioning, across all participants. This association remained even after controlling for group-related (SZ, OP, and NP) variance, dysphoria, and antipsychotic medication status, except in amygdala. Furthermore, negative symptoms mediated the relationship between behavioral and brain EFP measures and global functioning. This study provides some of the first evidence supporting the specific relationship of EFP measures with negative symptoms and global functioning across psychotic and never-psychotic samples, and transdiagnostically across different psychotic disorders. Present findings help bridge the gap between basic EFP-related neuroscience research and clinical research in psychosis, and highlight EFP as a potential symptom-specific marker that tracks global functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. Emotional expectations influence neural sensitivity to fearful faces in humans: An event-related potential study

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The present study tested whether neural sensitivity to salient emotional facial expressions was influenced by emotional expectations induced by a cue that validly predicted the expression of a subsequently presented target face. Event-related potentials (ERPs) elicited by fearful and neutral faces were recorded while participants performed a gender discrimination task under cued (‘expected’) and uncued (‘unexpected’) conditions. The behavioral results revealed that accuracy was lower for fearful compared with neutral faces in the unexpected condition, while accuracy was similar for fearful and neutral faces in the expected condition. ERP data revealed increased amplitudes in the P2 component and 200–250 ms interval for unexpected fearful versus neutral faces. By contrast, ERP responses were similar for fearful and neutral faces in the expected condition. These findings indicate that human neural sensitivity to fearful faces is modulated by emotional expectations. Although the neural system is sensitive to unpredictable emotionally salient stimuli, sensitivity to salient stimuli is reduced when these stimuli are predictable.

  19. What's good for the goose is not good for the gander: Age and gender differences in scanning emotion faces.

    Science.gov (United States)

    Sullivan, Susan; Campbell, Anna; Hutton, Sam B; Ruffman, Ted

    2017-05-01

    Research indicates that older adults' (≥60 years) emotion recognition is worse than that of young adults, that young and older men's emotion recognition is worse than that of young and older women (respectively), and that older adults look more at mouths, relative to eyes, than young adults do. Nevertheless, previous research has not compared older men's and women's looking at emotion faces, so the present study had two aims: (a) to examine whether the tendency to look at mouths is stronger amongst older men than older women and (b) to examine whether men's mouth looking correlates with better emotion recognition. We examined the emotion recognition abilities and spontaneous gaze patterns of young (n = 60) and older (n = 58) males and females as they labelled emotion faces. Older men spontaneously looked more at mouths than older women did, and older men's looking at mouths correlated with their emotion recognition, whereas women's looking at eyes correlated with theirs. The findings are discussed in relation to a growing body of research suggesting both age and gender differences in response to emotional stimuli and the differential efficacy of mouth and eye looking for men and women. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. Toward physiological indices of emotional state driving future ebook interactivity

    NARCIS (Netherlands)

    Erp, J.B.F. van; Hogervorst, M.A.; Werf, Y.D. van der

    2016-01-01

    Ebooks of the future may respond to the emotional experience of the reader. (Neuro-) physiological measures could capture a reader's emotional state and use this to enhance the reading experience by adding matching sounds or to change the storyline therewith creating a hybrid art form in between

  2. Emotional state and its impact on voice authentication accuracy

    Science.gov (United States)

    Voznak, Miroslav; Partila, Pavol; Penhaker, Marek; Peterek, Tomas; Tomala, Karel; Rezac, Filip; Safarik, Jakub

    2013-05-01

    The paper deals with increasing the accuracy of voice authentication methods. The developed algorithm first extracts segmental parameters, such as the zero-crossing rate, the fundamental frequency and Mel-frequency cepstral coefficients, from the voice signal. Based on these parameters, a neural network classifier detects the speaker's emotional state. The parameters shape the distribution of neurons in Kohonen maps, forming clusters of neurons on the map that characterize a particular emotional state. Using regression analysis, we can calculate a function of the parameters for the individual emotional states. This relationship increases voice authentication accuracy and prevents unjustified rejections.
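    The pipeline this abstract describes (frame-level voice features clustered on a Kohonen map by emotional state) can be sketched minimally in NumPy. This is an illustrative stand-in, not the paper's implementation: the zero-crossing-rate feature, the toy self-organizing map, and the "calm"/"stressed" feature distributions are all invented for the example.

```python
import numpy as np

def zero_crossing_rate(frames):
    """Fraction of sign changes within each frame (rows of `frames`)."""
    signs = np.sign(frames)
    return np.mean(np.abs(np.diff(signs, axis=1)) > 0, axis=1)

class KohonenMap:
    """Minimal 2-D self-organizing (Kohonen) map for clustering feature vectors."""
    def __init__(self, rows=4, cols=4, dim=2, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = rng.normal(size=(rows * cols, dim))
        self.coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)

    def bmu(self, x):
        # index of the best-matching unit (closest weight vector)
        return int(np.argmin(np.linalg.norm(self.weights - x, axis=1)))

    def train(self, X, epochs=40, lr=0.5, sigma=1.5):
        for t in range(epochs):
            a = lr * (1.0 - t / epochs)            # decaying learning rate
            s = sigma * (1.0 - t / epochs) + 0.1   # decaying neighbourhood width
            for x in X:
                b = self.bmu(x)
                d = np.linalg.norm(self.coords - self.coords[b], axis=1)
                h = np.exp(-(d ** 2) / (2 * s ** 2))[:, None]
                self.weights += a * h * (x - self.weights)

# Hypothetical data: "calm" speech (low ZCR, low pitch) vs "stressed" (high both)
rng = np.random.default_rng(1)
calm = rng.normal([0.1, 120.0], [0.02, 10.0], size=(30, 2))
stressed = rng.normal([0.4, 240.0], [0.02, 10.0], size=(30, 2))
X = np.vstack([calm, stressed])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # z-score features before mapping
som = KohonenMap(dim=2)
som.train(X)
```

    After training, feature vectors from the two emotional states map to different best-matching units, which is the clustering structure the regression step in the paper would then exploit.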

  3. Emotional face recognition in adolescent suicide attempters and adolescents engaging in non-suicidal self-injury.

    Science.gov (United States)

    Seymour, Karen E; Jones, Richard N; Cushman, Grace K; Galvan, Thania; Puzia, Megan E; Kim, Kerri L; Spirito, Anthony; Dickstein, Daniel P

    2016-03-01

    Little is known about the bio-behavioral mechanisms underlying and differentiating suicide attempts from non-suicidal self-injury (NSSI) in adolescents. Adolescents who attempt suicide or engage in NSSI often report significant interpersonal and social difficulties. Emotional face recognition ability is a fundamental skill required for successful social interactions, and deficits in this ability may provide insight into the unique brain-behavior interactions underlying suicide attempts versus NSSI in adolescents. Therefore, we examined emotional face recognition ability among three mutually exclusive groups: (1) inpatient adolescents who attempted suicide (SA, n = 30); (2) inpatient adolescents engaged in NSSI (NSSI, n = 30); and (3) typically developing controls (TDC, n = 30) without psychiatric illness. Participants included adolescents aged 13-17 years, matched on age, gender and full-scale IQ. Emotional face recognition was evaluated using the Diagnostic Assessment of Nonverbal Accuracy (DANVA-2). Compared to TDC youth, adolescents with NSSI made more errors on child fearful and adult sad face recognition while controlling for psychopathology and medication status (ps < .05); there were no significant differences in emotional face recognition between the NSSI and SA groups. Secondary analyses showed that, compared to inpatients without major depression, those with major depression made fewer errors on adult sad face recognition even when controlling for group status (p < .05), and made more recognition errors on adult happy faces even when controlling for group status (p < .05). In sum, adolescents engaged in NSSI showed worse emotional face recognition than TDC, but not than inpatient adolescents who attempted suicide. Further results suggest the importance of psychopathology in emotional face recognition. Replication of these preliminary results and examination of the role of context-dependent emotional processing are needed moving forward.

  4. Impaired Integration of Emotional Faces and Affective Body Context in a Rare Case of Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Bentin, Shlomo

    2011-01-01

    In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG's agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally-congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally-incongruent bodies. The results are interpreted in the context of a general integration deficit in developmental visual agnosia, suggesting that impaired integration may extend from the level of the face to the level of the full person. PMID:21482423

  5. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    Directory of Open Access Journals (Sweden)

    Letizia Palumbo

    Full Text Available Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  6. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    Science.gov (United States)

    Palumbo, Letizia; Jellema, Tjeerd

    2013-01-01

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  7. The NMDA antagonist ketamine and the 5-HT agonist psilocybin produce dissociable effects on structural encoding of emotional face expressions.

    Science.gov (United States)

    Schmidt, André; Kometer, Michael; Bachmann, Rosilla; Seifritz, Erich; Vollenweider, Franz

    2013-01-01

    Both glutamate and serotonin (5-HT) play a key role in the pathophysiology of emotional biases. Recent studies indicate that the glutamate N-methyl-D-aspartate (NMDA) receptor antagonist ketamine and the 5-HT receptor agonist psilocybin are implicated in emotion processing. However, as yet, no study has systematically compared their contribution to emotional biases. This study used event-related potentials (ERPs) and signal detection theory to compare the effects of the NMDA (via S-ketamine) and 5-HT (via psilocybin) receptor systems on non-conscious or conscious emotional face processing biases. S-ketamine or psilocybin was administered to two groups of healthy subjects in a double-blind, within-subject, placebo-controlled design. We behaviorally assessed objective thresholds for non-conscious discrimination in all drug conditions. Electrophysiological responses to fearful, happy, and neutral faces were subsequently recorded with the face-specific P100 and N170 ERPs. Both S-ketamine and psilocybin impaired the encoding of fearful faces as expressed by a reduced N170 over parieto-occipital brain regions. In contrast, while S-ketamine also impaired the encoding of happy facial expressions, psilocybin had no effect on the N170 in response to happy faces. This study demonstrates that the NMDA and 5-HT receptor systems differentially contribute to the structural encoding of emotional face expressions as expressed by the N170. These findings suggest that the assessment of early visual evoked responses might allow detecting pharmacologically induced changes in emotional processing biases and thus provides a framework to study the pathophysiology of dysfunctional emotional biases.

  8. Abelian faces of state spaces of C*-algebras

    International Nuclear Information System (INIS)

    Batty, C.J.K.

    1980-01-01

    Let F be a closed face of the weak* compact convex state space of a unital C*-algebra A. The class of F-abelian states, introduced earlier by the author, is studied further. It is shown (without any restriction on A or F) that F is a Choquet simplex if and only if every state in F is F-abelian, and that it is sufficient for this that every pure state in F is F-abelian. As a corollary, it is deduced that an arbitrary C*-dynamical system (A,G,α) is G-abelian if and only if every ergodic state is weakly clustering. Nevertheless the set of all F-abelian (or even G-abelian) states is not necessarily weak* compact. (orig.)

  9. Cortical deficits of emotional face processing in adults with ADHD: its relation to social cognition and executive function.

    Science.gov (United States)

    Ibáñez, Agustin; Petroni, Agustin; Urquina, Hugo; Torrente, Fernando; Torralva, Teresa; Hurtado, Esteban; Guex, Raphael; Blenkmann, Alejandro; Beltrachini, Leandro; Muravchik, Carlos; Baez, Sandra; Cetkovich, Marcelo; Sigman, Mariano; Lischinsky, Alicia; Manes, Facundo

    2011-01-01

    Although it has been shown that adults with attention-deficit hyperactivity disorder (ADHD) have impaired social cognition, no previous study has reported the brain correlates of face valence processing. This study looked for behavioral, neuropsychological, and electrophysiological markers of emotion processing for faces (N170) in adult ADHD compared to controls matched by age, gender, educational level, and handedness. We designed an event-related potential (ERP) study based on a dual valence task (DVT), in which faces and words were presented to test the effects of stimulus type (faces, words, or face-word stimuli) and valence (positive versus negative). Individual signatures of cognitive functioning in participants with ADHD and controls were assessed with a comprehensive neuropsychological evaluation, including executive functioning (EF) and theory of mind (ToM). Compared to controls, the adult ADHD group showed deficits in N170 emotion modulation for facial stimuli. These N170 impairments were observed in the absence of any deficit in facial structural processing, suggesting a specific ADHD impairment in early facial emotion modulation. The cortical current density mapping of N170 yielded a main neural source of N170 at posterior section of fusiform gyrus (maximum at left hemisphere for words and right hemisphere for faces and simultaneous stimuli). Neural generators of N170 (fusiform gyrus) were reduced in ADHD. In those patients, N170 emotion processing was associated with performance on an emotional inference ToM task, and N170 from simultaneous stimuli was associated with EF, especially working memory. This is the first report to reveal an adult ADHD-specific impairment in the cortical modulation of emotion for faces and an association between N170 cortical measures and ToM and EF.

  10. Emotional face recognition deficits and medication effects in pre-manifest through stage-II Huntington's disease.

    Science.gov (United States)

    Labuschagne, Izelle; Jones, Rebecca; Callaghan, Jenny; Whitehead, Daisy; Dumas, Eve M; Say, Miranda J; Hart, Ellen P; Justo, Damian; Coleman, Allison; Dar Santos, Rachelle C; Frost, Chris; Craufurd, David; Tabrizi, Sarah J; Stout, Julie C

    2013-05-15

    Facial emotion recognition impairments have been reported in Huntington's disease (HD). However, the nature of the impairments across the spectrum of HD remains unclear. We report on emotion recognition data from 344 participants comprising premanifest HD (PreHD) and early HD patients, and controls. In a test of recognition of facial emotions, we examined responses to six basic emotional expressions and neutral expressions. In addition, and within the early HD sample, we tested for differences on emotion recognition performance between those 'on' vs. 'off' neuroleptic or selective serotonin reuptake inhibitor (SSRI) medications. The PreHD groups showed significantly impaired recognition, compared to controls, of fearful, angry and surprised faces; whereas the early HD groups were significantly impaired across all emotions including neutral expressions. In early HD, neuroleptic use was associated with worse facial emotion recognition, whereas SSRI use was associated with better facial emotion recognition. The findings suggest that emotion recognition impairments exist across the HD spectrum, but are relatively more widespread in manifest HD than in the premanifest period. Commonly prescribed medications to treat HD-related symptoms also appear to affect emotion recognition. These findings have important implications for interpersonal communication and medication usage in HD. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  11. State and Employer: CERN faced with its responsibilities (cont.)

    CERN Multimedia

    Association du personnel

    2009-01-01

    The next Staff Council will be faced with a difficult situation in the area of social security and will have to put pressure on the governing bodies to fulfil the following statement: “The Organization must assume its responsibilities without reservation in both of its roles: as a State for our social security system (pensions, health insurance) and as an Employer (salaries, careers, contracts, etc.).”

  12. A Rapid Subcortical Amygdala Route for Faces Irrespective of Spatial Frequency and Emotion.

    Science.gov (United States)

    McFadyen, Jessica; Mermillod, Martial; Mattingley, Jason B; Halász, Veronika; Garrido, Marta I

    2017-04-05

    There is significant controversy over the existence and function of a direct subcortical visual pathway to the amygdala. It is thought that this pathway rapidly transmits low spatial frequency information to the amygdala independently of the cortex, and yet the directionality of this function has never been determined. We used magnetoencephalography to measure neural activity while human participants discriminated the gender of neutral and fearful faces filtered for low or high spatial frequencies. We applied dynamic causal modeling to demonstrate that the most likely underlying neural network consisted of a pulvinar-amygdala connection that was uninfluenced by spatial frequency or emotion, and a cortical-amygdala connection that conveyed high spatial frequencies. Crucially, data-driven neural simulations revealed a clear temporal advantage of the subcortical connection over the cortical connection in influencing amygdala activity. Thus, our findings support the existence of a rapid subcortical pathway that is nonselective in terms of the spatial frequency or emotional content of faces. We propose that the "coarseness" of the subcortical route may be better reframed as "generalized." SIGNIFICANCE STATEMENT The human amygdala coordinates how we respond to biologically relevant stimuli, such as threat or reward. It has been postulated that the amygdala first receives visual input via a rapid subcortical route that conveys "coarse" information, namely, low spatial frequencies. For the first time, the present paper provides direction-specific evidence from computational modeling that the subcortical route plays a generalized role in visual processing by rapidly transmitting raw, unfiltered information directly to the amygdala. This calls into question a widely held assumption across human and animal research that fear responses are produced faster by low spatial frequencies.
Our proposed mechanism suggests organisms quickly generate fear responses to a wide range

  13. Do the emotional states of pregnant women affect neonatal behaviour?

    Science.gov (United States)

    Hernández-Martínez, Carmen; Arija, Victoria; Balaguer, Albert; Cavallé, Pere; Canals, Josefa

    2008-11-01

    The emotional states of pregnant women affect the course of their pregnancies, their deliveries and the behaviour and development of their infants. The aim of this study is to analyse the influence of positive and negative maternal emotional states on neonatal behaviour at 2-3 days after birth. A sample of 163 healthy full-term newborns was evaluated using the Neonatal Behavioral Assessment Scale. Maternal anxiety, perceived stress, and emotional stability during pregnancy were evaluated in the immediate postpartum period with the State Trait Anxiety Inventory and the Perceived Stress Scale. Moderate levels of anxiety during pregnancy alter infant orientation and self-regulation. These aspects of infant behaviour could lead to later attachment, behavioural and developmental problems. Maternal emotional stability during pregnancy improves infant self-regulation and several aspects of infant behaviour that may predispose them to better interactions with their parents.

  14. Automatic Emotional State Detection using Facial Expression Dynamic in Videos

    Directory of Open Access Journals (Sweden)

    Hongying Meng

    2014-11-01

    Full Text Available In this paper, an automatic emotion detection system is built for a computer or machine to detect the emotional state from facial expressions in human-computer communication. Firstly, dynamic motion features are extracted from facial expression videos and then advanced machine learning methods for classification and regression are used to predict the emotional states. The system is evaluated on two publicly available datasets, i.e. GEMEP_FERA and AVEC2013, and satisfactory performance is achieved in comparison with the provided baseline results. With this emotional state detection capability, a machine can read the facial expression of its user automatically. This technique can be integrated into applications such as smart robots, interactive games and smart surveillance systems.
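    A minimal sketch of the kind of pipeline described above: dynamic motion features extracted from a frame sequence, followed by a classifier that predicts the emotional state. The frame-difference statistics, the nearest-centroid classifier, and the synthetic "neutral"/"surprised" videos are illustrative stand-ins invented for the example, not the paper's actual features, models, or data.

```python
import numpy as np

def motion_features(video):
    """video: (T, H, W) array of grayscale frames -> simple dynamic-motion descriptor."""
    diffs = np.abs(np.diff(video.astype(float), axis=0))  # frame-to-frame change
    return np.array([diffs.mean(), diffs.std(), diffs.max()])

class NearestCentroid:
    """Toy stand-in for the paper's (unspecified) classification stage."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # distance of each sample to each class centroid; pick the nearest
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[np.argmin(d, axis=1)]

rng = np.random.default_rng(0)

def synth_video(motion):
    # 10 frames of 8x8 noise whose frame-to-frame variation scales with `motion`
    base = rng.normal(size=(8, 8))
    return np.stack([base + motion * rng.normal(size=(8, 8)) for _ in range(10)])

videos = [synth_video(0.1) for _ in range(10)] + [synth_video(2.0) for _ in range(10)]
X = np.array([motion_features(v) for v in videos])
y = np.array(["neutral"] * 10 + ["surprised"] * 10)
clf = NearestCentroid().fit(X, y)
```

    The point of the sketch is the division of labour: a fixed feature extractor summarizing motion dynamics, and a learned mapping from those features to emotion labels; a real system would replace both stages with the richer features and models the paper evaluates.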

  15. Characterization and recognition of mixed emotional expressions in thermal face image

    Science.gov (United States)

    Saha, Priya; Bhattacharjee, Debotosh; De, Barin K.; Nasipuri, Mita

    2016-05-01

    Facial expressions in infrared imaging have been introduced to solve the problem of illumination, which is an integral constituent of visual imagery. The paper investigates facial skin temperature distribution on mixed thermal facial expressions in our created face database, of which six are basic expressions and the remaining 12 are mixtures of those basic expressions. Temperature analysis has been performed on three facial regions of interest (ROIs): periorbital, supraorbital and mouth. Temperature variability of the ROIs in different expressions has been measured using statistical parameters. The temperature variation measurements in the ROIs of a particular expression form a vector, which is later used in recognition of mixed facial expressions. Investigations show that facial features in mixed facial expressions can be characterized by positive-emotion-induced and negative-emotion-induced facial features. The supraorbital region is a useful facial region that can differentiate basic expressions from mixed expressions. Analysis and interpretation of mixed expressions have been conducted with the help of box-and-whisker plots. A facial region containing a mixture of two expressions is generally less temperature-inducing than the corresponding facial region in a basic expression.
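    The per-ROI statistical characterization described above (statistics of the temperature distribution in each region, concatenated into a recognition vector) can be sketched as follows. The ROI coordinates, image size, and the particular choice of statistics here are hypothetical, made up for the example rather than taken from the paper.

```python
import numpy as np

# Hypothetical ROI boxes as (row_start, row_end, col_start, col_end) slices into a
# thermal image; a real system would localize these regions per detected face.
ROIS = {
    "periorbital": (10, 20, 5, 35),
    "supraorbital": (0, 8, 5, 35),
    "mouth": (30, 40, 12, 28),
}

def roi_feature_vector(thermal, rois=ROIS):
    """Summarize each ROI's temperature distribution into 4 statistics."""
    feats = []
    for name in sorted(rois):  # fixed order so vectors are comparable
        r0, r1, c0, c1 = rois[name]
        patch = thermal[r0:r1, c0:c1]
        feats.extend([patch.mean(), patch.std(), np.median(patch), patch.max() - patch.min()])
    return np.array(feats)

rng = np.random.default_rng(0)
face = 34.0 + 0.5 * rng.standard_normal((40, 40))  # synthetic 40x40 temperature map (°C)
vec = roi_feature_vector(face)                      # 3 ROIs x 4 statistics = 12 values
```

    Each expression then corresponds to one such vector, and recognition reduces to comparing vectors; the spread statistics per ROI are also exactly what a box-and-whisker plot visualizes.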

  16. Emotional state talk and emotion understanding: a training study with preschool children.

    Science.gov (United States)

    Gavazzi, Ilaria Grazzani; Ornaghi, Veronica

    2011-11-01

    The present study investigates whether training preschool children in the active use of emotional state talk plays a significant role in bringing about greater understanding of emotion terms and improved emotion comprehension. Participants were 100 preschool children (M = 52 months; SD = 9.9; range: 35-70 months), randomly assigned to experimental or control conditions. They were pre- and post-tested to assess their language comprehension, metacognitive language comprehension and emotion understanding. Analyses of pre-test data did not show any significant differences between experimental and control groups. During the intervention phase, the children were read stories enriched with emotional lexicon. After listening to the stories, children in the experimental group took part in conversational language games designed to stimulate use of the selected emotional terms. In contrast, the control group children did not take part in any special linguistic activities after the story readings. Analyses revealed that the experimental group outperformed the control group in the understanding of inner state language and in the comprehension of emotion.

  17. Faces

    DEFF Research Database (Denmark)

    Mortensen, Kristine Køhler; Brotherton, Chloe

    2018-01-01

    for the face to be put into action. Based on an ethnographic study of Danish teenagers’ use of SnapChat we demonstrate how the face is used as a central medium for interaction with peers. Through the analysis of visual SnapChat messages we investigate how SnapChat requires the sender to put an ‘ugly’ face...... already secured their popular status on the heterosexual marketplace in the broad context of the school. Thus SnapChat functions both as a challenge to beauty norms of ‘flawless faces’ and as a reinscription of these same norms by further manifesting the exclusive status of the popular girl...

  18. Association of Irritability and Anxiety With the Neural Mechanisms of Implicit Face Emotion Processing in Youths With Psychopathology.

    Science.gov (United States)

    Stoddard, Joel; Tseng, Wan-Ling; Kim, Pilyoung; Chen, Gang; Yi, Jennifer; Donahue, Laura; Brotman, Melissa A; Towbin, Kenneth E; Pine, Daniel S; Leibenluft, Ellen

    2017-01-01

    Psychiatric comorbidity complicates clinical care and confounds efforts to elucidate the pathophysiology of commonly occurring symptoms in youths. To our knowledge, few studies have simultaneously assessed the effect of 2 continuously distributed traits on brain-behavior relationships in children with psychopathology. To determine shared and unique effects of 2 major dimensions of child psychopathology, irritability and anxiety, on neural responses to facial emotions during functional magnetic resonance imaging. Cross-sectional functional magnetic resonance imaging study in a large, well-characterized clinical sample at a research clinic at the National Institute of Mental Health. The referred sample included youths aged 8 to 17 years: 93 youths with anxiety, disruptive mood dysregulation, and/or attention-deficit/hyperactivity disorders, and 22 healthy youths. The child's irritability and anxiety were rated by both parent and child on the Affective Reactivity Index and the Screen for Child Anxiety Related Disorders, respectively. Using functional magnetic resonance imaging, neural response was measured across the brain during gender labeling of varying intensities of angry, happy, or fearful face emotions. In mixed-effects analyses, the shared and unique effects of irritability and anxiety were tested on amygdala functional connectivity and activation to face emotions. The mean (SD) age of participants was 13.2 (2.6) years; of the 115 included, 64 were male. Irritability and/or anxiety influenced amygdala connectivity to the prefrontal and temporal cortex. Specifically, irritability and anxiety jointly influenced left amygdala to left medial prefrontal cortex connectivity during face emotion viewing (F4,888 = 9.20; P < .001), and were associated with differences in neural response to face emotions in several areas (F2,888 ≥ 13.45; all P < .001), suggesting emotion dysregulation when very anxious and irritable youth process threat-related faces. Activation in the ventral visual circuitry suggests a mechanism

  19. The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects

    Science.gov (United States)

    Barber, Anjuli L. A.; Randi, Dania; Müller, Corsin A.; Huber, Ludwig

    2016-01-01

    Of all non-human animals, dogs are very likely the best decoders of human behavior. In addition to a high sensitivity to human attentive status and to ostensive cues, they are able to distinguish between individual human faces and even between human facial expressions. However, so far little is known about how they process human faces and to what extent this is influenced by experience. Here we present an eye-tracking study with dogs emanating from two different living environments and varying experience with humans: pet and lab dogs. The dogs were shown pictures of familiar and unfamiliar human faces expressing four different emotions. The results, extracted from several different eye-tracking measurements, revealed pronounced differences in the face processing of pet and lab dogs, thus indicating an influence of the amount of exposure to humans. In addition, there was some evidence for the influences of both the familiarity and the emotional expression of the face, and strong evidence for a left gaze bias. These findings, together with recent evidence for the dog's ability to discriminate human facial expressions, indicate that dogs are sensitive to some emotions expressed in human faces. PMID:27074009

  20. Emotional state and local versus global spatial memory.

    Science.gov (United States)

    Brunyé, Tad T; Mahoney, Caroline R; Augustyn, Jason S; Taylor, Holly A

    2009-02-01

    The present work investigated the effects of participant emotional state on global versus local memory for map-based information. Participants were placed into one of four emotion induction groups, crossing high and low arousal with positive and negative valence, or a control group. They then studied a university campus map and completed two memory tests, free recall and spatial statement verification. Converging evidence from these two tasks demonstrated that arousal amplifies symbolic distance effects and leads to a globally-focused spatial mental representation, partially at the expense of local knowledge. These results were found for both positively- and negatively-valenced affective states. The present study is the first investigation of emotional effects on spatial memory, and has implications for theories of emotion and spatial cognition.
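
    The symbolic distance effect mentioned above can be made concrete: in spatial statement verification, response time typically falls as the map distance between two landmarks grows, and "amplification" under arousal means a steeper fall. The sketch below is purely illustrative, with invented response times rather than data from the study.

```python
# Illustrative sketch (not the authors' analysis): estimating a symbolic
# distance effect as the least-squares slope of response time on distance.
def distance_effect_slope(distances, rts):
    """Slope of RT (ms) on symbolic distance; more negative = stronger effect."""
    n = len(distances)
    mean_d = sum(distances) / n
    mean_rt = sum(rts) / n
    cov = sum((d - mean_d) * (rt - mean_rt) for d, rt in zip(distances, rts))
    var = sum((d - mean_d) ** 2 for d in distances)
    return cov / var

# Hypothetical data: verification speeds up with pair distance, and the
# speed-up is steeper in the (hypothetical) aroused group.
slopes = {
    "control": distance_effect_slope([1, 2, 3, 4], [950, 930, 910, 890]),
    "aroused": distance_effect_slope([1, 2, 3, 4], [980, 930, 880, 830]),
}
```

    A more negative slope in the aroused group would correspond to the amplified distance effect the abstract reports.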

  1. Quality of life and emotional state in chronic skin disease.

    Science.gov (United States)

    Pärna, Ene; Aluoja, Anu; Kingo, Külli

    2015-03-01

    The aim of this study was to evaluate the associations between chronic inflammatory skin conditions and patients' emotional state and quality of life. The following self-rated questionnaires were used: Emotional State Questionnaire, a self-report scale assessing depression and anxiety symptoms; Dermatology Life Quality Index (DLQI); and RAND-36, a measure of health-related quality of life. The study group comprised 40 patients with psoriasis, 40 with eczema, 40 with acne, 15 with seborrhoeic dermatitis and 40 healthy controls. Patients with chronic skin diseases had lower DLQI and lower RAND-36 physical functioning scores, more perceived physical limitations and pain, and lower emotional well-being and general health ratings compared with the control group. In conclusion, chronic skin diseases are associated with symptoms of emotional distress, in particular insomnia and general anxiety.

  2. Working memory training improves emotional states of healthy individuals

    Directory of Open Access Journals (Sweden)

    Hikaru eTakeuchi

    2014-10-01

    Working memory (WM) capacity is associated with various emotional aspects, including states of depression and stress, reactions to emotional stimuli, and regulatory behaviors. We have previously investigated the effects of WM training (WMT) on cognitive functions and brain structures. However, the effects of WMT on emotional states and related neural mechanisms among healthy young adults remain unknown. In the present study, we investigated these effects in young adults who underwent WMT or received no intervention for 4 weeks. Before and after the intervention, subjects completed self-report questionnaires related to their emotional states and underwent scanning sessions in which brain activities related to negative emotions were measured. Compared with controls, subjects who underwent WMT showed reduced anger, fatigue, and depression. Furthermore, WMT reduced activity in the left posterior insula during tasks evoking negative emotion, which was related to anger. It also reduced activity in the left frontoparietal area. These findings show that WMT can reduce negative mood and provide new insight into the clinical applications of WMT, at least among subjects with preclinical-level conditions.

  3. Amygdala activation and its functional connectivity during perception of emotional faces in social phobia and panic disorder

    NARCIS (Netherlands)

    Demenescu, L.R.; Kortekaas, R.; Cremers, H.R.; Renken, R.J.; van Tol, M.J.; van der Wee, M.J.A.; Veltman, D.J.; den Boer, J.A.; Roelofs, K.; Aleman, A.

    Social phobia (SP) and panic disorder (PD) have been associated with aberrant amygdala responses to threat-related stimuli. The aim of the present study was to examine amygdala function and its connectivity with medial prefrontal cortex (mPFC) during emotional face perception in PD and SP, and the

  4. Association of Maternal Interaction with Emotional Regulation in 4 and 9 Month Infants During the Still Face Paradigm

    Science.gov (United States)

    Lowe, Jean R.; MacLean, Peggy C.; Duncan, Andrea F.; Aragón, Crystal; Schrader, Ronald M.; Caprihan, Arvind; Phillips, John P.

    2013-01-01

    This study used the Still Face Paradigm to investigate the relationship of maternal interaction on infants’ emotion regulation responses. Seventy infant-mother dyads were seen at 4 months and 25 of these same dyads were re-evaluated at 9 months. Maternal interactions were coded for attention seeking and contingent responding. Emotional regulation was described by infant stress reaction and overall positive affect. Results indicated that at both 4 and 9 months mothers who used more contingent responding interactions had infants who showed more positive affect. In contrast, mothers who used more attention seeking play had infants who showed less positive affect after the Still Face Paradigm. Patterns of stress reaction were reversed, as mothers who used more attention seeking play had infants with less negative affect. Implications for intervention and emotional regulation patterns over time are discussed. PMID:22217393

  5. Influence of spatial frequency and emotion expression on face processing in patients with panic disorder.

    Science.gov (United States)

    Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan

    2016-06-01

    Deficits in facial emotion processing are a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad (BSF), high (HSF), and low (LSF) spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at the three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to low spatial frequency compared to high spatial frequency, whereas healthy controls demonstrated significant broad spatial frequency dependent processing in P100 amplitude. Vertex positive potential amplitude was significantly increased in high and broad spatial frequency, compared to low spatial frequency, in panic disorder. Early posterior negativity amplitude was significantly different between HSF and BSF, and between LSF and BSF, processing in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information. However, in later processing, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that unique spatial frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder. Copyright © 2016 Elsevier B.V. All rights reserved.
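
    As background for how component amplitudes such as the P100 are typically quantified, here is a minimal sketch: the mean voltage of an epoch within a fixed post-stimulus window. The sampling rate, window bounds and toy epoch are assumptions for illustration, not parameters taken from the study.

```python
# Hedged sketch of mean-amplitude ERP scoring: average the voltage in a
# post-stimulus time window of a single stimulus-locked epoch.
def mean_amplitude(epoch_uv, srate_hz, t_start_s, t_end_s):
    """Mean voltage (µV) between t_start and t_end; epoch starts at t = 0."""
    i0 = int(t_start_s * srate_hz)
    i1 = int(t_end_s * srate_hz)
    window = epoch_uv[i0:i1]
    return sum(window) / len(window)

# Toy epoch at an assumed 1000 Hz: a flat baseline with a 5 µV "P100"
# bump between 80 and 130 ms after stimulus onset.
epoch = [0.0] * 80 + [5.0] * 50 + [0.0] * 70
p100 = mean_amplitude(epoch, 1000, 0.080, 0.130)
```

    In practice the window is chosen per component (and often per study) from the grand-average waveform, then applied identically to all conditions.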

  6. Psychopathic traits are associated with reduced attention to the eyes of emotional faces among adult male non-offenders

    Directory of Open Access Journals (Sweden)

    Steven Mark Gillespie

    2015-10-01

    Psychopathic traits are linked with impairments in emotional facial expression recognition. These impairments may, in part, reflect reduced attention to the eyes of emotional faces. Although reduced attention to the eyes has been noted among children with conduct problems and callous-unemotional traits, similar findings are yet to be found in relation to psychopathic traits among adult male participants. Here we investigated the relationship of primary (selfish, uncaring) and secondary (impulsive, antisocial) psychopathic traits with attention to the eyes among adult male non-offenders during an emotion recognition task. We measured the number of fixations, and overall dwell time, on the eyes and the mouth of male and female faces showing the six basic emotions at varying levels of intensity. We found no relationship of primary or secondary psychopathic traits with recognition accuracy. However, primary psychopathic traits were associated with a reduced number of fixations, and lower overall dwell time, on the eyes relative to the mouth across expressions, intensity, and sex. Furthermore, the relationship of primary psychopathic traits with attention to the eyes of angry and fearful faces was influenced by the sex and intensity of the expression. We also showed that a greater number of fixations on the eyes, relative to the mouth, was associated with increased accuracy for angry and fearful expression recognition. These results are the first to show effects of psychopathic traits on attention to the eyes of emotional faces in an adult male sample, and may support amygdala-based accounts of psychopathy. These findings may also have methodological implications for clinical studies of emotion recognition.
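
    The fixation and dwell-time measures described here come down to scoring fixations against areas of interest (AOIs) for the eyes and mouth. The following is an illustrative sketch only; the AOI coordinates and fixation records are invented, not taken from the study's pipeline.

```python
# Illustrative AOI scoring for eye-tracking data: count fixations and sum
# dwell time inside rectangular eye/mouth regions of a face image.
def aoi_stats(fixations, aoi):
    """fixations: (x, y, duration_ms) tuples; aoi: (x0, y0, x1, y1) rectangle."""
    x0, y0, x1, y1 = aoi
    hits = [d for x, y, d in fixations if x0 <= x <= x1 and y0 <= y <= y1]
    return len(hits), sum(hits)

# Hypothetical AOIs (pixel coordinates) and three fixations on one trial.
eyes_aoi = (100, 50, 300, 120)
mouth_aoi = (150, 200, 250, 260)
fixations = [(150, 80, 220), (280, 100, 180), (200, 230, 400)]

eye_count, eye_dwell = aoi_stats(fixations, eyes_aoi)
mouth_count, mouth_dwell = aoi_stats(fixations, mouth_aoi)
# Relative eye attention, the kind of eyes-vs-mouth index the study compares:
eye_ratio = eye_dwell / (eye_dwell + mouth_dwell)
```

    Lower values of such an eyes-relative-to-mouth index across trials would correspond to the reduced eye attention associated with primary psychopathic traits.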

  7. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents

    Directory of Open Access Journals (Sweden)

    Bianca G. van den Bulk

    2016-10-01

    Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N = 25), adolescents with CSA-related PTSD (N = 19) and healthy controls (N = 26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD groups reported more anxiety to fearful and neutral faces than adolescents from the control group, and adolescents from the CSA-related PTSD group reacted slower compared to the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala.

  8. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents.

    Science.gov (United States)

    van den Bulk, Bianca G; Somerville, Leah H; van Hoof, Marie-José; van Lang, Natasja D J; van der Wee, Nic J A; Crone, Eveline A; Vermeiren, Robert R J M

    2016-10-01

    Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N=25), adolescents with CSA-related PTSD (N=19) and healthy controls (N=26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD groups reported more anxiety to fearful and neutral faces than adolescents from the control group, and adolescents from the CSA-related PTSD group reacted slower compared to the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
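
    "Habituation" in analyses like this is often quantified as the trend of the response across successive presentations: a more negative slope means faster habituation, and a high first-trial value captures elevated initial activity. The sketch below is an assumption-laden illustration with invented trial-wise signal values, not the authors' model.

```python
# Hedged sketch: habituation as the least-squares slope of response
# amplitude over trial number (more negative slope = faster habituation).
def habituation_slope(responses):
    n = len(responses)
    xs = range(n)
    mx = (n - 1) / 2
    my = sum(responses) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, responses))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical trial-wise amygdala signal: high initial response with a
# rapid decline (CSA-related-PTSD-like) vs. a flatter course
# (internalizing-like).
ptsd_like = habituation_slope([1.6, 1.0, 0.6, 0.4])
internalizing_like = habituation_slope([0.9, 0.85, 0.8, 0.8])
```

    Comparing such per-subject slopes between groups is one simple way the "rapid habituation" contrast described above could be operationalized.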

  9. Do bodily expressions compete with facial expressions? Time course of integration of emotional signals from the face and the body.

    Science.gov (United States)

    Gu, Yuanyuan; Mai, Xiaoqin; Luo, Yue-jia

    2013-01-01

    The decoding of social signals from nonverbal cues plays a vital role in the social interactions of socially gregarious animals such as humans. Because nonverbal emotional signals from the face and body are normally seen together, it is important to investigate the mechanism underlying the integration of emotional signals from these two sources. We conducted a study in which the time course of the integration of facial and bodily expressions was examined via analysis of event-related potentials (ERPs) while the focus of attention was manipulated. Distinctive integrating features were found during multiple stages of processing. In the first stage, threatening information from the body was extracted automatically and rapidly, as evidenced by enhanced P1 amplitudes when the subjects viewed compound face-body images with fearful bodies compared with happy bodies. In the second stage, incongruency between emotional information from the face and the body was detected and captured by N2. Incongruent compound images elicited larger N2s than did congruent compound images. The focus of attention modulated the third stage of integration. When the subjects' attention was focused on the face, images with congruent emotional signals elicited larger P3s than did images with incongruent signals, suggesting more sustained attention and elaboration of congruent emotional information extracted from the face and body. On the other hand, when the subjects' attention was focused on the body, images with fearful bodies elicited larger P3s than did images with happy bodies, indicating more sustained attention and elaboration of threatening information from the body during evaluative processes.

  10. Facial Expression Generation from Speaker's Emotional States in Daily Conversation

    Science.gov (United States)

    Mori, Hiroki; Ohshima, Koh

    A framework for generating facial expressions from emotional states in daily conversation is described. It provides a mapping between emotional states and facial expressions, where the former are represented by vectors with psychologically-defined abstract dimensions, and the latter are coded by the Facial Action Coding System. In order to obtain the mapping, parallel data with rated emotional states and facial expressions were collected for utterances of a female speaker, and a neural network was trained with the data. The effectiveness of the proposed method was verified by a subjective evaluation test. As a result, the Mean Opinion Score with respect to the suitability of the generated facial expressions was 3.86 for the speaker, which was close to that of hand-made facial expressions.
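
    The mapping described above can be pictured as a small feed-forward network from an emotion-dimension vector to Facial Action Coding System (FACS) action-unit intensities. The sketch below uses invented fixed weights and a two-unit output purely to show the shape of the computation; the paper instead trains the weights on rated conversational data, and the actual dimensions and action units are not specified here.

```python
# Minimal forward pass of a hypothetical 2-input -> 3-hidden -> 2-output
# network mapping an emotional state to FACS action-unit intensities.
import math

def forward(state, w_hidden, w_out):
    hidden = [math.tanh(sum(w * s for w, s in zip(row, state))) for row in w_hidden]
    return [sum(w * h for w, h in zip(row, hidden)) for row in w_out]

# Invented weights: inputs are (valence, arousal); outputs stand in for
# AU12 ("lip corner puller") and AU4 ("brow lowerer").
w_hidden = [[1.0, 0.5], [-0.5, 1.0], [0.8, -0.2]]
w_out = [[0.9, 0.1, 0.3],   # AU12: responds mainly to positive valence
         [-0.7, 0.2, 0.1]]  # AU4: responds mainly to negative valence
au12, au4 = forward([0.8, 0.4], w_hidden, w_out)  # pleasant, mildly aroused state
```

    With a pleasant input state, this toy network drives the smile-related unit up and the frown-related unit down, which is the qualitative behavior a trained emotion-to-FACS mapping would be expected to show.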

  11. Emotional eating and Pavlovian learning: evidence for conditioned appetitive responding to negative emotional states.

    Science.gov (United States)

    Bongers, Peggy; Jansen, Anita

    2017-02-01

    Appetitive learning has been demonstrated several times using neutral cues or contexts as a predictor of food intake and it has been shown that humans easily learn cued desires for foods. It has, however, never been studied whether internal cues are also capable of appetitive conditioning. In this study, we tested whether humans can learn cued eating desires to negative moods as conditioned stimuli (CS), thereby offering a potential explanation of emotional eating (EE). Female participants were randomly presented with 10 different stimuli eliciting either negative or neutral emotional states, with one of these states paired with eating chocolate. Expectancy to eat, desire to eat, salivation, and unpleasantness of experiencing negative emotions were assessed. After conditioning, participants were brought into a negative emotional state and were asked to choose between money and chocolate. Data showed differential conditioned responding on the expectancy and desire measures, but not on salivation. Specific conditioned effects were obtained for participants with a higher BMI (body mass index) on the choice task, and for participants high on EE on the unpleasantness ratings. These findings provide the first experimental evidence for the idea that negative emotions can act as conditioned stimuli, and might suggest that classical conditioning is involved in EE.

  12. Face and emotion recognition deficits in Turner syndrome: a possible role for X-linked genes in amygdala development.

    Science.gov (United States)

    Lawrence, Kate; Kuntsi, Jonna; Coleman, Michael; Campbell, Ruth; Skuse, David

    2003-01-01

    Face recognition is thought to rely on configural visual processing. Where face recognition impairments have been identified, qualitatively delayed or anomalous configural processing has also been found. A group of women with Turner syndrome (TS) with monosomy for a single maternal X chromosome (45, Xm) showed an impairment in face recognition skills compared with normally developing women. However, normal configural face-processing abilities were apparent. The ability to recognize facial expressions of emotion, particularly fear, was also impaired in this TS subgroup. Face recognition and fear recognition accuracy were significantly correlated in the female control group but not in women with TS. The authors therefore suggest that anomalies in amygdala function may be a neurological feature of TS of this karyotype.

  13. Effects of music interventions on emotional States and running performance.

    Science.gov (United States)

    Lane, Andrew M; Davis, Paul A; Devonport, Tracey J

    2011-01-01

    The present study compared the effects of two different music interventions on changes in emotional states before and during running, and also explored effects of music interventions upon performance outcome. Volunteer participants (n = 65) who regularly listened to music when running registered online to participate in a three-stage study. Participants attempted to attain a personally important running goal to establish baseline performance. Thereafter, participants were randomly assigned to either a self-selected music group or an Audiofuel music group. Audiofuel produce pieces of music designed to assist synchronous running. The self-selected music group followed guidelines for selecting motivating playlists. In both experimental groups, participants used the Brunel Music Rating Inventory-2 (BMRI-2) to facilitate selection of motivational music. Participants again completed the BMRI-2 post-intervention to assess the motivational qualities of Audiofuel music or the music they selected for use during the study. Results revealed no significant differences between self-selected music and Audiofuel music on all variables analyzed. Participants in both music groups reported increased pleasant emotions and decreased unpleasant emotions following intervention. Significant performance improvements were demonstrated post-intervention, with participants reporting a belief that emotional states related to performance. Further analysis indicated that enhanced performance was significantly greater among participants reporting music to be motivational, as indicated by high scores on the BMRI-2. Findings suggest that both individual athletes and practitioners should consider using the BMRI-2 when selecting music for running. Key points: Listening to music with a high motivational quotient as indicated by scores on the BMRI-2 was associated with enhanced running performance and meta-emotional beliefs that emotions experienced during running helped performance. Beliefs on the

  14. Emotion

    Science.gov (United States)

    Choi, Sukwoo

    It was widely accepted that emotions such as fear, anger and pleasure could not be studied using modern scientific tools. During the very early period of emotion research, psychologists, but not biologists, dominated the study of emotion and its disorders. Intuitively, one may think that emotion arises in the brain first and then bodily responses follow. For example, we are sad first, and then cry. However, groups of psychologists suggested a proposal that our feelings follow bodily responses; that is, we feel sad because we cry! This proposal seems counterintuitive but became a popular hypothesis for emotion. Another example for this hypothesis is as follows. When you accidentally confront a large bear on a mountain, what would your response be? You may feel terrified first, and then run, or you may run first, and then feel terrified later on. In fact, the latter explanation is correct! You feel fear after you run (even because you run?). Or, you can imagine that you are on a date with a girlfriend you love very much. Your heart must be beating fast and your body temperature must be elevated! In this situation, if you take a very cold bath, what would you expect? Your hot feeling usually calms down after this cold bath; that is, you feel hot because your heart rate and body temperature change. While some evidence supported this hypothesis, other evidence did not. In the case of patients whose cervical vertebrae were severed in an accident, they still retained a significant amount of emotion (feelings!) in some cases (but other patients lost most of their emotional experience). In addition, one can imagine that there would be a specific set of physical responses for each specific emotion if the original hypothesis were correct (e.g., a racing heart and reddened face for anger, etc.). However, some psychologists failed to find any specific set of physical responses for a specific emotion, though others insisted that such specific responses existed. Based on these controversial

  15. Animal emotions, behaviour and the promotion of positive welfare states.

    Science.gov (United States)

    Mellor, D J

    2012-01-01

    This paper presents a rationale that may significantly boost the drive to promote positive welfare states in animals. The rationale is based largely, but not exclusively, on an experimentally supported neuropsychological understanding of relationships between emotions and behaviour, an understanding that has not yet been incorporated into animal welfare science thinking. Reference is made to major elements of the neural/cognitive foundations of motivational drives that energise and direct particular behaviours and their related subjective or emotional experiences. These experiences are generated in part by sensory inputs that reflect the animal's internal functional state and by neural processing linked to the animal's perception of its external circumstances. The integrated subjective or emotional outcome of these inputs corresponds to the animal's welfare status. The internally generated subjective experiences represent motivational urges or drives that are predominantly negative and include breathlessness, thirst, hunger and pain. They are generated by, and elicit specific behaviours designed to correct, imbalances in the animal's internal functional state. Externally generated subjective experiences are said to be integral to the operation of interacting 'action-orientated systems' that give rise to particular behaviours and their negative or positive emotional contents. These action-orientated systems, described in neuropsychological terms, give rise to negative emotions that include fear, anger and panic, and positive emotions that include comfort, vitality, euphoria and playfulness. It is argued that early thinking about animal welfare management focused mainly on minimising disturbances to the internal functional states that generate associated unpleasant motivational urges or drives. This strategy produced animal welfare benefits, but at best it could only lift a poor net welfare status to a neutral one. 
In contrast, strategies designed to manipulate the

  16. Effects of Acute Alcohol Consumption on the Processing of Emotion in Faces: Implications for Understanding Alcohol-Related Aggression

    Science.gov (United States)

    Attwood, Angela S.; Munafò, Marcus R.

    2016-01-01

    The negative consequences of chronic alcohol abuse are well known, but heavy episodic consumption ("binge drinking") is also associated with significant personal and societal harms. Aggressive tendencies are increased after alcohol but the mechanisms underlying these changes are not fully understood. While effects on behavioural control are likely to be important, other effects may be involved given the widespread action of alcohol. Altered processing of social signals is associated with changes in social behaviours, including aggression, but until recently there has been little research investigating the effects of acute alcohol consumption on these outcomes. Recent work investigating the effects of acute alcohol on emotional face processing has suggested reduced sensitivity to submissive signals (sad faces) and increased perceptual bias towards provocative signals (angry faces) after alcohol consumption, which may play a role in alcohol-related aggression. Here we discuss a putative mechanism that may explain how alcohol consumption influences emotional processing and subsequent aggressive responding, via disruption of OFC-amygdala connectivity. While the importance of emotional processing on social behaviours is well established, research into acute alcohol consumption and emotional processing is still in its infancy. Further research is needed and we outline a research agenda to address gaps in the literature. PMID:24920135

  17. Toward physiological indices of emotional state driving future ebook interactivity

    Directory of Open Access Journals (Sweden)

    Jan B.F. van Erp

    2016-05-01

    Ebooks of the future may respond to the emotional experience of the reader. (Neuro)physiological measures could capture a reader's emotional state and use this to enhance the reading experience by adding matching sounds or to change the storyline, therewith creating a hybrid art form in between literature and gaming. We describe the theoretical foundation of the emotional and creative brain and review the neurophysiological indices that can be used to drive future ebook interactivity in a real life situation. As a case study, we report the neurophysiological measurements of a bestselling author during nine days of writing, which can potentially be used later to compare them to those of the readers. In designated calibration blocks, the artist wrote emotional paragraphs for emotional (IAPS) pictures. Analyses showed that we can reliably distinguish writing blocks from resting, but we found no reliable differences related to the emotional content of the writing. The study shows that measurements of EEG, heart rate (variability), skin conductance, facial expression and subjective ratings can be done over several hours a day and for several days in a row. In follow-up phases, we will measure 300 readers with a similar setup.
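
    Of the measures listed, heart rate variability is the most formulaic to compute. As an illustration only (the study does not state which HRV index it used), here is RMSSD, a standard time-domain index, computed from inter-beat (RR) intervals; the interval values are made up.

```python
# Hedged sketch: RMSSD, the root mean square of successive differences
# between inter-beat (RR) intervals, a common time-domain HRV index.
import math

def rmssd(rr_ms):
    """RMSSD (ms) from a sequence of RR intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Invented RR intervals around 800 ms (~75 beats per minute).
hrv = rmssd([800, 810, 790, 805, 795])
```

    Lower RMSSD broadly tracks reduced parasympathetic activity, which is one reason HRV is a candidate index of a reader's emotional state.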

  18. Infants’ Temperament and Mothers’, and Fathers’ Depression Predict Infants’ Attention to Objects Paired with Emotional Faces

    NARCIS (Netherlands)

    Aktar, E.; Mandell, D.J.; de Vente, W.; Majdandžić, M.; Raijmakers, M.E.J.; Bögels, S.M.

    2016-01-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others’ emotional reactions to those stimuli, so called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze

  19. Preschooler's Faces in Spontaneous Emotional Contexts--How Well Do They Match Adult Facial Expression Prototypes?

    Science.gov (United States)

    Gaspar, Augusta; Esteves, Francisco G.

    2012-01-01

    Prototypical facial expressions of emotion, also known as universal facial expressions, are the underpinnings of most research concerning recognition of emotions in both adults and children. Data on natural occurrences of these prototypes in natural emotional contexts are rare and difficult to obtain in adults. By recording naturalistic…

  20. P2-27: Electrophysiological Correlates of Conscious and Unconscious Processing of Emotional Faces in Individuals with High and Low Autistic Traits

    Directory of Open Access Journals (Sweden)

    Svjetlana Vukusic

    2012-10-01

    LeDoux (1996, The Emotional Brain) has suggested that subconscious presentation of fearful emotional information is relayed to the amygdala along a rapid subcortical route. Rapid emotion processing is important because it alerts other parts of the brain to emotionally salient information. It also produces immediate reflexive responses to threatening stimuli, in comparison to slower conscious appraisal, which is of important adaptive survival value. Current theoretical models of autism spectrum disorders (ASD) have linked impairments in the processing of emotional information to amygdala dysfunction. It can be suggested that the impairment in face processing found in autism may be the result of impaired rapid subconscious processing of emotional information, which does not make faces socially salient. Previous studies examined subconscious processing of emotional stimuli with backward masking paradigms by using very brief presentations of emotional face stimuli followed by a mask. We used an event-related potential (ERP) study within a backward masking paradigm with subjects with low and high autistic tendencies as measured by the Autism Spectrum Quotient (AQ) questionnaire. The time course of processing of fearful and happy facial expressions and an emotionally neutral face was investigated during subliminal (16 ms) and supraliminal (166 ms) stimulus presentation. The task consisted of an explicit categorization of emotional and neutral faces. We looked at ERP components N2, P3a, and also N170 for differences between subjects with low (<19) and high (>19) AQ.

  1. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery.

    Science.gov (United States)

    Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T

    2012-12-01

    Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. 
However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize

  2. Effects on automatic attention due to exposure to pictures of emotional faces while performing Chinese word judgment tasks.

    Science.gov (United States)

    Junhong, Huang; Renlai, Zhou; Senqi, Hu

    2013-01-01

    Two experiments were conducted to investigate the automatic processing of emotional facial expressions while performing low or high demand cognitive tasks under unattended conditions. In Experiment 1, 35 subjects performed low (judging the structure of Chinese words) and high (judging the tone of Chinese words) cognitive load tasks while exposed to unattended pictures of fearful, neutral, or happy faces. The results revealed that the reaction time was slower and the performance accuracy was higher while performing the low cognitive load task than while performing the high cognitive load task. Exposure to fearful faces resulted in significantly longer reaction times and lower accuracy than exposure to neutral faces on the low cognitive load task. In Experiment 2, 26 subjects performed the same word judgment tasks and their brain event-related potentials (ERPs) were measured for a period of 800 ms after the onset of the task stimulus. The amplitudes of the early component of ERP around 176 ms (P2) elicited by unattended fearful faces over frontal-central-parietal recording sites was significantly larger than those elicited by unattended neutral faces while performing the word structure judgment task. Together, the findings of the two experiments indicated that unattended fearful faces captured significantly more attention resources than unattended neutral faces on a low cognitive load task, but not on a high cognitive load task. It was concluded that fearful faces could automatically capture attention if residues of attention resources were available under the unattended condition.

  3. Attentional capture by emotional faces is contingent on attentional control settings

    DEFF Research Database (Denmark)

    Barratt, D.; Bundesen, Claus

    2012-01-01

    faster and attract more processing resources), responses to positive faces were slower when these were flanked by (response incompatible) negative faces as compared with positive or neutral faces, whereas responses to negative faces were unaffected by the identity of the flankers. Experiment 2...

  4. Early life stress and trauma and enhanced limbic activation to emotionally valenced faces in depressed and healthy children.

    Science.gov (United States)

    Suzuki, Hideo; Luby, Joan L; Botteron, Kelly N; Dietrich, Rachel; McAvoy, Mark P; Barch, Deanna M

    2014-07-01

    Previous studies have examined the relationships between structural brain characteristics and early life stress in adults. However, there is limited evidence for functional brain variation associated with early life stress in children. We hypothesized that early life stress and trauma would be associated with increased functional brain activation response to negative emotional faces in children with and without a history of depression. Psychiatric diagnosis and life events in children (starting at age 3-5 years) were assessed in a longitudinal study. A follow-up magnetic resonance imaging (MRI) study acquired data (N = 115 at ages 7-12, 51% girls) on functional brain response to fearful, sad, and happy faces relative to neutral faces. We used a region-of-interest mask within cortico-limbic areas and conducted regression analyses and repeated-measures analysis of covariance. Greater activation responses to fearful, sad, and happy faces in the amygdala and its neighboring regions were found in children with greater life stress. Moreover, an association between life stress and left hippocampal and globus pallidus activity depended on children's diagnostic status. Finally, all children with greater life trauma showed greater bilateral amygdala and cingulate activity specific to sad faces but not the other emotional faces, although right amygdala activity was moderated by psychiatric status. These findings suggest that limbic hyperactivity may be a biomarker of early life stress and trauma in children and may have implications in the risk trajectory for depression and other stress-related disorders. However, this pattern varied based on emotion type and history of psychopathology. Copyright © 2014 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  5. The Glass Half Empty: How Emotional Exhaustion Affects the State-Trait Discrepancy in Self-Reports of Teaching Emotions.

    Science.gov (United States)

    Goetz, Thomas; Becker, Eva S; Bieg, Madeleine; Keller, Melanie M; Frenzel, Anne C; Hall, Nathan C

    2015-01-01

    Following from previous research on intensity bias and the accessibility model of emotional self-report, the present study examined the role of emotional exhaustion in explaining the discrepancy in teachers' reports of their trait (habitual) versus state (momentary, "real") emotions. Trait reports (habitual emotions, exhaustion) were assessed via trait questionnaires, and state reports (momentary emotions) were assessed in real time via the experience sampling method by using personal digital assistants (N = 69 high school teachers; 1,089 measures within teachers). In line with our assumptions, multi-level analyses showed that, as compared to the state assessment, teachers reported higher levels of habitual teaching-related emotions of anger, anxiety, shame, boredom, enjoyment, and pride. Additionally, the state-trait discrepancy in self-reports of negative emotions was accounted for by teachers' emotional exhaustion, with high exhaustion levels corresponding with a greater state-trait discrepancy. Exhaustion levels did not moderate the state-trait discrepancy in positive emotions indicating that perceived emotional exhaustion may reflect identity-related cognitions specific to the negative belief system. Implications for research and educational practice are discussed.

  6. The Glass Half Empty: How Emotional Exhaustion Affects the State-Trait Discrepancy in Self-Reports of Teaching Emotions

    Science.gov (United States)

    Goetz, Thomas; Becker, Eva S.; Bieg, Madeleine; Keller, Melanie M.; Frenzel, Anne C.; Hall, Nathan C.

    2015-01-01

    Following from previous research on intensity bias and the accessibility model of emotional self-report, the present study examined the role of emotional exhaustion in explaining the discrepancy in teachers’ reports of their trait (habitual) versus state (momentary, “real”) emotions. Trait reports (habitual emotions, exhaustion) were assessed via trait questionnaires, and state reports (momentary emotions) were assessed in real time via the experience sampling method by using personal digital assistants (N = 69 high school teachers; 1,089 measures within teachers). In line with our assumptions, multi-level analyses showed that, as compared to the state assessment, teachers reported higher levels of habitual teaching-related emotions of anger, anxiety, shame, boredom, enjoyment, and pride. Additionally, the state-trait discrepancy in self-reports of negative emotions was accounted for by teachers’ emotional exhaustion, with high exhaustion levels corresponding with a greater state-trait discrepancy. Exhaustion levels did not moderate the state-trait discrepancy in positive emotions indicating that perceived emotional exhaustion may reflect identity-related cognitions specific to the negative belief system. Implications for research and educational practice are discussed. PMID:26368911

  7. FACING UP TO MULTINATIONAL COMPLEX LITIGATION IN THE UNITED STATES

    Directory of Open Access Journals (Sweden)

    Ángel R. Oquendo

    2017-05-01

    Full Text Available A federal court should approach the presence of foreigners in a global class action for monetary relief with an open mind. It should keep them in so long as it can conclude, upon a reflective comparative law analysis, that the judiciary in their nation of origin would uphold the ultimate ruling. For example, Latin American absent class members should normally stay on board inasmuch as virtually every jurisdiction in their region would allow a U.S. adjudicator to arrive at this conclusion. Accordingly, they would fail, on grounds of res judicata, if they ever tried to re-litigate the matter back home upon a defeat on the merits in the United States. In particular, a tribunal from any one of seven representative regional countries (Mexico, Brazil, Venezuela, Colombia, Panama, Peru, and Ecuador) would most probably find such a U.S. judgment consistent with local due process, as well as with the remaining requirements for recognition. In other words, it would hold that absentees stemming from its jurisdictional territory could not legitimately complain about the preclusive effect, since they would have free ridden on the efforts of their representatives with a chance at compensation, would have benefited from numerous fairness controls, and could have similarly faced preclusion in their homeland based on a suit prosecuted by someone else without their authorization. Judges in the United States should engage in a similar in-depth deliberation to decide whether to welcome citizens from anywhere else in the world to the litigation.

  8. Problems Faced by Mexican Asylum Seekers in the United States

    Directory of Open Access Journals (Sweden)

    J. Anna Cabot

    2014-12-01

    Full Text Available Violence in Mexico rose sharply in response to President Felipe Calderón’s military campaign against drug cartels which began in late 2006. As a consequence, the number of Mexicans who have sought asylum in the United States has grown significantly. In 2013, Mexicans made up the second largest group of defensive asylum seekers (those in removal proceedings in the United States, behind only China (EOIR 2014b. Yet between 2008 and 2013, the grant rate for Mexican asylum seekers in immigration court fell from 23 percent to nine percent (EOIR 2013, 2014b. This paper examines—from the perspective of an attorney who represented Mexican asylum seekers on the US-Mexico border in El Paso, Texas—the reasons for low asylum approval rates for Mexicans despite high levels of violence in and flight from Mexico from 2008 to 2013. It details the obstacles faced by Mexican asylum seekers along the US-Mexico border, including placement in removal proceedings, detention, evidentiary issues, narrow legal standards, and (effectively judicial notice of country conditions in Mexico. The paper recommends that asylum seekers at the border be placed in affirmative proceedings (before immigration officials, making them eligible for bond. It also proposes increased oversight of immigration judges.

  9. Western United States Dams Challenges Faced, Options, and Opportunities

    Science.gov (United States)

    Raff, D.

    2017-12-01

    Water management in the Western United States relies significantly upon a fleet of small to very large engineered dams to store water during times of runoff and distribute that water during times of need. Much of this infrastructure is Federally owned and/or operated, and was designed and funded during the first half of the twentieth century through a complex set of repayment contracts for Federally authorized purposes addressing water supply, recreation, hydropower, and other water management objectives. With environmental laws, namely the Endangered Species Act, and other environmental concerns taking a more active role in water resources in the mid to latter half of the twentieth century, this infrastructure is being stressed even more than anticipated to provide its authorized purposes. Additionally, the weather and climate norms being experienced are certainly near the edges, if not outside, of the anticipated variability in the climate and hydrology scenarios for which the infrastructure was designed. And, finally, these dams, economically designed for a lifespan of 50 - 100 years, are experiencing maintenance challenges from routine to significant. This presentation will focus on identifying some of the history and challenges facing the water infrastructure in the Western United States. Additionally, some perspectives on future paths to meet the needs of western irrigation and hydropower production will be provided.

  10. Dissociable neural effects of stimulus valence and preceding context during the inhibition of responses to emotional faces.

    Science.gov (United States)

    Schulz, Kurt P; Clerkin, Suzanne M; Halperin, Jeffrey M; Newcorn, Jeffrey H; Tang, Cheuk Y; Fan, Jin

    2009-09-01

    Socially appropriate behavior requires the concurrent inhibition of actions that are inappropriate in the context. This self-regulatory function requires an interaction of inhibitory and emotional processes that recruits brain regions beyond those engaged by either process alone. In this study, we isolated brain activity associated with response inhibition and emotional processing in 24 healthy adults using event-related functional magnetic resonance imaging (fMRI) and a go/no-go task that independently manipulated the context preceding no-go trials (i.e., number of go trials) and the valence (i.e., happy, sad, and neutral) of the face stimuli used as trial cues. Parallel quadratic trends were seen in correct inhibitions on no-go trials preceded by increasing numbers of go trials and associated activation for correct no-go trials in inferior frontal gyrus pars opercularis, pars triangularis, and pars orbitalis, temporoparietal junction, superior parietal lobule, and temporal sensory association cortices. Conversely, the comparison of happy versus neutral faces and sad versus neutral faces revealed valence-dependent activation in the amygdala, anterior insula cortex, and posterior midcingulate cortex. Further, an interaction between inhibition and emotion was seen in valence-dependent variations in the quadratic trend in no-go activation in the right inferior frontal gyrus and left posterior insula cortex. These results suggest that the inhibition of response to emotional cues involves the interaction of partly dissociable limbic and frontoparietal networks that encode emotional cues and use these cues to exert inhibitory control over the motor, attention, and sensory functions needed to perform the task, respectively. 2008 Wiley-Liss, Inc.

  11. The ties to unbind: Age-related differences in feature (un)binding in working memory for emotional faces

    Directory of Open Access Journals (Sweden)

    Didem ePehlivanoglu

    2014-04-01

    Full Text Available In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyperbinding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face’s emotional expression (expression condition), or both identity and expression of the face (binding condition). Both age groups performed more slowly and with lower accuracy in the expression condition than in the binding condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory, over and beyond age-related differences observed in perceptual processing (0-Back) and attention/short-term memory (1-Back). Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task, and were further magnified in the 2-Back task, indicating independent age-related differences in attention/short-term memory and working memory. Pupil dilation data confirmed that the attention/short-term memory version of the task (1-Back) is more effortful for older adults than younger adults.

  12. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits.

    Science.gov (United States)

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving body patch-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  13. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    Directory of Open Access Journals (Sweden)

    Rossana eActis-Grosso

    2015-10-01

    Full Text Available We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or conveyed by moving bodies (patch-light displays, PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, and (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  14. What you want to avoid is what you see: Social avoidance motivation affects the interpretation of emotional faces

    OpenAIRE

    Nikitin, Jana; Freund, Alexandra M

    2015-01-01

    This study investigated the effects of habitual social approach and avoidance motivation on the classification of facial expressions of different visual clarity. Participants (N = 78) categorized partially masked emotional faces expressing either anger or happiness as positive or negative. Participants generally tended to interpret the facial expressions in a positive way. This positivity effect was reduced when persons were highly avoidance motivated. Social avoidance motivation predicted fe...

  15. Measuring Emotions in Marketing and Consumer Behavior : Is Face Reader an applicable tool?

    OpenAIRE

    Drozdova, Natalia

    2014-01-01

    This thesis investigates the topic of measuring emotions in marketing and consumer research. An overview of existing implicit and explicit methods of measuring emotions is presented in the thesis, followed by a literature review of methods used in empirical research during the last decade. The last part of the thesis focuses on automatic facial expression analysis as a tool for measuring emotional responses. A pilot study conducted by the Center of Service Innovations in the Norwegian School ...

  16. Infants’ Temperament and Mothers’ and Fathers’ Depression Predict Infants’ Attention to Objects Paired with Emotional Faces

    OpenAIRE

    Aktar, Evin; Mandell, Dorothy J.; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E. J.; Bögels, Susan M.

    2015-01-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others’ emotional reactions to those stimuli, so called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction effects on infants’ attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations ...

  17. Different underlying mechanisms for face emotion and gender processing during feature-selective attention: Evidence from event-related potential studies.

    Science.gov (United States)

    Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei

    2017-05-01

    Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed in different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically on SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. EFFECTS OF MUSIC INTERVENTIONS ON EMOTIONAL STATES AND RUNNING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Andrew M. Lane

    2011-06-01

    Full Text Available The present study compared the effects of two different music interventions on changes in emotional states before and during running, and also explored effects of music interventions upon performance outcome. Volunteer participants (n = 65) who regularly listened to music when running registered online to participate in a three-stage study. Participants attempted to attain a personally important running goal to establish baseline performance. Thereafter, participants were randomly assigned to either a self-selected music group or an Audiofuel music group. Audiofuel produce pieces of music designed to assist synchronous running. The self-selected music group followed guidelines for selecting motivating playlists. In both experimental groups, participants used the Brunel Music Rating Inventory-2 (BMRI-2) to facilitate selection of motivational music. Participants again completed the BMRI-2 post-intervention to assess the motivational qualities of Audiofuel music or the music they selected for use during the study. Results revealed no significant differences between self-selected music and Audiofuel music on all variables analyzed. Participants in both music groups reported increased pleasant emotions and decreased unpleasant emotions following intervention. Significant performance improvements were demonstrated post-intervention, with participants reporting a belief that emotional states related to performance. Further analysis indicated that enhanced performance was significantly greater among participants reporting music to be motivational, as indicated by high scores on the BMRI-2. Findings suggest that both individual athletes and practitioners should consider using the BMRI-2 when selecting music for running.

  19. BESST (Bochum Emotional Stimulus Set)--a pilot validation study of a stimulus set containing emotional bodies and faces from frontal and averted views.

    Science.gov (United States)

    Thoma, Patrizia; Soria Bauser, Denise; Suchan, Boris

    2013-08-30

    This article introduces the freely available Bochum Emotional Stimulus Set (BESST), which contains pictures of bodies and faces depicting either a neutral expression or one of the six basic emotions (happiness, sadness, fear, anger, disgust, and surprise), presented from two different perspectives (0° frontal view vs. camera averted by 45° to the left). The set comprises 565 frontal view and 564 averted view pictures of real-life bodies with masked facial expressions and 560 frontal and 560 averted view faces which were synthetically created using the FaceGen 3.5 Modeller. All stimuli were validated in terms of categorization accuracy and the perceived naturalness of the expression. Additionally, each facial stimulus was morphed into three age versions (20/40/60 years). The results show high recognition of the intended facial expressions, even under speeded forced-choice conditions, as corresponds to common experimental settings. The average naturalness ratings for the stimuli range between medium and high. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  20. Electrocortical Reactivity to Emotional Faces in Young Children and Associations with Maternal and Paternal Depression

    Science.gov (United States)

    Kujawa, Autumn; Hajcak, Greg; Torpey, Dana; Kim, Jiyon; Klein, Daniel N.

    2012-01-01

    Background: The late positive potential (LPP) is an event-related potential component that indexes selective attention toward motivationally salient information and is sensitive to emotional stimuli. Few studies have examined the LPP in children. Depression has been associated with reduced reactivity to negative and positive emotional stimuli,…

  1. Neural Reactivity to Emotional Faces May Mediate the Relationship between Childhood Empathy and Adolescent Prosocial Behavior

    Science.gov (United States)

    Flournoy, John C.; Pfeifer, Jennifer H.; Moore, William E.; Tackman, Allison M.; Masten, Carrie L.; Mazziotta, John C.; Iacoboni, Marco; Dapretto, Mirella

    2016-01-01

    Reactivity to others' emotions not only can result in empathic concern (EC), an important motivator of prosocial behavior, but can also result in personal distress (PD), which may hinder prosocial behavior. Examining neural substrates of emotional reactivity may elucidate how EC and PD differentially influence prosocial behavior. Participants…

  2. Double attention bias for positive and negative emotional faces in clinical depression: evidence from an eye-tracking study.

    Science.gov (United States)

    Duque, Almudena; Vázquez, Carmelo

    2015-03-01

    According to cognitive models, attentional biases in depression play key roles in the onset and subsequent maintenance of the disorder. The present study examines the processing of emotional facial expressions (happy, angry, and sad) in depressed and non-depressed adults. Sixteen unmedicated patients with Major Depressive Disorder (MDD) and 34 never-depressed controls (ND) completed an eye-tracking task to assess different components of visual attention (orienting attention and maintenance of attention) in the processing of emotional faces. Compared to ND, participants with MDD showed a negative attentional bias in attentional maintenance indices (i.e. first fixation duration and total fixation time) for sad faces. This attentional bias was positively associated with the severity of depressive symptoms. Furthermore, the MDD group spent marginally less time viewing happy faces compared with the ND group. No differences were found between the groups with respect to angry faces and orienting attention indices. The current study is limited by its cross-sectional design. These results support the notion that attentional biases in depression are specific to depression-related information and that they operate in later stages in the deployment of attention. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Variations in the serotonin-transporter gene are associated with attention bias patterns to positive and negative emotion faces.

    Science.gov (United States)

    Pérez-Edgar, Koraly; Bar-Haim, Yair; McDermott, Jennifer Martin; Gorodetsky, Elena; Hodgkinson, Colin A; Goldman, David; Ernst, Monique; Pine, Daniel S; Fox, Nathan A

    2010-03-01

    Both attention biases to threat and a serotonin-transporter gene polymorphism (5-HTTLPR) have been linked to heightened neural activation to threat and the emergence of anxiety. The short allele of 5-HTTLPR may act via its effect on neurotransmitter availability, while attention biases shape broad patterns of cognitive processing. We examined individual differences in attention bias to emotion faces as a function of 5-HTTLPR genotype. Adolescents (N=117) were classified for presumed SLC6A4 expression based on 5-HTTLPR genotype as low (SS, SL(G), or L(G)L(G)), intermediate (SL(A) or L(A)L(G)), or high (L(A)L(A)). Participants completed the dot-probe task, measuring attention biases toward or away from angry and happy faces. Biases for angry faces increased with the genotype-predicted neurotransmission levels (low>intermediate>high). The reverse pattern was evident for happy faces. The data indicate a linear relation between 5-HTTLPR allelic status and attention biases to emotion, demonstrating a genetic mechanism for biased attention using ecologically valid stimuli that target socioemotional adaptation. Copyright 2009 Elsevier B.V. All rights reserved.

  4. Memory for faces with emotional expressions in Alzheimer's disease and healthy older participants: positivity effect is not only due to familiarity.

    Science.gov (United States)

    Sava, Alina-Alexandra; Krolak-Salmon, Pierre; Delphin-Combe, Floriane; Cloarec, Morgane; Chainay, Hanna

    2017-01-01

    Young individuals memorize initially seen faces with emotional expressions better than faces with neutral expressions. Healthy older participants and Alzheimer's disease (AD) patients show better memory for faces with positive expressions. The socioemotional selectivity theory postulates that this positivity effect in memory reflects a general age-related preference for positive stimuli, subserving emotion regulation. Another explanation might be that older participants use compensatory strategies, often considering happy faces as previously seen. The question of whether this effect exists in tasks that do not permit such compensatory strategies is still open. Thus, we compared the performance of healthy participants and AD patients for positive, neutral, and negative faces in such tasks. Healthy older participants and AD patients showed a positivity effect in memory, but there was no difference between emotional and neutral faces in young participants. Our results suggest that the positivity effect in memory is not entirely due to the sense of familiarity for smiling faces.

  5. Is fear in your head? A comparison of instructed and real-life expressions of emotion in the face and body.

    Science.gov (United States)

    Abramson, Lior; Marom, Inbal; Petranker, Rotem; Aviezer, Hillel

    2017-04-01

    The majority of emotion perception studies utilize instructed and stereotypical expressions of faces or bodies. While such stimuli are highly standardized and well-recognized, their resemblance to real-life expressions of emotion remains unknown. Here we examined facial and body expressions of fear and anger during real-life situations and compared their recognition to that of instructed expressions of the same emotions. In order to examine the source of the affective signal, expressions of emotion were presented as faces alone, bodies alone, and naturally, as faces with bodies. The results demonstrated striking deviations between recognition of instructed and real-life stimuli, which differed as a function of the emotion expressed. In real-life fearful expressions of emotion, bodies were far better recognized than faces, a pattern not found with instructed expressions of emotion. Anger reactions were better recognized from the body than from the face in both real-life and instructed stimuli. However, the real-life stimuli were overall better recognized than their instructed counterparts. These results indicate that differences between instructed and real-life expressions of emotion are prevalent and raise caution against an overreliance of researchers on instructed affective stimuli. The findings also demonstrate that in real life, facial expression perception may rely heavily on information from the contextualizing body. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Quantification of vascular function changes under different emotion states: A pilot study.

    Science.gov (United States)

    Xia, Yirong; Yang, Licai; Mao, Xueqin; Zheng, Dingchang; Liu, Chengyu

    2017-01-01

    Recent studies have indicated that physiological parameters change with different emotion states. This study aimed to quantify the changes of vascular function at different emotion and sub-emotion states. Twenty young subjects were studied with their finger photoplethysmographic (PPG) pulses recorded at three distinct emotion states: natural (1 minute), happiness and sadness (10 minutes each). Within the happiness and sadness emotion states, two sub-emotion states (calmness and outburst) were identified from the synchronously recorded videos. Reflection index (RI) and stiffness index (SI), two widely used indices of vascular function, were derived from the PPG pulses to quantify their differences between the three emotion states, as well as between the two sub-emotion states. The results showed that, compared with the natural emotion state, RI and SI decreased in both happiness and sadness; the decrease in RI was significant for both emotions, whereas the decrease in SI reached significance only for sadness. Between the two sub-emotion states, RI differed significantly between outburst and calmness for both happiness and sadness, whereas SI differed significantly only in sadness. This pilot study confirmed that vascular function changes with different emotion states can be quantified by simple PPG measurement.
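
    RI and SI are conventionally derived from the contour of a single PPG pulse: RI as the amplitude of the reflected (diastolic) peak expressed as a percentage of the systolic peak amplitude, and SI as body height divided by the time delay between the two peaks. A hedged sketch of these standard contour definitions (illustrative values; the paper's exact derivation is not reproduced here):

```python
def reflection_index(systolic_amp, diastolic_amp):
    """RI: reflected (diastolic) peak amplitude as a percentage of the
    systolic peak amplitude of the PPG pulse."""
    return 100.0 * diastolic_amp / systolic_amp

def stiffness_index(height_m, delta_t_s):
    """SI: subject height divided by the systolic-to-diastolic peak
    transit time of the PPG pulse, in m/s."""
    return height_m / delta_t_s

# Hypothetical single-pulse measurements
ri = reflection_index(systolic_amp=1.0, diastolic_amp=0.62)  # 62.0 (%)
si = stiffness_index(height_m=1.75, delta_t_s=0.25)          # 7.0 (m/s)
print(ri, si)
```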

  7. The Way Humans Behave Modulates the Emotional State of Piglets.

    Directory of Open Access Journals (Sweden)

    Sophie Brajon

    The emotional state can influence decision-making under ambiguity. Cognitive bias tests (CBT) have proved to be a promising indicator of the affective valence of animals in the context of farm animal welfare. Although it is well known that humans can influence the intensity of fear and reactions of animals, research on cognitive bias often focuses on housing and management conditions and neglects the role of humans in the emotional states of animals. The present study aimed at investigating whether humans can modulate the emotional state of weaned piglets. Fifty-four piglets received a chronic experience with humans: gentle (GEN), rough (ROU), or minimal contact (MIN). Simultaneously, they were individually trained on a go/no-go task to discriminate a positive auditory cue, associated with a food reward in a trough, from a negative one, associated with punishments (e.g. a water spray). Independently of the treatment (P = 0.82), 59% of piglets completed the training. Successfully trained piglets were then subjected to CBT, including ambiguous cues in the presence or absence of a human observer. As hypothesized, GEN piglets showed a positive judgement bias, as shown by their higher percentage of go responses following an ambiguous cue compared to ROU (P = 0.03) and MIN (P = 0.02) piglets, whereas ROU and MIN piglets did not differ (P > 0.10). The presence of an observer during CBT did not modulate the percentage of go responses following an ambiguous cue (P > 0.10). However, regardless of the treatment, piglets spent less time in contact with the trough following positive cues during CBT in which the observer was present than absent (P < 0.0001). This study originally demonstrates that the nature of a chronic experience with humans can induce a judgement bias, indicating that the emotional state of farm animals such as piglets can be affected by the way humans interact with them.

  8. Emotional state and coping style among gynecologic patients undergoing surgery.

    Science.gov (United States)

    Matsushita, Toshiko; Murata, Hinako; Matsushima, Eisuke; Sakata, Yu; Miyasaka, Naoyuki; Aso, Takeshi

    2007-02-01

    The aim of the present study was to investigate changes in emotional state and the relationship between emotional state and demographic/clinical factors and coping style among gynecologic patients undergoing surgery. Using the Japanese version of the Profile of Mood States (POMS), 90 patients (benign disease: 32, malignancy: 58) were examined on three occasions: before surgery, before discharge, and 3 months after discharge. They were also examined using the Coping Inventory for Stressful Situations (CISS) on one occasion before discharge. The scores for the subscales depression, anger, and confusion were highest after discharge, while those for anxiety were highest before surgery. The average scores of the POMS subscales for all subjects were within the normal range. With regard to the relationship between these emotional states and other factors, multiple regressions showed that the principal determinants of anxiety before surgery were religious belief, psychological symptoms during hospitalization, and emotion-oriented (E) coping style; further, depression after discharge could be explained by chemotherapy, duration of hospitalization, and E coping style. The principal determinants of anger after discharge were length of education and E coping style, and those of vigor before surgery were severity of disease, chemotherapy, E coping style, and task-oriented coping style. Those of post-discharge fatigue and confusion were length of education, psychological symptoms, and E coping style. In summary, it is suggested that anxiety before surgery and depression, anger, and confusion after discharge, together with coping styles, should be taken into account in patients undergoing gynecologic surgery.

  9. Event-related brain responses to emotional words, pictures, and faces - a cross-domain comparison.

    Science.gov (United States)

    Bayer, Mareike; Schacht, Annekathrin

    2014-01-01

    Emotion effects in event-related brain potentials (ERPs) have previously been reported for a range of visual stimuli, including emotional words, pictures, and facial expressions. Still, little is known about the actual comparability of emotion effects across these stimulus classes. The present study aimed to fill this gap by investigating emotion effects in response to words, pictures, and facial expressions using a blocked within-subject design. Furthermore, ratings of stimulus arousal and valence were collected from an independent sample of participants. Modulations of early posterior negativity (EPN) and late positive complex (LPC) were visible for all stimulus domains, but showed clear differences, particularly in valence processing. While emotion effects were limited to positive stimuli for words, they were predominant for negative stimuli in pictures and facial expressions. These findings corroborate the notion of a positivity offset for words and a negativity bias for pictures and facial expressions, which was assumed to be caused by generally lower arousal levels of written language. Interestingly, however, these assumed differences were not confirmed by arousal ratings. Instead, words were rated as overall more positive than pictures and facial expressions. Taken together, the present results point toward systematic differences in the processing of written words and pictorial stimuli of emotional content, not only in terms of a valence bias evident in ERPs, but also concerning their emotional evaluation captured by ratings of stimulus valence and arousal.

  10. Respiratory sinus arrhythmia responses to induced emotional states: effects of RSA indices, emotion induction method, age, and sex.

    Science.gov (United States)

    Overbeek, Thérèse J M; van Boxtel, Anton; Westerink, Joyce H D M

    2012-09-01

    The literature shows large inconsistencies in respiratory sinus arrhythmia (RSA) responses to induced emotional states. This may be caused by differences in emotion induction methods, RSA quantification, and non-emotional demands of the situation. In 83 healthy subjects, we studied RSA responses to pictures and film fragments eliciting six different discrete emotions relative to neutral baseline stimuli. RSA responses were quantified in the time and frequency domain and were additionally corrected for differences in mean heart rate and respiration rate, resulting in eight different RSA response measures. Subjective ratings of emotional stimuli and facial electromyographic responses indicated that pictures and film fragments elicited the intended emotions. Although RSA measures showed various emotional effects, responses were quite heterogeneous and frequently nonsignificant. They were substantially influenced by methodological factors, in particular time vs. frequency domain response measures, correction for changes in respiration rate, use of pictures vs. film fragments, and sex of participants. Copyright © 2012 Elsevier B.V. All rights reserved.
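
    Of the RSA quantifications contrasted above, the simplest time-domain measure is RMSSD over the inter-beat (R-R) interval series; frequency-domain measures instead take spectral power in the respiratory band. A minimal time-domain sketch with illustrative intervals (not study data, and without the respiration-rate correction the paper applies):

```python
from math import sqrt

def rmssd(rr_intervals_ms):
    """Time-domain RSA/HRV index: root mean square of successive
    differences between consecutive inter-beat (R-R) intervals."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical R-R intervals (ms) during an emotional film fragment
rr = [812, 790, 835, 801, 828, 795]
print(round(rmssd(rr), 1))  # 33.1
```

    A per-condition response would then be RMSSD during the emotional stimulus minus RMSSD during the neutral baseline.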

  11. Placing the face in context: cultural differences in the perception of facial emotion.

    Science.gov (United States)

    Masuda, Takahiko; Ellsworth, Phoebe C; Mesquita, Batja; Leu, Janxin; Tanida, Shigehito; Van de Veerdonk, Ellen

    2008-03-01

    Two studies tested the hypothesis that in judging people's emotions from their facial expressions, Japanese, more than Westerners, incorporate information from the social context. In Study 1, participants viewed cartoons depicting a happy, sad, angry, or neutral person surrounded by other people expressing the same emotion as the central person or a different one. The surrounding people's emotions influenced Japanese but not Westerners' perceptions of the central person. These differences reflect differences in attention, as indicated by eye-tracking data (Study 2): Japanese looked at the surrounding people more than did Westerners. Previous findings on East-West differences in contextual sensitivity generalize to social contexts, suggesting that Westerners see emotions as individual feelings, whereas Japanese see them as inseparable from the feelings of the group.

  12. Neural Reactivity to Emotional Faces Mediates the Relationship Between Childhood Empathy and Adolescent Prosocial Behavior

    Science.gov (United States)

    Flournoy, John C.; Pfeifer, Jennifer H.; Moore, William E.; Tackman, Allison; Masten, Carrie L.; Mazziotta, John C.; Iacoboni, Marco; Dapretto, Mirella

    2017-01-01

    Reactivity to others' emotions can result in empathic concern (EC), an important motivator of prosocial behavior, but can also result in personal distress (PD), which may hinder prosocial behavior. Examining neural substrates of emotional reactivity may elucidate how EC and PD differentially influence prosocial behavior. Participants (N=57) provided measures of EC, PD, prosocial behavior, and neural responses to emotional expressions at ages 10 and 13. Initial EC predicted subsequent prosocial behavior. Initial EC and PD predicted subsequent reactivity to emotions in the inferior frontal gyrus (IFG) and inferior parietal lobule, respectively. Activity in the IFG, a region linked to mirror neuron processes, as well as cognitive control and language, mediated the relation between initial EC and subsequent prosocial behavior. PMID:28262939

  13. Love withdrawal predicts electrocortical responses to emotional faces with performance feedback: a follow-up and extension.

    Science.gov (United States)

    Huffmeijer, Renske; Bakermans-Kranenburg, Marian J; Alink, Lenneke R A; van IJzendoorn, Marinus H

    2014-06-02

    Parental use of love withdrawal is thought to affect children's later psychological functioning because it creates a link between children's performance and relational consequences. In addition, recent studies have begun to show that experiences of love withdrawal also relate to the neural processing of socio-emotional information relevant to a performance-relational consequence link, and can moderate effects of oxytocin on social information processing and behavior. The current study follows up on our previous results by attempting to confirm and extend previous findings indicating that experiences of maternal love withdrawal are related to electrocortical responses to emotional faces presented with performance feedback. More maternal love withdrawal was related to enhanced early processing of facial feedback stimuli (reflected in more positive VPP amplitudes, and confirming previous findings). However, attentional engagement with and processing of the stimuli at a later stage were diminished in those reporting higher maternal love withdrawal (reflected in less positive LPP amplitudes, and diverging from previous findings). Maternal love withdrawal affects the processing of emotional faces presented with performance feedback differently in different stages of neural processing.

  14. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    OpenAIRE

    Rossana Actis-Grosso; Francesco Bossi; Paola Ricciardelli

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or c...

  15. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits

    OpenAIRE

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or ...

  16. The Way Dogs (Canis familiaris) Look at Human Emotional Faces Is Modulated by Oxytocin. An Eye-Tracking Study

    Directory of Open Access Journals (Sweden)

    Anna Kis

    2017-10-01

    Dogs have been shown to excel in reading human social cues, including facial cues. In the present study we used eye-tracking technology to further study dogs’ face processing abilities. It was found that dogs discriminated between human facial regions in their spontaneous viewing pattern and looked most at the eye region independently of facial expression. Furthermore, dogs paid the most attention to the first two images presented, after which their attention dramatically decreased, a finding that has methodological implications. Increasing evidence indicates that the oxytocin system is involved in dogs’ human-directed social competence; thus, as a next step, we investigated the effects of oxytocin on the processing of human facial emotions. It was found that oxytocin decreases dogs’ looking at human faces expressing an angry emotional expression. More interestingly, however, after oxytocin pre-treatment dogs’ preferential gaze toward the eye region when processing happy human facial expressions disappears. These results provide the first evidence that oxytocin is involved in the regulation of human face processing in dogs. The present study is one of the few empirical investigations to explore eye gaze patterns in naïve and untrained pet dogs using a non-invasive eye-tracking technique, and it thus offers a unique but largely untapped method for studying social cognition in dogs.

  17. How stable is activation in the amygdala and prefrontal cortex in adolescence? A study of emotional face processing across three measurements

    NARCIS (Netherlands)

    van den Bulk, B.G.; Koolschijn, P.C.M.P.; Meens, P.H.F.; van Lang, N.D.J.; van der Wee, N.J.A.; Rombouts, S.A.R.B.; Vermeiren, R.R.J.M.; Crone, E.A.

    2013-01-01

    Prior developmental functional magnetic resonance imaging (fMRI) studies have demonstrated elevated activation patterns in the amygdala and prefrontal cortex (PFC) in response to viewing emotional faces. As adolescence is a time of substantial variability in mood and emotional responsiveness, the

  18. Differential Interactions between Identity and Emotional Expression in Own and Other-Race Faces: Effects of Familiarity Revealed through Redundancy Gains

    Science.gov (United States)

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

    We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and…

  19. Emotion modelling towards affective pathogenesis.

    Science.gov (United States)

    Bas, James Le

    2009-12-01

    Objective: There is a need in psychiatry for models that integrate pathological states with normal systems. The interaction of arousal and emotion is the focus of an exploration of affective pathogenesis. Method: Given that the explicit causes of affective disorder remain nascent, methods of linking emotion and disorder are evaluated. Results: A network model of emotional families is presented, in which emotions exist as quantal gradients. Morbid emotional states are seen as the activation of distal emotion sites. The phenomenology of affective disorders is described with reference to this model. Recourse is made to non-linear dynamic theory. Conclusions: Metaphoric emotion models have face validity and may prove a useful heuristic.

  20. Psycho-emotional state of rats under thyroid dysfunction

    Directory of Open Access Journals (Sweden)

    Demchenko Е.М.

    2014-03-01

    Spontaneous behavioral activity of white rats and concentrations of glutamate, glycine and gamma-aminobutyric acid in the neocortex were investigated in experimental hyper- and hypothyroidism. It was found that an excess of thyroid hormones was accompanied by emotional and anxiolytic effects, in terms of a 37% reduction in grooming duration and increases in the number of transitions into, and the time spent in, the illuminated arms of the cross-shaped maze (26% and 35%, respectively). These behavioral changes occurred against a background of a 52% increase in GABA concentration (p < 0.05) in the cortex of animals with experimental hyperthyroidism. Perhaps the psycho-emotional state of the organism is modulated by thyroid hormones through the GABAergic system of the neocortex.

  1. Attentional Bias for Emotional Faces in Children with Generalized Anxiety Disorder

    Science.gov (United States)

    Waters, Allison M.; Mogg, Karin; Bradley, Brendan P.; Pine, Daniel S.

    2008-01-01

    Attentional bias for angry and happy faces in 7- to 12-year-old children with generalized anxiety disorder (GAD) is examined. Results suggest that an attentional bias toward threat faces depends on a certain degree of clinical severity and/or the type of anxiety diagnosis in children.

  2. Elevated responses to constant facial emotions in different faces in the human amygdala: an fMRI study of facial identity and expression

    Directory of Open Access Journals (Sweden)

    Weiller Cornelius

    2004-11-01

    Background Human faces provide important signals in social interactions by conveying two main types of information, individual identity and emotional expression. The ability to readily assess both the variability and the consistency of emotional expressions in different individuals is central to one's interpretation of the imminent environment. A factorial design was used to systematically test the interaction of either constant or variable emotional expressions with constant or variable facial identities in areas involved in face processing, using functional magnetic resonance imaging. Results Previous studies suggest a predominant role of the amygdala in the assessment of emotional variability. Here we extend this view by showing that this structure activated to faces with changing identities that display constant emotional expressions. Within this condition, amygdala activation was dependent on the type and intensity of the displayed emotion, with significant responses to fearful expressions and, to a lesser extent, to neutral and happy expressions. In contrast, the lateral fusiform gyrus showed a binary pattern of increased activation to changing stimulus features, while it was also differentially responsive to the intensity of the displayed emotion when processing different facial identities. Conclusions These results suggest that the amygdala might serve to detect constant facial emotions in different individuals, complementing its established role for detecting emotional variability.

  3. The role of emotion in dynamic audiovisual integration of faces and voices.

    Science.gov (United States)

    Kokinous, Jenny; Kotz, Sonja A; Tavano, Alessandro; Schröger, Erich

    2015-05-01

    We used human electroencephalogram to study early audiovisual integration of dynamic angry and neutral expressions. An auditory-only condition served as a baseline for the interpretation of integration effects. In the audiovisual conditions, the validity of visual information was manipulated using facial expressions that were either emotionally congruent or incongruent with the vocal expressions. First, we report an N1 suppression effect for angry compared with neutral vocalizations in the auditory-only condition. Second, we confirm early integration of congruent visual and auditory information as indexed by a suppression of the auditory N1 and P2 components in the audiovisual compared with the auditory-only condition. Third, audiovisual N1 suppression was modulated by audiovisual congruency in interaction with emotion: for neutral vocalizations, there was N1 suppression in both the congruent and the incongruent audiovisual conditions. For angry vocalizations, there was N1 suppression only in the congruent but not in the incongruent condition. Extending previous findings of dynamic audiovisual integration, the current results suggest that audiovisual N1 suppression is congruency- and emotion-specific and indicate that dynamic emotional expressions compared with non-emotional expressions are preferentially processed in early audiovisual integration. © The Author (2014). Published by Oxford University Press.

  4. Face processing in chronic alcoholism: a specific deficit for emotional features.

    Science.gov (United States)

    Maurage, P; Campanella, S; Philippot, P; Martin, S; de Timary, P

    2008-04-01

    It is well established that chronic alcoholism is associated with a deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specifically for emotions or due to a more general impairment in visual or facial processing. This study was designed to clarify this issue using multiple control tasks and the subtraction method. Eighteen patients suffering from chronic alcoholism and 18 matched healthy control subjects were asked to perform several tasks evaluating (1) Basic visuo-spatial and facial identity processing; (2) Simple reaction times; (3) Complex facial features identification (namely age, emotion, gender, and race). Accuracy and reaction times were recorded. Alcoholic patients had a preserved performance for visuo-spatial and facial identity processing, but their performance was impaired for visuo-motor abilities and for the detection of complex facial aspects. More importantly, the subtraction method showed that alcoholism is associated with a specific EFE decoding deficit, still present when visuo-motor slowing down is controlled for. These results offer a post hoc confirmation of earlier data showing an EFE decoding deficit in alcoholism by strongly suggesting a specificity of this deficit for emotions. This may have implications for clinical situations, where emotional impairments are frequently observed among alcoholic subjects.

  5. The development of the Athens Emotional States Inventory (AESI): collection, validation and automatic processing of emotionally loaded sentences.

    Science.gov (United States)

    Chaspari, Theodora; Soldatos, Constantin; Maragos, Petros

    2015-01-01

    This work concerns the development of ecologically valid procedures for collecting reliable and unbiased emotional data, towards computer interfaces with social and affective intelligence targeting patients with mental disorders. The Athens Emotional States Inventory (AESI) comprises the design, recording and validation of an audiovisual database for five emotional states: anger, fear, joy, sadness and neutral. The items of the AESI consist of sentences, each having content indicative of the corresponding emotion. Emotional content was assessed through a survey of 40 young participants with a questionnaire following a Latin square design. The emotional sentences that were correctly identified by 85% of the participants were recorded in a soundproof room with microphones and cameras. A preliminary validation of the AESI was performed through automatic emotion recognition experiments on speech. The resulting database contains 696 utterances recorded in the Greek language by 20 native speakers and has a total duration of approximately 28 min. Speech classification results yield accuracy of up to 75.15% for automatically recognizing the emotions in the AESI. These results indicate the usefulness of our approach for collecting emotional data with reliable content, balanced across classes and with reduced environmental variability.

  6. The effect of emotionally valenced eye region images on visuocortical processing of surprised faces.

    Science.gov (United States)

    Li, Shuaixia; Li, Ping; Wang, Wei; Zhu, Xiangru; Luo, Wenbo

    2018-05-01

    In this study, we presented pictorial representations of happy, neutral, and fearful expressions projected in the eye regions to determine whether the eye region alone is sufficient to produce a context effect. Participants were asked to judge the valence of surprised faces that had been preceded by a picture of an eye region. Behavioral results showed that affective ratings of surprised faces were context dependent. Prime-related ERPs with presentation of happy eyes elicited a larger P1 than those for neutral and fearful eyes, likely due to the recognition advantage provided by a happy expression. Target-related ERPs showed that surprised faces in the context of fearful and happy eyes elicited a dramatically larger C1 than those in the neutral context, which reflects modulation by predictions during the earliest stages of face processing. There was a larger N170 with neutral and fearful eye contexts compared to the happy context, suggesting that faces were being integrated with contextual threat information. The P3 component exhibited enhanced brain activity in response to faces preceded by happy and fearful eyes compared with neutral eyes, indicating that motivated attention processing may be involved at this stage. Altogether, these results indicate for the first time that the influence of isolated eye regions on the perception of surprised faces involves preferential processing at the early stages and elaborate processing at the late stages. Moreover, higher cognitive processes such as prediction and attention can modulate face processing from the earliest stages in a top-down manner. © 2017 Society for Psychophysiological Research.

  7. Faces of Shame: Implications for Self-Esteem, Emotion Regulation, Aggression, and Well-Being.

    Science.gov (United States)

    Velotti, Patrizia; Garofalo, Carlo; Bottazzi, Federica; Caretti, Vincenzo

    2017-02-17

    There is an increasing interest in psychological research on shame experiences and their associations with other aspects of psychological functioning and well-being, as well as with possible maladaptive outcomes. In an attempt to confirm and extend previous knowledge on this topic, we investigated the nomological network of shame experiences in a large community sample (N = 380; 66.1% females), adopting a multidimensional conceptualization of shame. Females reported higher levels of shame (in particular, bodily and behavioral shame), guilt, psychological distress, emotional reappraisal, and hostility. Males had higher levels of self-esteem, emotional suppression, and physical aggression. Shame feelings were associated with low self-esteem, hostility, and psychological distress in a consistent way across gender. Associations between characterological shame and emotional suppression, as well as between bodily shame and anger occurred only among females. Moreover, characterological and bodily shame added to the prediction of low self-esteem, hostility, and psychological distress above and beyond the influence of trait shame. Finally, among females, emotional suppression mediated the influence of characterological shame on hostility and psychological distress. These findings extend current knowledge on the nomological net surrounding shame experiences in everyday life, supporting the added value of a multidimensional conceptualization of shame feelings.

  8. Modulation of Attentional Blink with Emotional Faces in Typical Development and in Autism Spectrum Disorders

    Science.gov (United States)

    Yerys, Benjamin E.; Ruiz, Ericka; Strang, John; Sokoloff, Jennifer; Kenworthy, Lauren; Vaidya, Chandan J.

    2013-01-01

    Background: The attentional blink (AB) phenomenon was used to assess the effect of emotional information on early visual attention in typically developing (TD) children and children with autism spectrum disorders (ASD). The AB effect is the momentary perceptual unawareness that follows target identification in a rapid serial visual processing…

  9. Recognizing emotions in faces : effects of acute tryptophan depletion and bright light

    NARCIS (Netherlands)

    aan het Rot, Marije; Coupland, Nicholas; Boivin, Diane B.; Benkelfat, Chawki; Young, Simon N.

    2010-01-01

    In healthy never-depressed individuals, acute tryptophan depletion (ATD) may selectively decrease the accurate recognition of fearful facial expressions. Here we investigated the perception of facial emotions after ATD in more detail. We also investigated whether bright light, which can reverse

  10. Processing of masked and unmasked emotional faces under different attentional conditions: an electrophysiological investigation.

    Directory of Open Access Journals (Sweden)

    Marzia Del Zotto

    2015-10-01

    In order to investigate the interactions between non-spatial selective attention, awareness and emotion processing, we carried out an ERP study using a backward masking paradigm, in which angry, fearful, happy and neutral facial expressions were presented, while participants attempted to detect the presence of one or the other category of facial expressions in the different experimental blocks. ERP results showed that negative emotions enhanced an early N170 response over temporal-occipital leads in both masked and unmasked conditions, independently of selective attention. A later effect arising at the P2 was linked to awareness. Finally, selective attention was found to affect the N2 and N3 components over occipito-parietal leads. Our findings reveal that (i) the initial processing of facial expressions arises prior to attention and awareness; (ii) attention and awareness give rise to temporally distinct periods of activation independently of the type of emotion, with only a partial degree of overlap; and (iii) selective attention appears to be influenced by the emotional nature of the stimuli, which in turn impinges on unconscious processing at a very early stage. This study confirms previous reports that negative facial expressions can be processed rapidly, in the absence of visual awareness and independently of selective attention. On the other hand, attention and awareness may operate in a synergistic way, depending on task demands.

  11. State-Dependent Differences in Emotion Regulation Between Unmedicated Bipolar Disorder and Major Depressive Disorder

    NARCIS (Netherlands)

    Rive, M.M.; Mocking, R.J.T.; Koeter, M.W.; Wingen, G. van; Wit, S.J. de; Heuvel, O.A. van den; Veltman, D.J.; Ruhe, H.G.; Schene, A.H.

    2015-01-01

    IMPORTANCE: Major depressive disorder (MDD) and bipolar disorder (BD) are difficult to distinguish clinically during the depressed or remitted states. Both mood disorders are characterized by emotion regulation disturbances; however, little is known about emotion regulation differences between MDD

  14. A dimensional approach to determine common and specific neurofunctional markers for depression and social anxiety during emotional face processing.

    Science.gov (United States)

    Luo, Lizhu; Becker, Benjamin; Zheng, Xiaoxiao; Zhao, Zhiying; Xu, Xiaolei; Zhou, Feng; Wang, Jiaojian; Kou, Juan; Dai, Jing; Kendrick, Keith M

    2018-02-01

    Major depressive disorder (MDD) and anxiety disorders are both prevalent and debilitating. High rates of comorbidity between MDD and social anxiety disorder (SAD) suggest common pathological pathways, including aberrant neural processing of interpersonal signals. In patient populations, the determination of common and distinct neurofunctional markers of MDD and SAD is often hampered by confounding factors, such as generally elevated anxiety levels and disorder-specific brain structural alterations. This study employed a dimensional disorder approach to map neurofunctional markers associated with levels of depression and social anxiety symptoms in a cohort of 91 healthy subjects using an emotional face processing paradigm. Examining linear associations between levels of depression and social anxiety, while controlling for trait anxiety, revealed that both were associated with exaggerated dorsal striatal reactivity to fearful and sad expression faces, respectively. Exploratory analysis revealed that depression scores were positively correlated with dorsal striatal functional connectivity during processing of fearful faces, whereas those of social anxiety showed a negative association during processing of sad faces. No linear relationships between levels of depression and social anxiety were observed during a facial-identity matching task or with brain structure. Together, the present findings indicate that dorsal striatal neurofunctional alterations might underlie aberrant interpersonal processing associated with both increased levels of depression and social anxiety. © 2017 Wiley Periodicals, Inc.

  15. The Influence of Emotional State and Pictorial Cues on Perceptual Judgments

    Energy Technology Data Exchange (ETDEWEB)

    Kimberly R. Raddatz; Abigail Werth; Tuan Q. Tran

    2007-10-01

    Perspective displays (e.g., CDTI) are commonly used as decision aids in environments characterized by periods of high emotional arousal (e.g., terrain enhanced primary flight displays). However, little attention has been devoted to understanding how emotional state, independently or in conjunction with other perceptual factors (e.g., pictorial depth cues), can impact perceptual judgments. Preliminary research suggests that induced emotional state (positive or negative) adversely impacts size comparisons in perspective displays (Tran & Raddatz, 2006). This study further investigated how size comparisons are affected by emotional state and pictorial depth cues while attenuating the limitations of the Tran & Raddatz (2006) study. Results confirmed that observers do make slower judgments under induced emotional state. However, observers under negative emotional state showed higher sensitivity (d’) and required more evidence to respond that a size difference exists (response bias) than observers under positive emotional state. Implications for display design and human performance are discussed.
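    The sensitivity (d') and response-bias measures mentioned in this abstract come from signal detection theory: d' is the difference of the z-transformed hit and false-alarm rates, and the criterion c indexes how much evidence an observer requires before responding. A minimal sketch, using made-up rates rather than the study's data:

```python
# Signal detection sketch (illustrative values, not the study's data):
# sensitivity (d') and response bias (criterion c) from hit and
# false-alarm rates in a yes/no size-difference judgment.
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    """Return (d', c); c > 0 indicates a conservative bias, i.e. more
    evidence is required before responding that a size difference exists."""
    z = NormalDist().inv_cdf  # inverse standard normal CDF (z-transform)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical observer: fairly sensitive and somewhat conservative
d, c = dprime_and_criterion(hit_rate=0.80, fa_rate=0.10)
```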

  16. vMMN for schematic faces: automatic detection of change in emotional expression

    Directory of Open Access Journals (Sweden)

    Kairi Kreegipuu

    2013-10-01

    Our brain is able to automatically detect changes in sensory stimulation, including in vision. A large variety of feature changes in stimulation elicit a deviance-reflecting ERP component known as the mismatch negativity (MMN). The present study has three main goals: (1) to register the vMMN using a rapidly presented stream of schematic faces (neutral, happy, angry; adapted from Öhman et al., 2001); (2) to compare the vMMNs elicited by angry and happy schematic faces in two different paradigms, a traditional oddball design with frequent standard and rare target and deviant stimuli (12.5% each) and a version of the optimal multi-feature paradigm with several deviant stimuli (altogether 37.5% of the stimulus block); and (3) to compare vMMNs to subjective ratings of valence, arousal and attention capture for happy and angry schematic faces, i.e., to estimate the effect of the affective value of stimuli on their automatic detection. Eleven observers (19-32 years, 6 women) took part in both experiments, an oddball and an optimum paradigm. Stimuli were rapidly presented schematic faces and an object with face features that served as the target stimulus, to be detected by a button-press. Results show that a vMMN-type response at posterior sites was equally elicited in both experiments. Post-experimental reports confirmed that the angry face attracted more automatic attention than the happy face, but the difference did not emerge directly at the ERP level. Thus, when interested in studying change detection in facial expressions, we encourage the use of the optimum (multi-feature) design in order to save time and other experimental resources.

  17. It Is Not Just in Faces! Processing of Emotion and Intention from Biological Motion in Psychiatric Disorders

    Directory of Open Access Journals (Sweden)

    Łukasz Okruszek

    2018-02-01

    Social neuroscience offers a wide range of techniques that may be applied to study the social cognitive deficits that may underlie reduced social functioning, a common feature across many psychiatric disorders. At the same time, a significant proportion of research in this area has been conducted using paradigms that utilize static displays of faces or eyes. The use of point-light displays (PLDs) offers a viable alternative for studying recognition of emotion or intention inference while minimizing the amount of information presented to participants. This mini-review aims to summarize studies that have used PLDs to study emotion and intention processing in schizophrenia (SCZ), affective disorders, anxiety and personality disorders, eating disorders and neurodegenerative disorders. Two main conclusions can be drawn from the reviewed studies: first, the social cognitive problems found in most of the psychiatric samples using PLDs were of smaller magnitude than those found in studies presenting social information using faces or voices. Second, even though the information presented in PLDs is extremely limited, presentation of these types of stimuli is sufficient to elicit the disorder-specific social cognitive biases (e.g., mood-congruent bias in depression, increased threat perception in anxious individuals, aberrant body size perception in eating disorders) documented using other methodologies. Taken together, these findings suggest that point-light stimuli may be a useful method of studying social information processing in psychiatry. At the same time, some limitations of using this methodology are also outlined.

  18. Assessing positive emotional states in dogs using heart rate and heart rate variability.

    Science.gov (United States)

    Zupan, Manja; Buskas, Julia; Altimiras, Jordi; Keeling, Linda J

    2016-03-01

    Since most animal species have been recognized as sentient beings, emotional state may be a good indicator of welfare in animals. The goal of this study was to manipulate the environment of nine beagle research dogs to highlight physiological responses indicative of different emotional experiences. Stimuli were selected to be a more or a less positive food (meatball or food pellet) or social reward (familiar person or less familiar person). That all the stimuli were positive and of different reward value was confirmed in a runway motivation test. Dogs were tested individually while standing facing a display theatre where the different stimuli could be shown by lifting a shutter. The dogs approached and remained voluntarily in the test system. They were tested in four sessions (of 20s each) for each of the four stimuli. A test session consisted of four presentation phases (1st exposure to stimulus, post exposure, 2nd exposure, and access to reward). Heart rate (HR) and heart rate variability (HRV) responses were recorded during testing in the experimental room and also when lying resting in a quiet familiar room. A new method of 'stitching' short periods of HRV data together was used in the analysis. When testing different stimuli, no significant differences were observed in HR and LF:HF ratio (relative power in low frequency (LF) and the high-frequency (HF) range), implying that the sympathetic tone was activated similarly for all the stimuli and may suggest that dogs were in a state of positive arousal. A decrease of HF was associated with the meatball stimulus compared to the food pellet and the reward phase (interacting with the person or eating the food) was associated with a decrease in HF and RMSSD (root mean square of successive differences of inter-beat intervals) compared to the preceding phase (looking at the person or food). This suggests that parasympathetic deactivation is associated with a more positive emotional state in the dog. 
A similar reduction
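    Of the HRV measures named in this abstract, RMSSD has a simple closed form: the root mean square of the successive differences between consecutive inter-beat (RR) intervals. A minimal sketch, with made-up RR values rather than the study's recordings:

```python
# RMSSD sketch (illustrative values, not the study's recordings): root mean
# square of successive differences of inter-beat (RR) intervals, a common
# time-domain index of parasympathetic (vagal) activity.
import math

def rmssd(rr_intervals_ms):
    """RMSSD (ms) over a sequence of consecutive RR intervals in ms."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [810.0, 790.0, 820.0, 805.0, 795.0]  # hypothetical beat-to-beat intervals
value = rmssd(rr)  # lower values suggest reduced parasympathetic activity
```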

  19. Fusiform gyrus dysfunction is associated with perceptual processing efficiency to emotional faces in adolescent depression: a model-based approach

    Directory of Open Access Journals (Sweden)

    Tiffany Cheing Ho

    2016-02-01

    While the extant literature has focused on major depressive disorder (MDD) as being characterized by abnormalities in processing affective stimuli (e.g., facial expressions), little is known regarding which specific aspects of cognition influence the evaluation of affective stimuli, and what the underlying neural correlates are. To investigate these issues, we assessed 26 adolescents diagnosed with MDD and 37 well-matched healthy controls (HCL) who completed an emotion identification task of dynamically morphing faces during functional magnetic resonance imaging (fMRI). We analyzed the behavioral data using a sequential sampling model of response time (RT) commonly used to elucidate aspects of cognition in binary perceptual decision-making tasks: the Linear Ballistic Accumulator (LBA) model. Using a hierarchical Bayesian estimation method, we obtained group-level and individual-level estimates of LBA parameters on the facial emotion identification task. While the MDD and HCL groups did not differ in mean RT, accuracy, or group-level estimates of perceptual processing efficiency (i.e., the drift-rate parameter of the LBA), the MDD group showed significantly reduced responses in the left fusiform gyrus compared to the HCL group during the facial emotion identification task. Furthermore, within the MDD group, fMRI signal in the left fusiform gyrus during affective face processing was significantly associated with greater individual-level estimates of perceptual processing efficiency. Our results therefore suggest that affective processing biases in adolescents with MDD are characterized by greater perceptual processing efficiency of affective visual information in sensory brain regions responsible for the early processing of visual information. The theoretical, methodological, and clinical implications of our results are discussed.
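    The LBA model referenced here treats each response option as a linear accumulator racing to a threshold, with the drift rate indexing perceptual processing efficiency. A minimal forward simulation of a two-choice LBA might look like the following; all parameter values are illustrative assumptions, not the paper's hierarchical Bayesian estimates:

```python
# Minimal two-choice Linear Ballistic Accumulator (LBA) simulation.
# Illustrative parameters (not the study's estimates): mean drifts v,
# threshold b, start-point range A, drift noise s, non-decision time t0.
import random

def lba_trial(drifts=(1.0, 0.7), b=1.2, A=0.5, s=0.3, t0=0.2):
    """Simulate one trial; return (choice index, response time in seconds).

    Each accumulator starts at k ~ Uniform(0, A), draws a trial-specific
    drift d ~ Normal(v, s), and reaches threshold b after (b - k) / d
    seconds; the fastest accumulator wins, and t0 adds non-decision time.
    """
    finish_times = []
    for v in drifts:
        d = random.gauss(v, s)
        while d <= 0:          # resample non-positive drifts for simplicity
            d = random.gauss(v, s)
        k = random.uniform(0.0, A)
        finish_times.append((b - k) / d)
    choice = finish_times.index(min(finish_times))
    return choice, t0 + finish_times[choice]

random.seed(0)
trials = [lba_trial() for _ in range(2000)]
# Accumulator 0 has the higher mean drift ("processing efficiency"),
# so it should win the race on most trials.
accuracy = sum(choice == 0 for choice, _ in trials) / len(trials)
```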

  20. Shall I call, text, post it online or just tell it face-to-face? How and why Flemish adolescents choose to share their emotions on- or offline

    OpenAIRE

    Vermeulen, Anne; Vandebosch, Heidi; Heirman, Wannes

    2017-01-01

    Abstract: Social sharing of emotions is a frequently used emotion regulation strategy. This study adds to the emotion regulation literature and the affordances-of-technologies perspective by providing a better understanding of with whom adolescents share emotions on- and offline, how they do this, and why they use certain modes. In-depth interviews with 22 Flemish adolescents (aged 14-18) show that these youngsters share almost all experienced emotions, often with multiple recipients and using ...

  1. Social Anxiety Under Load: The Effects of Perceptual Load in Processing Emotional Faces

    Directory of Open Access Journals (Sweden)

    Sandra Cristina Soares

    2015-04-01

    Previous studies in the social anxiety arena have shown an impaired attentional control system, similar to that found in trait anxiety. However, the effect of task demands on social anxiety in the processing of socially threatening stimuli, such as angry faces, remains unexamined. In the present study, fifty-four university students scoring high and low on the Social Interaction and Performance Anxiety and Avoidance Scale (SIPAAS) questionnaire participated in a target-letter discrimination task while task-irrelevant face stimuli (angry, disgust, happy, and neutral) were simultaneously presented. The results showed that high (compared to low) socially anxious individuals were more prone to distraction by task-irrelevant stimuli, particularly under high perceptual load conditions. More importantly, for such individuals, the accuracy proportions for angry faces significantly differed between the low and high perceptual load conditions, which is discussed in light of current evolutionary models of social anxiety.

  3. Cultural affordances and emotional experience: socially engaging and disengaging emotions in Japan and the United States.

    Science.gov (United States)

    Kitayama, Shinobu; Mesquita, Batja; Karasawa, Mayumi

    2006-11-01

    The authors hypothesized that whereas Japanese culture encourages socially engaging emotions (e.g., friendly feelings and guilt), North American culture fosters socially disengaging emotions (e.g., pride and anger). In two cross-cultural studies, the authors measured engaging and disengaging emotions repeatedly over different social situations and found support for this hypothesis. As predicted, Japanese showed a pervasive tendency to reportedly experience engaging emotions more strongly than they experienced disengaging emotions, but Americans showed a reversed tendency. Moreover, as also predicted, Japanese subjective well-being (i.e., the experience of general positive feelings) was more closely associated with the experience of engaging positive emotions than with that of disengaging emotions. Americans tended to show the reversed pattern. The established cultural differences in the patterns of emotion suggest the consistent and systematic cultural shaping of emotion over time.

  4. Writ Large on Your Face: Observing Emotions Using Automatic Facial Analysis

    Directory of Open Access Journals (Sweden)

    Dieckmann, Anja

    2014-05-01

    Emotions affect all of our daily decisions and, of course, they also influence our evaluations of brands, products and advertisements. But what exactly do consumers feel when they watch a TV commercial, visit a website or interact with a brand in different ways? Measuring such emotions is not an easy task. In the past, the effectiveness of marketing material was evaluated mostly by subsequent surveys. Now, with the emergence of neuroscientific approaches like EEG, the measurement of real-time reactions is possible, for instance while watching a commercial. However, most neuroscientific procedures are fairly invasive and irritating. For an EEG, for instance, numerous electrodes need to be placed on the participant's scalp. Furthermore, data analysis is highly complex. Scientific expertise is necessary for interpretation, so the procedure remains a black box to most practitioners and the results are still rather controversial. By contrast, automatic facial analysis provides similar information without having to wire up study participants. In addition, the results of such analyses are intuitive and easy to interpret even for laypeople. These convincing advantages led the GfK company to decide on facial analysis and to develop a tool suitable for measuring emotional responses to marketing stimuli, making it easily applicable in marketing research practice.

  5. Reality, fantasy and emotional state in kindergarten children

    Directory of Open Access Journals (Sweden)

    Alexandra Maftei

    2017-12-01

    The aim of the present research was to study children's ability to distinguish reality from fantasy based on individual differences in age, as well as on the valence generated by an event. Moreover, we explored the differences in emotional states generated by each type of stimuli, for each age category. A sample of 120 children from an urban kindergarten participated in the study. Each child was asked to respond to a set of questions after seeing eight pictures differing in valence. The results revealed that preschool children aged 6 have a stronger ability to distinguish reality from fantasy, for each type of stimuli, compared to children aged 4 and 5. Moreover, the participants associated negative stimuli, both real and fantastic, with a higher level of negative affective state, compared to positive images. The results are discussed from a sociodevelopmental perspective.

  6. The United States facing their petroleum dependence; Les Etats-Unis face a leur dependance petroliere

    Energy Technology Data Exchange (ETDEWEB)

    Noel, P. [Institut francais des Relations Internationals, 75 - Paris (France); Universite Pierre Mendes-France-IEPE-CNRS, 38 - Grenoble (France)

    2002-06-01

    In the framework of ''the energy crisis of 2000-2001'', the Cheney report and petroleum dependence, this study presents a critical examination of the United States' petroleum situation, its perception in American political circles, and the public policies implemented during the last ten years. The first section is devoted to petroleum supply. In the second section, American petroleum policy and energy security are studied. (A.L.B.)

  7. Building Biases in Infancy: The Influence of Race on Face and Voice Emotion Matching

    Science.gov (United States)

    Vogel, Margaret; Monesson, Alexandra; Scott, Lisa S.

    2012-01-01

    Early in the first year of life infants exhibit equivalent performance distinguishing among people within their own race and within other races. However, with development and experience, their face recognition skills become tuned to groups of people they interact with the most. This developmental tuning is hypothesized to be the origin of adult…

  8. Attention Bias to Emotional Faces Varies by IQ and Anxiety in Williams Syndrome

    Science.gov (United States)

    McGrath, Lauren M.; Oates, Joyce M.; Dai, Yael G.; Dodd, Helen F.; Waxler, Jessica; Clements, Caitlin C.; Weill, Sydney; Hoffnagle, Alison; Anderson, Erin; MacRae, Rebecca; Mullett, Jennifer; McDougle, Christopher J.; Pober, Barbara R.; Smoller, Jordan W.

    2016-01-01

    Individuals with Williams syndrome (WS) often experience significant anxiety. A promising approach to anxiety intervention has emerged from cognitive studies of attention bias to threat. To investigate the utility of this intervention in WS, this study examined attention bias to happy and angry faces in individuals with WS (N = 46). Results showed…

  9. Face Emotion Processing in Depressed Children and Adolescents with and without Comorbid Conduct Disorder

    Science.gov (United States)

    Schepman, Karen; Taylor, Eric; Collishaw, Stephan; Fombonne, Eric

    2012-01-01

    Studies of adults with depression point to characteristic neurocognitive deficits, including differences in processing facial expressions. Few studies have examined face processing in juvenile depression, or taken account of other comorbid disorders. Three groups were compared: depressed children and adolescents with conduct disorder (n = 23),…

  10. Priming Facial Gender and Emotional Valence: The Influence of Spatial Frequency on Face Perception in ASD

    Science.gov (United States)

    Vanmarcke, Steven; Wagemans, Johan

    2017-01-01

    Adolescents with and without autism spectrum disorder (ASD) performed two priming experiments in which they implicitly processed a prime stimulus, containing high and/or low spatial frequency information, and then explicitly categorized a target face either as male/female (gender task) or as positive/negative (Valence task). Adolescents with ASD…

  11. Confronting the Challenges Facing the Nigerian Nation State: Using ...

    African Journals Online (AJOL)

    This paper explains the concepts of the nation-state and sports. It critically examines some of the challenges characterizing the Nigerian nation-state, such as ethnicity, corruption, the politics of godfatherism, tribalism, language differences, and religious crises, among others. It emphasizes the use of sports for creating friendship ...

  12. Hemispheric contributions to the processing of emotion in chimeric faces : behavioural and electrophysiological evidence

    OpenAIRE

    Geiger, Anja

    2005-01-01

    The face in general, and facial expressions in particular, have always been a focus of sociological, biological and psychological interest, with numerous different scientific approaches and objectives investigating the expression and perception of facial affect, or the social interaction of transmitting facial information between poser and perceiver. This study, dealing with two different aspects of facial expression, namely the intensity and efficiency of facial expression, focuses on hemispheric c...

  14. The telltale face: possible mechanisms behind defector and cooperator recognition revealed by emotional facial expression metrics.

    Science.gov (United States)

    Kovács-Bálint, Zsófia; Bereczkei, Tamás; Hernádi, István

    2013-11-01

    In this study, we investigated the role of facial cues in cooperator and defector recognition. First, a face image database was constructed from pairs of full-face portraits of target subjects taken at the moment of decision-making in a prisoner's dilemma game (PDG) and in a preceding neutral task. Image pairs with no deficiencies (n = 67) were standardized for orientation and luminance. Then, confidence in defector and cooperator recognition was tested with image rating in a different group of lay judges (n = 62). Results indicate that (1) defectors were better recognized (58% vs. 47%), (2) they looked different from cooperators, (3) ratings were biased towards the cooperator category (p < .01), and (4) females were more confident in detecting defectors (p < .05). According to facial microexpression analysis, defection was strongly linked with depressed lower lips and less opened eyes. A significant correlation was found between the intensity of micromimics and the rating of images in the cooperator-defector dimension. In summary, facial expressions can be considered reliable indicators of momentary social dispositions in the PDG. Females may exhibit an evolutionarily based overestimation bias in detecting social visual cues of the defector face. © 2012 The British Psychological Society.

  15. Does a single session of electroconvulsive therapy alter the neural response to emotional faces in depression? A randomised sham-controlled functional magnetic resonance imaging study

    DEFF Research Database (Denmark)

    Miskowiak, Kamilla W; Kessing, Lars V; Ott, Caroline V

    2017-01-01

    … neurocognitive bias in major depressive disorder. Patients with major depressive disorder were randomised to one active (n=15) or sham (n=12) electroconvulsive therapy session. The following day they underwent whole-brain functional magnetic resonance imaging at 3T while viewing emotional faces and performed facial expression recognition and dot-probe tasks. A single electroconvulsive therapy session had no effect on amygdala response to emotional faces. Whole-brain analysis revealed no effects of electroconvulsive therapy versus sham therapy after family-wise error correction at the cluster level, using a cluster … to faces after a single electroconvulsive therapy session, the observed trend changes after a single electroconvulsive therapy session point to an early shift in emotional processing that may contribute to the antidepressant effects of electroconvulsive therapy.

  16. Problems Faced by Mexican Asylum Seekers in the United States

    OpenAIRE

    J. Anna Cabot

    2014-01-01

    Violence in Mexico rose sharply in response to President Felipe Calderón’s military campaign against drug cartels which began in late 2006. As a consequence, the number of Mexicans who have sought asylum in the United States has grown significantly. In 2013, Mexicans made up the second largest group of defensive asylum seekers (those in removal proceedings) in the United States, behind only China (EOIR 2014b). Yet between 2008 and 2013, the grant rate for Mexican asylum seekers in immigration...

  17. Inducing and assessing differentiated emotion-feeling states in the laboratory.

    Science.gov (United States)

    Philippot, P

    1993-03-01

    Two questions are addressed. The first question pertains to the capacity of film segments to induce emotional states that are: (a) as comparable as possible to naturally occurring emotions; (b) similar across individuals; and (c) clearly differentiated across the intended emotions. The second question concerns the discriminant capacity of self-report questionnaires of emotion-feeling states differing in their theoretical assumptions. Subjects viewed six short film segments and rated the strength of their responses on one of three kinds of questionnaires. The questionnaires were: (1) the Differential Emotions Scale that postulates category-based distinctions between emotions; (2) the Semantic Differential that postulates that emotions are distinguished along bipolar dimensions; and (3) free labelling of their feelings by the subjects (control condition with no theoretical a priori). Overall, results indicate that film segments can elicit a diversity of predictable emotions, in the same way, in a majority of individuals. In the present procedure, the Differential Emotions Scale yielded a better discrimination between emotional states than the Semantic Differential. Implications for emotion research and theories of the cognitive structure of emotion are discussed.

  18. Gulf States Strategic Vision to Face Iranian Nuclear Project

    Science.gov (United States)

    2015-09-01

plant of Kalaiya Landmarks in Iran (Maalem Kalaiya) looked like a conference and training center the size of a small hotel, and that the cyclotron given...nuclear reactor agreements with Iran. Moreover, the United States pressured Britain, France, Argentina, Brazil, and India not to deal with Iran in

  19. The State of Hispanic Health, 1992. Facing the Facts.

    Science.gov (United States)

    ASPIRA Association, Inc., Washington, DC. National Office.

    This publication offers an overview of the health of Hispanic Americans in the United States. Topics covered include the following: (1) Hispanic representation in health fields; (2) access to health care; (3) maternal and child health; (4) substance abuse; (5) Acquired Immune Deficiency Syndrome and Hispanics; (6) Hispanic elderly; (7) migrant…

  20. The emotional context facing nursing home residents' families: a call for role reinforcement strategies from nursing homes and the community.

    Science.gov (United States)

    Bern-Klug, Mercedes

    2008-01-01

Identify useful concepts related to the emotional context facing family members of nursing home residents. These concepts can be used in future studies to design and test interventions that benefit family caregivers. Secondary data analyses of qualitative ethnographic data. Two nursing homes in a large Midwestern city; 8 months of data collection in each. 44 family members of nursing home residents whose health was considered "declining." Role theory was used to design and help interpret the findings. Data included transcripts of conversations between family members and researchers and were analyzed using a coding scheme developed for the secondary analysis. Comments about emotions related to the social role of family member were grouped into three categories: relief related to admission, stress, and decision making support/stress. Subcategories of stress include the role strain associated with "competing concerns" and the psychological pressures of 1) witnessing the decline of a loved one in a nursing home, and 2) guilt about placement. Decision-making was discussed as a challenge that family members did not want to face alone; support from the resident, health care professionals, and other family members was appreciated. Family members may benefit from role reinforcement activities provided by nursing home staff and community members. All nursing home staff members (in particular social workers) and physicians are called upon to provide education and support regarding nursing home admissions, during the decline of the resident, and especially regarding medical decision-making. Community groups are asked to support the family member by offering assistance with concrete tasks (driving, visiting, etc.) and social support.

  1. Neural Correlates of Task-Irrelevant First and Second Language Emotion Words — Evidence from the Face-Word Stroop Task

    Directory of Open Access Journals (Sweden)

    Lin Fan

    2016-11-01

Emotionally valenced words have thus far not been empirically examined in a bilingual population with the emotional face-word Stroop paradigm. Chinese-English bilinguals were asked to identify the facial expressions of emotion with their first-language (L1) or second-language (L2) task-irrelevant emotion words superimposed on the face pictures. We attempted to examine how the emotional content of words modulates behavioral performance and cerebral functioning in the bilinguals' two languages. The results indicated that there were significant congruency effects for both L1 and L2 emotion words, and that identifiable differences in the magnitude of the Stroop effect between the two languages were also observed, suggesting that L1 is more capable of activating the emotional response to word stimuli. For the event-related potential (ERP) data, an N350-550 effect was observed only in the L1 task, with greater negativity for incongruent than congruent trials. The size of the N350-550 effect differed across languages, whereas no identifiable language distinction was observed in the effect of the conflict slow potential (conflict SP). Finally, a more pronounced negative amplitude at 230-330 ms was observed in L1 than in L2, but only for incongruent trials. This negativity, likened to an orthographic decoding N250, may reflect the extent of attention to emotion word processing at the word-form level, while the N350-550 reflects a complicated set of processes in conflict processing. Overall, the face-word congruency effect reflected an identifiable language distinction at 230-330 and 350-550 ms, which provides supporting evidence for theoretical proposals assuming attenuated emotionality of L2 processing.
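    The congruency effect described above amounts to a difference in mean ERP amplitude between incongruent and congruent trials within a fixed time window. A minimal sketch, using simulated single-channel data (all variable names and values are illustrative, not from the study):

    ```python
    import numpy as np

    def mean_window_amplitude(erp, times, t_start, t_end):
        """Mean amplitude of an ERP waveform within [t_start, t_end) ms."""
        mask = (times >= t_start) & (times < t_end)
        return float(erp[mask].mean())

    # Simulated single-channel ERPs sampled at 1 kHz, -100 to 700 ms post-stimulus.
    times = np.arange(-100, 700)
    rng = np.random.default_rng(0)
    congruent = rng.normal(0.0, 0.1, times.size)
    incongruent = rng.normal(0.0, 0.1, times.size)
    # Inject a more negative-going deflection on incongruent trials in the
    # 350-550 ms window, mimicking the N350-550 effect described above.
    incongruent[(times >= 350) & (times < 550)] -= 2.0

    effect = (mean_window_amplitude(incongruent, times, 350, 550)
              - mean_window_amplitude(congruent, times, 350, 550))
    # effect is negative: incongruent trials are more negative in the window.
    ```

    In a real analysis the window means would be computed per participant and condition and then compared statistically; this sketch only shows the windowed-mean computation itself.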

  2. State Health Committees Facing the Phenomenon of Health Judicialization

    Directory of Open Access Journals (Sweden)

    Homero Lamarão Neto

    2016-12-01

The search for consensual methods of conflict resolution is little explored in claims involving the public sector. The State Health Committees, created by determination of the CNJ with the notable goal of consensual resolution of public health issues, take dialogue and the academic discussion of evidence-based medicine as guidelines for a bold stance on securing rights, innovating the judiciary's behavior in confronting the phenomenon of the judicialization of health.

  3. EOM July FY2011 - Face Amount of Life Insurance Coverage by Program by State

    Data.gov (United States)

    Department of Veterans Affairs — Face value of insurance for each administered life insurance program listed by state. Data is current as of 7-31-11. All programs are closed to new issues except for...

  4. FY11_EOM_Oct_Face Amount of Life Insurance Coverage by Program by State

    Data.gov (United States)

    Department of Veterans Affairs — Face value of insurance for each administered life insurance program listed by state. Data is current as of 10-31-11. All programs are closed to new issues except...

  5. FY11_EOM_August_Face Amount of Life Insurance Coverage by Program by State

    Data.gov (United States)

    Department of Veterans Affairs — Face value of insurance for each administered life insurance program listed by state. Data is current as of 8-31-11. All programs are closed to new issues except for...

  6. Differential effects of face-realism and emotion on event-related brain potentials and their implications for the uncanny valley theory

    Science.gov (United States)

    Schindler, Sebastian; Zell, Eduard; Botsch, Mario; Kissler, Johanna

    2017-03-01

Cartoon characters are omnipresent in popular media. While few studies have scientifically investigated their processing, in computer graphics, efforts are made to increase realism. Yet, close approximations of reality have been suggested to sometimes evoke a feeling of eeriness, the “uncanny valley” effect. Here, we used high-density electroencephalography to investigate brain responses to professionally stylized happy, angry, and neutral character faces. We employed six face-stylization levels varying from abstract to realistic and investigated the N170, early posterior negativity (EPN), and late positive potential (LPP) event-related components. The face-specific N170 showed a u-shaped modulation, with stronger reactions towards both the most abstract and the most realistic faces compared to medium-stylized faces. For abstract faces, the N170 was generated more occipitally than for real faces, implying stronger reliance on structural processing. Although emotional faces elicited the highest amplitudes on both the N170 and EPN, realism and expression interacted on the N170. Finally, the LPP increased linearly with face realism, reflecting an activity increase in visual and parietal cortex for more realistic faces. Results reveal differential effects of face stylization on distinct face-processing stages and suggest a perceptual basis to the uncanny valley hypothesis. They are discussed in relation to face perception, media design, and computer graphics.
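    A u-shaped modulation like the N170 pattern described above can be checked, for instance, by fitting a quadratic trend across stylization levels. A hedged sketch (the amplitudes are invented for illustration; this is not the authors' actual analysis):

    ```python
    import numpy as np

    # Invented N170 mean amplitudes (µV) for six stylization levels,
    # from most abstract (1) to most realistic (6).
    levels = np.arange(1, 7, dtype=float)
    n170_uv = np.array([-6.0, -4.8, -4.2, -4.1, -4.9, -5.8])

    # A u-shaped modulation of response strength means the amplitude
    # *magnitude* is largest at the extremes; a positive quadratic
    # coefficient in a second-order polynomial fit captures that.
    magnitude = -n170_uv
    quad_coef = np.polyfit(levels, magnitude, 2)[0]
    # quad_coef > 0 indicates the u-shape (stronger N170 at both extremes).
    ```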

  8. State and Employer: CERN faced with its responsibilities (final part)

    CERN Multimedia

    Association du personnel

    2009-01-01

    “The Organization must assume its responsibilities without reservation in both of its roles: as a State for our social security system (pensions, health insurance) and as an Employer (salaries, careers, contracts, etc.).” This is the key message passed on to you at the staff meetings last month. Our previous two editorials presented in detail the pensions and health insurance issues and the preparation for the 2010 five-yearly review. In this third part we talk about the contract policy and MARS.

  9. The Changing Face of Tobacco Use Among United States Youth

    OpenAIRE

    Lauterstein, Dana; Hoshino, Risa; Gordon, Terry; Watkins, Beverly-Xaviera; Weitzman, Michael; Zelikoff, Judith

    2014-01-01

    Tobacco use, primarily in the form of cigarettes, is the leading cause of preventable morbidity and mortality in the United States (U.S.). The adverse effects of tobacco use began to be recognized in the 1940’s and new hazards of active smoking and secondhand smoke (SHS) exposure from cigarettes continue to be identified to this day. This has led to a sustained and wide-ranging array of highly effective regulatory, public health, and clinical efforts that have been informed by extensive scien...

  10. Maternal Mental State Language and Preschool Children's Attachment Security: Relation to Children's Mental State Language and Expressions of Emotional Understanding

    Science.gov (United States)

    Mcquaid, Nancy; Bigelow, Ann E.; McLaughlin, Jessica; MacLean, Kim

    2008-01-01

    Mothers' mental state language in conversation with their preschool children, and children's preschool attachment security were examined for their effects on children's mental state language and expressions of emotional understanding in their conversation. Children discussed an emotionally salient event with their mothers and then relayed the…

  11. South Asian women's coping strategies in the face of domestic violence in the United States.

    Science.gov (United States)

    Bhandari, Shreya

    2018-02-01

    We conducted in-depth telephone interviews with a convenience sample of 20 South Asian women experiencing domestic violence in the United States. Utilizing the emotion-focused and problem-focused coping framework, the researchers analyzed the narratives of abused South Asian women. Emotion-focused coping strategies include (a) spirituality and/or religion and (b) the role of children. Problem-focused coping strategies include (c) informal and formal support and (d) strategies of resisting, pacifying, safety planning. Implications for practice and future research in the United States and internationally are discussed.

  12. Head position and spinal position as determinants of perceived emotional state.

    Science.gov (United States)

    Schouwstra, S J; Hoogstraten, J

    1995-10-01

    A sample of 60 first-year psychology students judged the emotional state of 21 drawn figures and completed the Adjective Checklist and a mood questionnaire. The judgments were affected by the interaction between head position and spinal position of the figure. Each figure was associated with a unique pattern of emotions, and the judgments given were not influenced by the subjects' own emotional state.

  13. The Changing Face of Tobacco Use Among United States Youth

    Science.gov (United States)

    Lauterstein, Dana; Hoshino, Risa; Gordon, Terry; Watkins, Beverly-Xaviera; Weitzman, Michael; Zelikoff, Judith

    2015-01-01

Tobacco use, primarily in the form of cigarettes, is the leading cause of preventable morbidity and mortality in the United States (U.S.). The adverse effects of tobacco use began to be recognized in the 1940s, and new hazards of active smoking and secondhand smoke (SHS) exposure from cigarettes continue to be identified to this day. This has led to a sustained and wide-ranging array of highly effective regulatory, public health, and clinical efforts that have been informed by extensive scientific data, resulting in marked decreases in the use of cigarettes. Unfortunately, the dramatic recent decline in cigarette use in the U.S. has been accompanied by an upsurge in adolescent and young adult use of new, non-cigarette tobacco and nicotine-delivery products, commonly referred to as alternative tobacco products (ATPs). Commonly used ATPs include hookah, cigars, smokeless tobacco, and electronic cigarettes. While there have been a number of review articles that focus on adult ATP use, the purpose of this review is to provide an overview of what is, and is not, known about emerging ATP use among U.S. adolescents on a national scale, as well as to identify gaps in knowledge and discuss future health and policy needs for this growing public health concern. This paper is not meant to systematically review all published survey data, but to present a clear depiction of selected ATP usage in youth populations using national survey data. PMID:25323124

  14. Making Decisions under Ambiguity : Judgment Bias Tasks for Assessing Emotional State in Animals

    NARCIS (Netherlands)

    Roelofs, Sanne|info:eu-repo/dai/nl/413320626; Boleij, Hetty|info:eu-repo/dai/nl/315028815; Nordquist, Rebecca E|info:eu-repo/dai/nl/296303291; van der Staay, Franz Josef|info:eu-repo/dai/nl/074262653

    2016-01-01

    Judgment bias tasks (JBTs) are considered as a family of promising tools in the assessment of emotional states of animals. JBTs provide a cognitive measure of optimism and/or pessimism by recording behavioral responses to ambiguous stimuli. For instance, a negative emotional state is expected to

  15. Mother and Infant Talk about Mental States Relates to Desire Language and Emotion Understanding

    Science.gov (United States)

    Taumoepeau, Mele; Ruffman, Ted

    2006-01-01

This study assessed the relation between mother mental state language and child desire language and emotion understanding in 15- to 24-month-olds. At both time points, mothers described pictures to their infants, and mother talk was coded for mental and nonmental state language. Children were administered 2 emotion understanding tasks and their mental…

  16. Young Children's Reasoning about the Effects of Emotional and Physiological States on Academic Performance

    Science.gov (United States)

    Amsterlaw, Jennifer; Lagattuta, Kristin Hansen; Meltzoff, Andrew N.

    2009-01-01

This study assessed young children's understanding of the effects of emotional and physiological states on cognitive performance. Five-, 6-, and 7-year-olds and adults (N = 96) predicted and explained how children experiencing a variety of physiological and emotional states would perform on academic tasks. Scenarios included: (a) negative and positive…

  17. Be Cool with Academic Stress: The Association between Emotional States and Regulatory Strategies among Chinese Adolescents

    Science.gov (United States)

    Sang, Biao; Pan, Tingting; Deng, Xinmei; Zhao, Xu

    2018-01-01

Numerous studies have suggested that academic stress has a negative impact on adolescents' psychological function; few of those studies, however, considered whether and how the impact of stress on adolescents' emotional states is moderated by corresponding regulation. This study aimed to examine the fluctuation of emotional states before and after…

  18. Emotions

    DEFF Research Database (Denmark)

    Kristensen, Liv Kondrup; Otrel-Cass, Kathrin

    2017-01-01

Observing science classroom activities presents an opportunity to observe the emotional aspect of interactions, and this chapter presents how this can be done and why. Drawing on ideas proposed by French philosopher Maurice Merleau-Ponty, emotions are theorized as publicly embodied enactments, where differences in behavior between people shape emotional responses. Merleau-Ponty's theorization of the body and feelings is connected to embodiment while examining central concepts such as consciousness and perception. Merleau-Ponty describes what he calls the emotional atmosphere and how it shapes the ways we experience events and activities. We use our interpretation of his understanding of emotions to examine an example of a group of year 8 science students who were engaged in a physics activity. Using the analytical framework of analyzing bodily stance by Goodwin, Cekaite, and Goodwin...

  19. Developing an eBook-Integrated High-Fidelity Mobile App Prototype for Promoting Child Motor Skills and Taxonomically Assessing Children's Emotional Responses Using Face and Sound Topology.

    Science.gov (United States)

    Brown, William; Liu, Connie; John, Rita Marie; Ford, Phoebe

    2014-01-01

Developing gross and fine motor skills and expressing complex emotion is critical for child development. We introduce "StorySense", an eBook-integrated mobile app prototype that can sense face and sound topologies and identify movement and expression to promote children's motor skills and emotional development. Currently, most interactive eBooks on mobile devices only leverage "low-motor" interaction (i.e. tapping or swiping). Our app senses a greater breadth of motion (e.g. clapping, snapping, and face tracking), and dynamically alters the storyline according to physical responses in ways that encourage the performance of predetermined motor skills ideal for a child's gross and fine motor development. In addition, our app can capture changes in facial topology, which can later be mapped using the Facial Action Coding System (FACS) for later interpretation of emotion. StorySense expands the human-computer interaction vocabulary for mobile devices. Potential clinical applications include child development, physical therapy, and autism.

  20. [Correlation between psychological state and emotional intelligence in residents of gynecology, and obstetrics].

    Science.gov (United States)

    Carranza-Lira, Sebastián

    2016-01-01

    Emotional intelligence is our capacity to acknowledge our own emotions, and the emotions of other people; it also has to do with the way emotions must be understood, and used productively. Given that an altered state of mind can have an impact on emotional intelligence, our objective was to correlate the psychological state with emotional intelligence in residents of gynecology, and obstetrics. We assessed 76 gynecology and obstetrics residents by using What's my M3 and TMMS-24 instruments, in order to know the influence of psychological state on emotional intelligence. In male students of second grade, there was a positive correlation between obsessive-compulsive disorder (OCD) and emotional attention (EA), and a negative correlation with emotional clarity (EC). In third grade males, anxiety, bipolar disorder, and posttraumatic stress disorder (PTSD) correlated positively with EA. In male students of fourth grade there was a positive correlation between OCD and EA. In second grade female students, depression correlated negatively with emotional repair (ER). In third grade female students anxiety, bipolar disorder, and PTSD correlated positively with EA. In fourth grade female students there was a negative correlation between depression and EA, and between anxiety, OCD, and PTSD with EC. Psychological status has a favorable impact on EA and a negative effect on EC and ER.

  1. Emotion talk in the context of young people self-harming: facing the feelings in family therapy

    OpenAIRE

    Rogers, Alice; Schmidt, Petra

    2016-01-01

This article describes the use of emotion talk in the context of using a manualised approach to family therapy where the presenting problem is self-harm. Whilst we understand that there is an internal aspect to emotion, we also consider emotions to be socially purposeful, culturally constructed and interactional. We found that within the presenting families, negative emotions were often talked about as located within the young person. Through using 'emotion talk' (Fredman, 2004) in deconstruc...

  2. Emotion talk in the context of young people self‐harming: facing the feelings in family therapy

    Science.gov (United States)

    Schmidt, Petra

    2016-01-01

This article describes the use of emotion talk in the context of using a manualised approach to family therapy where the presenting problem is self-harm. Whilst we understand that there is an internal aspect to emotion, we also consider emotions to be socially purposeful, culturally constructed and interactional. We found that within the presenting families, negative emotions were often talked about as located within the young person. Through using 'emotion talk' (Fredman, 2004) in deconstructing and tracking emotions and exploring how emotions connected to family-of-origin and cultural contexts, we developed an interactional understanding of these emotions. This led to better emotional regulation within the family and offered alternative ways of relating. The article discusses the use of relational reflexivity, and using the therapist and team's emotions to enable the therapeutic process, encouraging reflexivity on the self of the therapist in relation to work with emotions. Practitioner points: Emotions can be seen as both a reflection of feelings experienced by the individual and as a communication. An interactional understanding of emotions can be used therapeutically. Therapists should explore emotional displays and track the interactional patterns within the therapeutic system. Therapists should be self-reflexive about ways of doing emotions and use this awareness in practice. PMID:27667879

  4. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    OpenAIRE

    Lili Guan; Yufang Zhao; Yige Wang; Yujie Chen; Juan Yang

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the ...

  5. The Effects of Transient Emotional State and Workload on Size Scaling in Perspective Displays

    Energy Technology Data Exchange (ETDEWEB)

    Tuan Q. Tran; Kimberly R. Raddatz

    2006-10-01

    Previous research has been devoted to the study of perceptual (e.g., number of depth cues) and cognitive (e.g., instructional set) factors that influence veridical size perception in perspective displays. However, considering that perspective displays have utility in high workload environments that often induce high arousal (e.g., aircraft cockpits), the present study sought to examine the effect of observers’ emotional state on the ability to perceive and judge veridical size. Within a dual-task paradigm, observers’ ability to make accurate size judgments was examined under conditions of induced emotional state (positive, negative, neutral) and high and low workload. Results showed that participants in both positive and negative induced emotional states were slower to make accurate size judgments than those not under induced emotional arousal. Results suggest that emotional state is an important factor that influences visual performance on perspective displays and is worthy of further study.

  6. When the face reveals what words do not: facial expressions of emotion, smiling, and the willingness to disclose childhood sexual abuse.

    Science.gov (United States)

    Bonanno, George A; Keltner, Dacher; Noll, Jennie G; Putnam, Frank W; Trickett, Penelope K; LeJeune, Jenna; Anderson, Cameron

    2002-07-01

    For survivors of childhood sexual abuse (CSA), verbal disclosure is often complex and painful. The authors examined the voluntary disclosure-nondisclosure of CSA in relation to nonverbal expressions of emotion in the face. Consistent with hypotheses derived from recent theorizing about the moral nature of emotion, CSA survivors who did not voluntarily disclose CSA showed greater facial expressions of shame, whereas CSA survivors who voluntarily disclosed CSA expressed greater disgust. Expressions of disgust also signaled sexual abuse accompanied by violence. Consistent with recent theorizing about smiling behavior, CSA nondisclosers made more polite smiles, whereas nonabused participants expressed greater genuine positive emotion. Discussion addressed the implications of these findings for the study of disclosure of traumatic events, facial expression, and the links between morality and emotion.

  7. Antecedents of and Reactions to Emotions in the United States and Japan

    OpenAIRE

    Matsumoto, David; Kudoh, Tsutomu; Scherer, Klaus R.; Wallbott, Harald

    1988-01-01

In this study, we examined the degree of cultural similarity and specificity in emotional experience by asking subjects in the United States and Japan to report their experiences and reactions concerning seven different emotions. The data used for this study were part of a larger cross-cultural study of emotion antecedents and reactions involving more than 2,000 subjects in 27 countries (Wallbott & Scherer, 1986). The American-Japanese comparison is a particularly interesting one, given t...

  8. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing.

    Science.gov (United States)

    Balconi, Michela; Canavesio, Ylenia

    2016-01-01

The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assumed that empathy influences cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.
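    The "strong, direct correlations among these measures" reported above could be quantified with ordinary Pearson correlations across participants. A toy sketch with invented per-participant data (the measure names and values are assumptions for illustration, not the study's data):

    ```python
    import numpy as np

    # Hypothetical per-participant measures (names and values invented):
    # trait empathy score, response time to facial emotion (ms), and
    # N200 mean amplitude (µV; more negative = larger component).
    empathy = np.array([12.0, 18, 25, 31, 38, 44, 50, 57])
    rt_ms   = np.array([620.0, 600, 585, 560, 540, 525, 505, 490])
    n200_uv = np.array([-3.1, -3.6, -4.0, -4.8, -5.2, -5.9, -6.3, -7.0])

    # Pearson correlations: in this toy data, higher empathy goes with
    # faster detection (negative r with RT) and a larger, more negative N200.
    r_rt   = np.corrcoef(empathy, rt_ms)[0, 1]
    r_n200 = np.corrcoef(empathy, n200_uv)[0, 1]
    ```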

  9. Human Emotional State and its Relevance for Military VR Training

    National Research Council Canada - National Science Library

    Rizzo, Albert; Morie, Jacquelyn F; Williams, Josh; Pair, Jarrell; Buckwalter, J. G

    2005-01-01

    .... Real world military training often naturally includes stress induction that aims to promote a similarity of internal emotional stimulus cues with what is expected to be present on the battlefield...

  10. Most frequent emotional states in convalescent patients of myocardial infarction and its relationship to cardiovascular health state

    Directory of Open Access Journals (Sweden)

    María C. García Martín

    2016-03-01

Conclusions: A partially compensated somatic state of health predominated. High levels of anxiety and depression were identified, and an important relation was found between anxiety-depression emotional states and the somatic health of the cardiovascular system in patients convalescing from myocardial infarction.

  11. State-Dependent Differences in Emotion Regulation Between Unmedicated Bipolar Disorder and Major Depressive Disorder.

    Science.gov (United States)

    Rive, Maria M; Mocking, Roel J T; Koeter, Maarten W J; van Wingen, Guido; de Wit, Stella J; van den Heuvel, Odile A; Veltman, Dick J; Ruhé, Henricus G; Schene, Aart H

    2015-07-01

    Major depressive disorder (MDD) and bipolar disorder (BD) are difficult to distinguish clinically during the depressed or remitted states. Both mood disorders are characterized by emotion regulation disturbances; however, little is known about emotion regulation differences between MDD and BD. Better insight into these differences would be helpful for differentiation based on disorder-specific underlying pathophysiological mechanisms. Previous studies comparing these disorders often allowed medication use, limiting generalizability and validity. Moreover, patients with MDD and BD were mostly compared during the depressed, but not the remitted, state, while state might potentially modulate differences between MDD and BD. To investigate positive and negative emotion regulation in medication-free patients with MDD and BD in 2 mood states: depressed or remitted. A cross-sectional study conducted from May 2009 to August 2013 comparing behavioral and functional magnetic resonance imaging emotion regulation data of 42 patients with MDD, 35 with BD, and 36 healthy control (HC) participants free of psychotropic medication recruited from several psychiatric institutions across the Netherlands. A voluntary emotion regulation functional magnetic resonance imaging task using positive and negative pictures. Behavioral and functional magnetic resonance imaging blood oxygen level-dependent responses during emotion regulation. In the remitted state, only patients with BD showed impaired emotion regulation (t = 3.39), regardless of emotion type, associated with increased dorsolateral prefrontal cortex activity compared with those with MDD and healthy control participants (P = .008). In the depressed state, patients with MDD and BD differed with regard to happy vs sad emotion regulation (t = 4.19), with corresponding differences in rostral anterior cingulate activity: patients with MDD regulated happy emotions poorly compared with those with BD and healthy control participants, while they demonstrated no rostral anterior…

  12. Antecedents of and Reactions to Emotions in the United States and Japan.

    Science.gov (United States)

    Matsumoto, David; And Others

    1988-01-01

    Examines the degree of cultural similarity and specificity in the emotional experiences of subjects from the United States and Japan. Found a high degree of cultural agreement in the antecedent/evaluation process, but some differences in relative/expressive aspects of emotion. (Author/BJV)

  13. Knowledge Activation versus Sentence Mapping When Representing Fictional Characters' Emotional States.

    Science.gov (United States)

    Gernsbacher, Morton Ann; Robertson, Rachel R. W.

    1992-01-01

    In a study of knowledge activation and sentence mapping, subjects read stories that described concrete actions, and then the content of the stories was manipulated (i.e. stories were written that implied different emotional states). It is suggested that the more emotionally evoking situations one encounters the more memory traces are stored and…

  14. Fundamental Frequency Extraction Method using Central Clipping and its Importance for the Classification of Emotional State

    Directory of Open Access Journals (Sweden)

    Pavol Partila

    2012-01-01

    Full Text Available The paper deals with the classification of emotional state. We implemented a method for extracting the fundamental frequency of the speech signal by means of central clipping and examined the correlation between emotional state and fundamental speech frequency. For this purpose, we applied an exploratory data analysis approach. An ANOVA (analysis of variance) test confirmed that a change in the speaker's emotional state alters the fundamental frequency of the human vocal tract. The main contribution of the paper lies in the investigation of the central clipping method by means of ANOVA.
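    The central-clipping pipeline described above can be sketched as follows. This is a minimal illustration, assuming autocorrelation-based peak picking after clipping; the clipping ratio (0.3) and the F0 search range (60–400 Hz) are illustrative choices, not values taken from the paper:

```python
import numpy as np

def central_clip(x, ratio=0.3):
    """Zero out low-amplitude samples and shift the rest toward zero,
    which sharpens the periodicity peaks in the autocorrelation."""
    c = ratio * np.max(np.abs(x))
    y = np.zeros_like(x)
    y[x > c] = x[x > c] - c
    y[x < -c] = x[x < -c] + c
    return y

def estimate_f0(frame, fs, fmin=60.0, fmax=400.0):
    """Estimate the fundamental frequency (Hz) of one speech frame."""
    y = central_clip(frame)
    # Autocorrelation for non-negative lags only.
    ac = np.correlate(y, y, mode="full")[len(y) - 1:]
    # Search only lags corresponding to plausible F0 values.
    lo, hi = int(fs / fmax), int(fs / fmin)
    lag = lo + np.argmax(ac[lo:hi + 1])
    return fs / lag
```

    For a clean periodic tone the estimate lands within one lag-quantisation step of the true F0; real speech frames would need windowing and voicing decisions on top of this.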

  15. Specific Patterns of Emotion Recognition from Faces in Children with ASD: Results of a Cross-Modal Matching Paradigm

    Science.gov (United States)

    Golan, Ofer; Gordon, Ilanit; Fichman, Keren; Keinan, Giora

    2018-01-01

    Children with ASD show emotion recognition difficulties, as part of their social communication deficits. We examined facial emotion recognition (FER) in intellectually disabled children with ASD and in younger typically developing (TD) controls, matched on mental age. Our emotion-matching paradigm employed three different modalities: facial, vocal…

  16. Does a single session of electroconvulsive therapy alter the neural response to emotional faces in depression? A randomised sham-controlled functional magnetic resonance imaging study.

    Science.gov (United States)

    Miskowiak, Kamilla W; Kessing, Lars V; Ott, Caroline V; Macoveanu, Julian; Harmer, Catherine J; Jørgensen, Anders; Revsbech, Rasmus; Jensen, Hans M; Paulson, Olaf B; Siebner, Hartwig R; Jørgensen, Martin B

    2017-09-01

    Negative neurocognitive bias is a core feature of major depressive disorder that is reversed by pharmacological and psychological treatments. This double-blind functional magnetic resonance imaging study investigated for the first time whether electroconvulsive therapy modulates negative neurocognitive bias in major depressive disorder. Patients with major depressive disorder were randomised to one active (n = 15) or sham (n = 12) electroconvulsive therapy session. The following day they underwent whole-brain functional magnetic resonance imaging at 3T while viewing emotional faces and performed facial expression recognition and dot-probe tasks. A single electroconvulsive therapy session had no effect on amygdala response to emotional faces. Whole-brain analysis revealed no effects of electroconvulsive therapy versus sham therapy after family-wise error correction at the cluster level, using a cluster-forming threshold of Z > 3.1. At a more lenient threshold (Z > 2.3), there were trends toward electroconvulsive therapy-induced changes in parahippocampal and superior frontal responses to fearful versus happy faces, as well as in fear-specific functional connectivity between the amygdala and occipito-temporal regions. Across all patients, greater fear-specific amygdala-occipital coupling correlated with lower fear vigilance. Despite no statistically significant shift in neural response to faces after a single electroconvulsive therapy session, the observed trend changes point to an early shift in emotional processing that may contribute to the antidepressant effects of electroconvulsive therapy.

  17. The voice of emotion across species: how do human listeners recognize animals' affective states?

    Directory of Open Access Journals (Sweden)

    Marina Scheumann

    Full Text Available Voice-induced cross-taxa emotional recognition is the ability to understand the emotional state of another species based on its voice. In the past, induced affective states, experience-dependent higher cognitive processes or cross-taxa universal acoustic coding and processing mechanisms have been discussed to underlie this ability in humans. The present study sets out to distinguish the influence of familiarity and phylogeny on voice-induced cross-taxa emotional perception in humans. For the first time, two perspectives are taken into account: the self-perspective (i.e. emotional valence induced in the listener) versus the others-perspective (i.e. correct recognition of the emotional valence of the recording context). Twenty-eight male participants listened to 192 vocalizations of four different species (human infant, dog, chimpanzee and tree shrew). Stimuli were recorded either in an agonistic (negative emotional valence) or affiliative (positive emotional valence) context. Participants rated the emotional valence of the stimuli, adopting the self- and others-perspective, by using a 5-point version of the Self-Assessment Manikin (SAM). Familiarity was assessed based on subjective rating, objective labelling of the respective stimuli and interaction time with the respective species. Participants reliably recognized the emotional valence of human voices, whereas the results for animal voices were mixed. The correct classification of animal voices depended on the listener's familiarity with the species and the call type/recording context, whereas there was less influence of induced emotional states and phylogeny. Our results provide first evidence that explicit voice-induced cross-taxa emotional recognition in humans is shaped more by experience-dependent cognitive mechanisms than by induced affective states or cross-taxa universal acoustic coding and processing mechanisms.

  18. Put on a happy face! Inhibitory control and socioemotional knowledge predict emotion regulation in 5- to 7-year-olds.

    Science.gov (United States)

    Hudson, Amanda; Jacques, Sophie

    2014-07-01

    Children's developing capacity to regulate emotions may depend on individual characteristics and other abilities, including age, sex, inhibitory control, theory of mind, and emotion and display rule knowledge. In the current study, we examined the relations between these variables and children's (N=107) regulation of emotion in a disappointing gift paradigm as well as their relations with the amount of effort to control emotion children exhibited after receiving the disappointing gift. Regression analyses were also conducted to identify unique predictors. Children's understanding of others' emotions and emotion display rules, as well as their inhibitory control skills, emerged as significant correlates of emotion regulation and predicted children's responses to the disappointing gift even after controlling for other relevant variables. Age and inhibitory control significantly predicted the amount of overt effort that went into regulating emotions, as did emotion knowledge (albeit only marginally). Together, findings suggest that effectively regulating emotions requires (a) knowledge of context-appropriate emotions along with (b) inhibitory skills to implement that knowledge. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. VALUE OF HEART RATE VARIABILITY ANALYSIS IN DIAGNOSTICS OF THE EMOTIONAL STATE

    Directory of Open Access Journals (Sweden)

    І. Chaykovskyi

    2012-11-01

    Full Text Available This paper presents the development of a method for evaluating the emotional state of a person, suitable for use at the workplace, based on analysis of heart rate (HR) variability. 28 healthy volunteers were examined. For each of them, 3 audiovisual clips were presented consecutively on the display of a personal computer. One clip contained information inducing positive emotions, the second negative emotions, and the third neutral ones. All possible pairs of emotional states were analysed with the help of one- and multi-dimensional linear discriminant analysis based on HR variability. Showing the emotional video clips (of both signs) caused a reliable slowing of HR and also some decrease in HR variability. In addition, negative emotions caused a regularization and simplification of the structural organization of the heart rhythm. The accuracy of discrimination for the pair "emotional – neutral" video clips was 98 %, for the pair "rest – neutral" 74 %, and for the pair "positive – negative" 91 %. Analysis of HR variability enables determination of the emotional state of an observed person at the workplace with high reliability.
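    The discrimination step can be sketched with scikit-learn's linear discriminant analysis in place of the authors' exact implementation. The feature set (mean RR interval, SDNN, RMSSD) and the synthetic numbers below are illustrative assumptions that only mimic the reported direction of the effect (slower HR and reduced variability under emotional clips), not real data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic HRV features per recording: mean RR interval (ms), SDNN (ms), RMSSD (ms).
# "Emotional" viewing shifts toward longer RR (slower HR) and lower variability;
# the class means themselves are made up for illustration.
neutral = rng.normal(loc=[800.0, 50.0, 40.0], scale=[30.0, 5.0, 5.0], size=(40, 3))
emotional = rng.normal(loc=[860.0, 35.0, 25.0], scale=[30.0, 5.0, 5.0], size=(40, 3))

X = np.vstack([neutral, emotional])
y = np.array([0] * 40 + [1] * 40)  # 0 = neutral clip, 1 = emotional clip

lda = LinearDiscriminantAnalysis().fit(X, y)
acc = lda.score(X, y)
```

    With well-separated class means, the linear discriminant separates the two viewing conditions almost perfectly, which is the kind of pairwise discrimination reported in the abstract.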

  20. Emotional State Classification in Virtual Reality Using Wearable Electroencephalography

    Science.gov (United States)

    Suhaimi, N. S.; Teo, J.; Mountstephens, J.

    2018-03-01

    This paper presents the classification of emotions from EEG signals. One of the key issues in this research is the lack of emotion classification studies using VR as the medium to stimulate emotion. The approach taken in this research uses K-nearest neighbor (KNN) and Support Vector Machine (SVM) classifiers. Firstly, each participant was required to wear the EEG headset, and their brainwaves were recorded while they were immersed in the VR environment. Data points were then marked whenever participants showed any physical signs of emotion, or by observing the brainwave pattern. Secondly, the data were trained and tested with the KNN and SVM algorithms. The accuracy achieved by both methods was approximately 82% throughout the brainwave spectrum (α, β, γ, δ, θ). These methods showed promising results and will be further enhanced using other machine learning approaches with VR stimuli.
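    The KNN/SVM comparison can be sketched with scikit-learn on synthetic band-power features. The feature layout (one power value per band α, β, γ, δ, θ) and the class means are invented for illustration; no claim is made about the authors' actual preprocessing:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic band-power features (alpha, beta, gamma, delta, theta) for two
# hypothetical emotion classes; class means are made up for illustration.
n = 100
calm = rng.normal(loc=[12.0, 5.0, 2.0, 8.0, 6.0], scale=1.0, size=(n, 5))
excited = rng.normal(loc=[6.0, 9.0, 4.0, 7.0, 5.0], scale=1.0, size=(n, 5))
X = np.vstack([calm, excited])
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

knn_acc = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr).score(X_te, y_te)
svm_acc = SVC(kernel="rbf").fit(X_tr, y_tr).score(X_te, y_te)
```

    On data this cleanly separated both classifiers approach perfect accuracy; real EEG features are far noisier, which is why the paper reports about 82%.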

  1. Autobiographically recalled emotional states impact forward gait initiation as a function of motivational direction.

    Science.gov (United States)

    Fawver, Bradley; Hass, Chris J; Park, Kyoungshin D; Janelle, Christopher M

    2014-12-01

    The impact of self-generated affective states on self-initiated motor behavior remains unspecified. The purpose of the current study was to determine how self-generated emotional states impact forward gait initiation. Participants recalled past emotional experiences (anger, fear, happy, sad, and neutral), "relived" those emotional memories before gait initiation (GI), and then walked ∼4 m across the laboratory floor. Kinetic and kinematic data revealed GI characteristics consistent with a motivational direction hypothesis. Specifically, participants produced greater posterior-lateral displacement and velocity of their center of pressure (COP) during the initial phase of GI after self-generation of happy and anger emotional states relative to sad ones. During the second phase of GI, greater medial displacement of COP was found during the happy condition compared with sad, greater velocity was occasioned during happy and angry trials compared with sad, and greater velocity was exhibited after happy compared with fear memories. Finally, greater anterior velocity was produced by participants during the final phase of GI for happy and angry memories compared with sad ones. Steady state kinetic and kinematic data when recalling happy and angry memories (longer, faster, and more forceful stepping behavior) followed the anticipatory postural adjustments noted during GI. Together the results from GI and steady state gait provide robust evidence that self-generated emotional states impact forward gait behavior based on motivational direction. Endogenous manipulations of emotional states hold promise for clinical and performance interventions aimed at improving self-initiated movement.

  2. THE INFLUENCE OF SELF-ESTEEM ON THE EMOTIONAL STATE OF AN ATHLETE AS PERSONALITY

    Directory of Open Access Journals (Sweden)

    Vysochina N.

    2010-03-01

    Full Text Available Annotation. This paper studies and analyses the influence of psychological factors on the emotional state of the athlete as a personality. The scientific literature poorly elucidates the impact of self-esteem on the emotional state of an athlete as a factor promoting the optimization of professional activity, which has made this problem interesting for study. The aim of this study is to trace the relationship between the level of self-esteem and the emotional state of the athlete's personality as a factor promoting the optimization of professional activity. The following methods were used: theoretical analysis, compilation and systematization of data from the scientific literature. The research shows that the level of self-esteem exerts a direct effect on the emotional state of the athlete, which predetermines his professional results.

  3. Preschool-aged children’s understanding of gratitude: Relations with emotion and mental state knowledge

    Science.gov (United States)

    Nelson, Jackie A.; de Lucca Freitas, Lia Beatriz; O’Brien, Marion; Calkins, Susan D.; Leerkes, Esther M.; Marcovitch, Stuart

    2016-01-01

    Developmental precursors to children’s early understanding of gratitude were examined. A diverse group of 263 children were tested for emotion and mental state knowledge at ages 3 and 4, and their understanding of gratitude was measured at age 5. Children varied widely in their understanding of gratitude, but most understood some aspects of gratitude-eliciting situations. A model-building path analysis approach was used to examine longitudinal relations among early emotion and mental state knowledge and later understanding of gratitude. Children with a better early understanding of emotions and mental states understand more about gratitude. Mental state knowledge at age 4 mediated the relation between emotion knowledge at age 3 and gratitude understanding at age 5. The current study contributes to the scant literature on the early emergence of children’s understanding of gratitude. PMID:23331105

  4. The effect of a teacher's positive personal resource on features of students' emotional states

    Directory of Open Access Journals (Sweden)

    R.A. Trulyaev

    2013-10-01

    Full Text Available We reveal the psychological mechanisms of the impact of the level of formation of the teacher's positive values on the academic performance of students, one of the key components of which is the emotional states of students. We describe a study aimed at testing the hypothesis that the positive values of the teacher, and the "strong" character traits standing behind them, determine the specific emotional states of his students during the lesson. The study involved 241 teachers of school subjects and 498 pupils of grades VI, VIII, X and XI of several schools in Krivoy Rog. The study demonstrated that a high level of expression of the teacher's positive values, reflected in his professional qualities, promotes the appearance of positive emotional states in students. We also revealed patterns of the influence of the teacher's positive personal resource on the intensity of the emotional states experienced by students during lessons.

  5. Public Higher-Education Systems Face Painful Choices as Three Northeastern States Confront Massive Deficits.

    Science.gov (United States)

    Blumenstyk, Goldie

    1989-01-01

    Massachusetts, Connecticut, and New York face giant deficits in their state budgets. The financial impact of the 1986 federal tax reform law was underestimated by colleges and income estimates were overly optimistic for 1988 and 1989. Unpopular, new taxes are seen as the way to solve the budget crunch. (MLW)

  6. Effect of stacking fault energy on steady-state creep rate of face ...

    African Journals Online (AJOL)

    Continuum elastic theory was used to establish the relationships between the force of interaction required to constrict dislocation partials, energy of constriction and climb velocity of the constricted thermal jogs, in order to examine the effect of stacking fault energy (SFE) on steady state creep rate of face centered cubic ...

  7. Functional connectivity decreases in autism in emotion, self, and face circuits identified by Knowledge-based Enrichment Analysis.

    Science.gov (United States)

    Cheng, Wei; Rolls, Edmund T; Zhang, Jie; Sheng, Wenbo; Ma, Liang; Wan, Lin; Luo, Qiang; Feng, Jianfeng

    2017-03-01

    A powerful new method is described called Knowledge based functional connectivity Enrichment Analysis (KEA) for interpreting resting state functional connectivity, using circuits that are functionally identified using search terms with the Neurosynth database. The method derives its power by focusing on neural circuits, sets of brain regions that share a common biological function, instead of trying to interpret single functional connectivity links. This provides a novel way of investigating how task- or function-related networks have resting state functional connectivity differences in different psychiatric states, provides a new way to bridge the gap between task and resting-state functional networks, and potentially helps to identify brain networks that might be treated. The method was applied to interpreting functional connectivity differences in autism. Functional connectivity decreases at the network circuit level in 394 patients with autism compared with 473 controls were found in networks involving the orbitofrontal cortex, anterior cingulate cortex, middle temporal gyrus cortex, and the precuneus, in networks that are implicated in the sense of self, face processing, and theory of mind. The decreases were correlated with symptom severity. Copyright © 2017. Published by Elsevier Inc.

  8. Emotion

    DEFF Research Database (Denmark)

    Jantzen, Christian; Vetner, Mikael

    2006-01-01

    An emotion is an evaluative response to a significant event that has affective valence and motivates the organism in relation to the object world (its surroundings). Emotions lead to affect: to pain (negative affect) or pleasure (positive affect). Both positive and negative emotions influence the organism's…

  9. Face-to-Face and Online: An Investigation of Children's and Adolescents' Bullying Behavior through the Lens of Moral Emotions and Judgments

    Science.gov (United States)

    Conway, Lauryn; Gomez-Garibello, Carlos; Talwar, Victoria; Shariff, Shaheen

    2016-01-01

    The current study investigated the influence of type of aggression (cyberbullying or traditional bullying) and participant role (bystander or perpetrator) on children and adolescents' self-attribution of moral emotions and judgments, while examining the influence of chronological age. Participants (N = 122, 8-16 years) evaluated vignettes and were…

  10. Combination of Empirical Mode Decomposition Components of HRV Signals for Discriminating Emotional States

    Directory of Open Access Journals (Sweden)

    Ateke Goshvarpour

    2016-06-01

    Full Text Available Introduction Automatic human emotion recognition is one of the most interesting topics in the field of affective computing. However, development of a reliable approach with a reasonable recognition rate is a challenging task. The main objective of the present study was to propose a robust method for discrimination of emotional responses through examination of heart rate variability (HRV). In the present study, considering the non-stationary and non-linear characteristics of HRV, the empirical mode decomposition technique was utilized as a feature extraction approach. Materials and Methods In order to induce the emotional states, images indicating four emotional states, i.e., happiness, peacefulness, sadness, and fearfulness, were presented. Simultaneously, HRV was recorded in 47 college students. The signals were decomposed into intrinsic mode functions (IMFs). For each IMF and different IMF combinations, 17 standard and non-linear parameters were extracted. A Wilcoxon test was conducted to assess the difference between IMF parameters in different emotional states. Afterwards, a probabilistic neural network was used to classify the features into emotional classes. Results Based on the findings, maximum classification rates were achieved when all IMFs were fed into the classifier. Under such circumstances, the proposed algorithm could discriminate the affective states with sensitivity, specificity, and correct classification rate of 99.01%, 100%, and 99.09%, respectively. In contrast, the lowest discrimination rates were attained by the IMF1 frequency and its combinations. Conclusion The high performance of the present approach indicates that the proposed method is applicable for automatic emotion recognition.
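    A minimal sifting sketch of the empirical mode decomposition step follows. It omits the stopping criteria and boundary handling of production EMD implementations, but by construction the extracted IMFs plus the residue reconstruct the input exactly:

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(x, n_iters=10):
    """Extract one IMF candidate by repeatedly subtracting the mean envelope."""
    h = x.copy()
    t = np.arange(len(x))
    for _ in range(n_iters):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 4 or len(minima) < 4:
            break  # too few extrema to fit envelopes
        upper = CubicSpline(maxima, h[maxima])(t)
        lower = CubicSpline(minima, h[minima])(t)
        h = h - (upper + lower) / 2.0
    return h

def emd(x, max_imfs=4):
    """Decompose x into IMFs and a residue: x == sum(imfs) + residue."""
    imfs, residue = [], np.asarray(x, dtype=float).copy()
    for _ in range(max_imfs):
        imf = sift(residue)
        imfs.append(imf)
        residue = residue - imf
        if len(argrelextrema(residue, np.greater)[0]) < 4:
            break
    return imfs, residue
```

    Features such as the 17 standard and non-linear parameters would then be computed per IMF (and per IMF combination) before classification.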

  11. Embracing your emotions: affective state impacts lateralisation of human embraces.

    Science.gov (United States)

    Packheiser, Julian; Rook, Noemi; Dursun, Zeynep; Mesenhöller, Janne; Wenglorz, Alrescha; Güntürkün, Onur; Ocklenburg, Sebastian

    2018-01-18

    Humans are highly social animals that show a wide variety of verbal and non-verbal behaviours to communicate social intent. One of the most frequently used non-verbal social behaviours is embracing, commonly used as an expression of love and affection. However, it can also occur in a large variety of social situations entailing negative (fear or sadness) or neutral emotionality (formal greetings). Embracing is also experienced from birth onwards in mother-infant interactions and is thus accompanying human social interaction across the whole lifespan. Despite the importance of embraces for human social interactions, their underlying neurophysiology is unknown. Here, we demonstrated in a well-powered sample of more than 2500 adults that humans show a significant rightward bias during embracing. Additionally, we showed that this general motor preference is strongly modulated by emotional contexts: the induction of positive or negative affect shifted the rightward bias significantly to the left, indicating a stronger involvement of right-hemispheric neural networks during emotional embraces. In a second laboratory study, we were able to replicate both of these findings and furthermore demonstrated that the motor preferences during embracing correlate with handedness. Our studies therefore not only show that embracing is controlled by an interaction of motor and affective networks, they also demonstrate that emotional factors seem to activate right-hemispheric systems in valence-invariant ways.

  12. From Physiological data to Emotional States: Conducting a User Study and Comparing Machine Learning Classifiers

    Directory of Open Access Journals (Sweden)

    Ali Mehmood KHAN

    2016-06-01

    Full Text Available Recognizing emotional states is becoming a major part of a user's context for wearable computing applications. The system should be able to acquire a user's emotional states by using physiological sensors. We want to develop a personal emotional state recognition system that is practical, reliable, and can be used for health-care related applications. We propose to use the eHealth platform, which is a ready-made, lightweight, small and easy-to-use device, for recognizing a few emotional states like 'Sad', 'Dislike', 'Joy', 'Stress', 'Normal', 'No-Idea', 'Positive' and 'Negative' using decision tree (J48) and k-Nearest Neighbors (IBK) classifiers. In this paper, we present an approach to build a system that exhibits this property and provide evidence based on data for 8 different emotional states collected from 24 different subjects. Our results indicate that the system has an accuracy rate of approximately 98%. In our work, we used four physiological sensors, i.e. 'Blood Volume Pulse' (BVP), 'Electromyogram' (EMG), 'Galvanic Skin Response' (GSR), and 'Skin Temperature', in order to recognize emotional states (i.e. Stress, Joy/Happy, Sad, Normal/Neutral, Dislike, No-idea, Positive and Negative).
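    Weka's J48 and IBK correspond to a C4.5-style decision tree and k-nearest neighbours. A rough scikit-learn analogue on synthetic four-channel features is sketched below; the per-window features, class means, and the choice of three of the eight states are all invented for illustration:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Synthetic per-window features: BVP amplitude, EMG RMS, GSR level, skin temp (°C).
# Class means are made up; only three of the paper's eight states are shown.
means = {0: [1.0, 0.2, 2.0, 33.0],   # neutral
         1: [0.7, 0.6, 5.0, 32.0],   # stress
         2: [1.2, 0.3, 3.0, 33.5]}   # joy
X = np.vstack([rng.normal(m, 0.15, size=(60, 4)) for m in means.values()])
y = np.repeat([0, 1, 2], 60)

# 5-fold cross-validated accuracy for the two Weka-style classifiers.
tree_acc = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=5).mean()
knn_acc = cross_val_score(KNeighborsClassifier(n_neighbors=3), X, y, cv=5).mean()
```

    Cross-validation, as used here, is the usual way to back up an accuracy claim like the 98% reported in the abstract.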

  13. An Emotional Response To The State of Accounting Education: Developing Accounting Students' Emotional Intelligence

    OpenAIRE

    McPhail, Ken

    2004-01-01

    This paper attempts to do three things. Firstly, in the light of growing concern over the expanding managerialism and rationalism within society in general and accounting education in particular, the paper presents a theoretical reappraisal of the extent to which conventional perspectives on rationalism and managerialism might be misconstrued. In particular, the paper addresses a question that relates to the role of emotion within business decision making: 'while we might feel uneasy about basi...

  14. A Fuzzy Approach for Facial Emotion Recognition

    Science.gov (United States)

    Gîlcă, Gheorghe; Bîzdoacă, Nicu-George

    2015-09-01

    This article deals with an emotion recognition system based on fuzzy sets. Human faces are detected in images with the Viola-Jones algorithm and tracked in video sequences using the Camshift algorithm. The detected faces are passed to the fuzzy decision system, which is based on fuzzified measurements of facial variables: the eyebrows, eyelids and mouth. The system can easily determine the emotional state of a person.
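    The decision logic can be sketched with triangular membership functions over two hypothetical normalised facial measurements. The variable names, ranges, and the rule set below are invented for illustration and are not taken from the article:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_emotion(mouth_curvature, eyebrow_raise):
    """Toy fuzzy classifier; both inputs assumed normalised to [-1, 1]."""
    smiling = tri(mouth_curvature, 0.0, 1.0, 2.0)     # mouth corners up
    frowning = tri(mouth_curvature, -2.0, -1.0, 0.0)  # mouth corners down
    raised = tri(eyebrow_raise, 0.0, 1.0, 2.0)        # eyebrows lifted
    # Each rule's firing strength; the strongest rule supplies the label.
    rules = {
        "happy": min(smiling, 1.0 - raised),
        "surprised": raised,
        "sad": frowning,
    }
    return max(rules, key=rules.get)
```

    A full Mamdani system would aggregate and defuzzify the rule outputs instead of taking the maximum, but the winner-takes-all shortcut is enough to show how fuzzified facial measurements map to a discrete emotional state.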

  15. More than meets the eye: the role of self-identity in decoding complex emotional states.

    Science.gov (United States)

    Stevenson, Michael T; Soto, José A; Adams, Reginald B

    2012-10-01

    Folk wisdom asserts that "the eyes are the window to the soul," and empirical science corroborates a prominent role for the eyes in the communication of emotion. Herein we examine variation in the ability to "read" the eyes of others as a function of social group membership, employing a widely used emotional state decoding task: "Reading the Mind in Eyes." This task has documented impaired emotional state decoding across racial groups, with cross-race performance on par with that previously reported as a function of autism spectrum disorders. The present study extended this work by examining the moderating role of social identity in such impairments. For college students more highly identified with their university, cross-race performance differences were not found for judgments of "same-school" eyes but remained for "rival-school" eyes. These findings suggest that impaired emotional state decoding across groups may thus be more amenable to remediation than previously realized.

  16. The effects of valence-based and discrete emotional states on aesthetic response.

    Science.gov (United States)

    Cheng, Yin-Hui

    2013-01-01

    There is increasing recognition that consumer aesthetics--the responses of consumers to the aesthetic or appearance aspects of products--has become an important area of marketing in recent years. Consumer aesthetic responses to a product are a source of pleasure for the consumer. Previous research into the aesthetic responses to products has often emphasized exterior factors and visual design, but studies have seldom considered the psychological aesthetic experience of consumers, and in particular their emotional state. This study attempts to bridge this gap by examining the link between consumers' emotions and their aesthetic response to a product. Thus, the major goal of this study was to determine how valence-based and discrete emotional states influence choice. In Studies 1 and 2, positive and negative emotions were manipulated to implement two different induction techniques and explore the effect of emotions on participants' choices in two separate experiments. The results of both experiments confirmed the predictions, indicating that aesthetic responses and purchase intention are functions of emotional valence, such that both are stronger for people in a positive emotional state than for those in a negative emotional state. Study 2 also used a neutral affective state to establish the robustness of this observed effect of incidental affect. The results of Study 3 demonstrate that aesthetic response and purchase intention are not only a function of affect valence, but also are affected by the certainty appraisal associated with specific affective states. This research, therefore, contributes to the literature by offering empirical evidence that incidental affect is a determinant of aesthetic response.

  17. Differences in Attributions for Public and Private Face-to-face and Cyber Victimization Among Adolescents in China, Cyprus, the Czech Republic, India, Japan, and the United States.

    Science.gov (United States)

    Wright, Michelle F; Yanagida, Takuya; Aoyama, Ikuko; Dědková, Lenka; Li, Zheng; Kamble, Shanmukh V; Bayraktar, Fatih; Ševčíková, Anna; Soudi, Shruti; Macháčková, Hana; Lei, Li; Shu, Chang

    2017-01-01

    The authors' aim was to investigate gender and cultural differences in the attributions used to determine causality for hypothetical public and private face-to-face and cyber victimization scenarios among 3,432 adolescents (age range = 11-15 years; 49% girls) from China, Cyprus, the Czech Republic, India, Japan, and the United States, while accounting for their individualism and collectivism. Adolescents completed a questionnaire on cultural values and read four hypothetical victimization scenarios, including public face-to-face victimization, public cyber victimization, private face-to-face victimization, and private cyber victimization. After reading the scenarios, they rated different attributions (i.e., self-blame, aggressor-blame, joking, normative, conflict) according to how strongly they believed the attributions explained why victimization occurred. Overall, adolescents reported that they would utilize the attributions of self-blame, aggressor-blame, and normative more for public forms of victimization and face-to-face victimization than for private forms of victimization and cyber victimization. Differences were found according to gender and country of origin as well. Such findings underscore the importance of delineating between different forms of victimization when examining adolescents' attributions.

  18. Motivated emotion and the rally around the flag effect: liberals are motivated to feel collective angst (like conservatives) when faced with existential threat.

    Science.gov (United States)

    Porat, Roni; Tamir, Maya; Wohl, Michael J A; Gur, Tamar; Halperin, Eran

    2018-04-18

    A careful look at societies facing threat reveals a unique phenomenon in which liberals and conservatives react emotionally and attitudinally in a similar manner, rallying around the conservative flag. Previous research suggests that this rally effect is the result of liberals shifting in their attitudes and emotional responses toward the conservative end. Whereas theories of motivated social cognition provide a motivation-based account of cognitive processes (i.e. attitude shift), it remains unclear whether emotional shifts are, in fact, also a motivation-based process. Herein, we propose that under threat, liberals are motivated to feel existential concern about their group's future vitality (i.e. collective angst) to the same extent as conservatives, because this group-based emotion elicits support for ingroup protective action. Within the context of the Palestinian-Israeli conflict, we tested and found support for this hypothesis both inside (Study 1) and outside (Study 2) the laboratory. We did so using a behavioural index of motivation to experience collective angst. We discuss the implications of our findings for understanding motivated emotion regulation in the context of intergroup threat.

  19. Job resources and recovery experiences to face difficulties in emotion regulation at work: a diary study among nurses

    NARCIS (Netherlands)

    Blanco-Donoso, L.M.; Garrosa, E.; Demerouti, E.; Moreno-Jiménez, B.

    2017-01-01

    The present study examines the role of daily difficulties in emotion regulation at work in nurses' daily well-being and how certain job resources and recovery experiences influence this relationship. We hypothesized that daily difficulties to regulate emotions at work would be significantly and…

  20. Shades of Emotion: What the Addition of Sunglasses or Masks to Faces Reveals about the Development of Facial Expression Processing

    Science.gov (United States)

    Roberson, Debi; Kikutani, Mariko; Doge, Paula; Whitaker, Lydia; Majid, Asifa

    2012-01-01

    Three studies investigated developmental changes in facial expression processing, between 3 years-of-age and adulthood. For adults and older children, the addition of sunglasses to upright faces caused an equivalent decrement in performance to face inversion. However, younger children showed "better" classification of expressions of faces wearing…

  1. Greater Pupil Size in Response to Emotional Faces as an Early Marker of Social-Communicative Difficulties in Infants at High Risk for Autism.

    Science.gov (United States)

    Wagner, Jennifer B; Luyster, Rhiannon J; Tager-Flusberg, Helen; Nelson, Charles A

    2016-01-01

    When scanning faces, individuals with autism spectrum disorder (ASD) have shown reduced visual attention (e.g., less time on eyes) and atypical autonomic responses (e.g., heightened arousal). To understand how these differences might explain sub-clinical variability in social functioning, 9-month-olds, with or without a family history of ASD, viewed emotionally-expressive faces, and gaze and pupil diameter (a measure of autonomic activation) were recorded using eye-tracking. Infants at high-risk for ASD with no subsequent clinical diagnosis (HRA-) and low-risk controls (LRC) showed similar face scanning and attention to eyes and mouth. Attention was overall greater to eyes than mouth, but this varied as a function of the emotion presented. HRA- showed significantly larger pupil size than LRC. Correlations between scanning at 9 months, pupil size at 9 months, and 18-month social-communicative behavior, revealed positive associations between pupil size and attention to both face and eyes at 9 months in LRC, and a negative association between 9-month pupil size and 18-month social-communicative behavior in HRA-. The present findings point to heightened autonomic arousal in HRA-. Further, with greater arousal relating to worse social-communicative functioning at 18 months, this work points to a mechanism by which unaffected siblings might develop atypical social behavior.

  2. Touch communicates distinct emotions.

    Science.gov (United States)

    Hertenstein, Matthew J; Keltner, Dacher; App, Betsy; Bulleit, Brittany A; Jaskolka, Ariane R

    2006-08-01

    The study of emotional signaling has focused almost exclusively on the face and voice. In 2 studies, the authors investigated whether people can identify emotions from the experience of being touched by a stranger on the arm (without seeing the touch). In the 3rd study, they investigated whether observers can identify emotions from watching someone being touched on the arm. Two kinds of evidence suggest that humans can communicate numerous emotions with touch. First, participants in the United States (Study 1) and Spain (Study 2) could decode anger, fear, disgust, love, gratitude, and sympathy via touch at much-better-than-chance levels. Second, fine-grained coding documented specific touch behaviors associated with different emotions. In Study 3, the authors provide evidence that participants can accurately decode distinct emotions by merely watching others communicate via touch. The findings are discussed in terms of their contributions to affective science and the evolution of altruism and cooperation. (c) 2006 APA, all rights reserved

  3. The effect of weather and its changes on emotional state - individual characteristics that make us vulnerable

    Science.gov (United States)

    Spasova, Z.

    2011-03-01

    Given the proven effects of weather on the human organism, this study attempts to examine its effects at the psychological and emotional level. Emotions affect the bio tone, working ability, and concentration; hence their significance in various domains of economic life such as health care, education, transportation, and tourism. The present pilot study was conducted in Sofia, Bulgaria over a period of eight months, using five psychological methods: the Eysenck Personality Questionnaire, the State-Trait Anxiety Inventory, a test for self-assessment of the emotional state, a test for evaluation of moods, and the test "Self-confidence-Activity-Mood". The Fiodorov-Chubukov complex-climatic method was used to characterize meteorological conditions in order to include a maximal number of meteorological elements in the analysis. Sixteen weather types are defined by this method depending on the values of the meteorological elements. Abrupt weather changes from one day to the next, defined by the same method, were also considered. The results of t-tests showed that the different categories of weather led to changes in emotional status that were either positive or negative for the organism. As expected, abrupt weather changes have negative effects on human emotions, but only when there is a transition to cloudy weather or to a weather type classified as "unfavorable". The relationship between weather and human emotions is rather complicated, since it depends on the individual characteristics of people. One of these individual psychological characteristics, marked by the dimension "neuroticism", has a strong effect on emotional reactions in different weather conditions. Emotionally stable individuals are more "resistant" to the influence of weather on their emotions, while emotionally unstable individuals depend more strongly on its impacts.

  4. Advances in face detection and facial image analysis

    CERN Document Server

    Celebi, M; Smolka, Bogdan

    2016-01-01

    This book presents the state-of-the-art in face detection and analysis. It outlines new research directions, including in particular psychology-based facial dynamics recognition, aimed at various applications such as behavior analysis, deception detection, and diagnosis of various psychological disorders. Topics of interest include face and facial landmark detection, face recognition, facial expression and emotion analysis, facial dynamics analysis, face classification, identification, and clustering, and gaze direction and head pose estimation, as well as applications of face analysis.

  5. Performance-driven facial animation: basic research on human judgments of emotional state in facial avatars.

    Science.gov (United States)

    Rizzo, A A; Neumann, U; Enciso, R; Fidaleo, D; Noh, J Y

    2001-08-01

    three-dimensional avatar using a performance-driven facial animation (PDFA) system developed at the University of Southern California Integrated Media Systems Center. PDFA offers a means for creating high-fidelity visual representations of human faces and bodies. This effort explores the feasibility of sensing and reproducing a range of facial expressions with a PDFA system. In order to test the concordance of human ratings of emotional expression between video and avatar facial delivery, we first had facial model subjects observe stimuli that were designed to elicit naturalistic facial expressions. The emotional stimulus induction involved presenting text-based, still image, and video clips to subjects that were previously rated to induce facial expressions for the six universals of facial expression (happy, sad, fear, anger, disgust, and surprise), in addition to attentiveness, puzzlement and frustration. Videotapes of these induced facial expressions that best represented prototypic examples of the above emotional states, and three-dimensional avatar animations of the same facial expressions, were randomly presented to 38 human raters. The raters used open-ended, forced-choice and seven-point Likert-type scales to rate expression in terms of identification. The forced-choice and seven-point ratings provided the most usable data to determine video/animation concordance, and these data are presented. To support a clear understanding of these data, a website has been set up that allows readers to view the video and facial animation clips to illustrate the assets and limitations of these types of facial expression-rendering methods (www.USCAvatars.com/MMVR). This methodological first step in our research program has served to provide valuable human user-centered feedback to support the iterative design and development of facial avatar characteristics for the expression of emotional communication.

  6. The effects of emotional states and traits on risky decision-making.

    Energy Technology Data Exchange (ETDEWEB)

    Bernard, Michael Lewis; Smith, Bruce W., 1959- (,University of New Mexico, Albuquerque, NM-)

    2006-12-01

    Understanding the role of emotional states is critical for predicting the kind of decisions people will make in risky situations. Currently, there is little understanding as to how emotion influences decision-making in situations such as terrorist attacks, natural disasters, pandemics, and combat. To help address this, we used behavioral and neuroimaging methods to examine how emotion states and traits influence decisions. Specifically, this study used a wheel of fortune behavioral task and functional magnetic resonance imaging (fMRI) to examine the effects of emotional states and traits on decision-making pertaining to the degree of risk people are willing to make in specific situations. The behavioral results are reported here. The neural data requires additional time to analyze and will be reported at a future date. Biases caused by emotion states and traits were found regarding the likelihood of making risky decisions. The behavioral results will help provide a solid empirical foundation for modeling the effects of emotion on decision in risky situations.

  7. Automatic Identification of E-Learner Emotional States to Ameliorate ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    ... offer a high quality training and educational services ... will explore certain challenges which face the distant-learner ... Moreover, it is going to help teachers evaluate the learning ... trajectories, as the sign of a slow flow ...

  8. Clinical and evoked pain, personality traits, and emotional states: can familial confounding explain the associations?

    Science.gov (United States)

    Strachan, Eric; Poeschla, Brian; Dansie, Elizabeth; Succop, Annemarie; Chopko, Laura; Afari, Niloofar

    2015-01-01

    Pain is a complex phenomenon influenced by context and person-specific factors. Affective dimensions of pain involve both enduring personality traits and fleeting emotional states. We examined how personality traits and emotional states are linked with clinical and evoked pain in a twin sample. 99 female twin pairs were evaluated for clinical and evoked pain using the McGill Pain Questionnaire (MPQ) and dolorimetry, and completed the 120-item International Personality Item Pool (IPIP), the Positive and Negative Affect Scale (PANAS), and ratings of stress and mood. Using a co-twin control design we examined a) the relationship of personality traits and emotional states with clinical and evoked pain and b) whether genetics and common environment (i.e. familial factors) may account for the associations. Neuroticism was associated with the sensory component of the MPQ; this relationship was not confounded by familial factors. None of the emotional state measures was associated with the MPQ. PANAS negative affect was associated with lower evoked pressure pain threshold and tolerance; these associations were confounded by familial factors. There were no associations between IPIP traits and evoked pain. A relationship exists between neuroticism and clinical pain that is not confounded by familial factors. There is no similar relationship between negative emotional states and clinical pain. In contrast, the relationship between negative emotional states and evoked pain is strong while the relationship with enduring personality traits is weak. The relationship between negative emotional states and evoked pain appears to be non-causal and due to familial factors. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Implications of State Dental Board Disciplinary Actions for Teaching Dental Students About Emotional Intelligence.

    Science.gov (United States)

    Munk, Lyle Kris

    2016-01-01

    The primary emphasis in dental education is on developing students' cognitive intelligence (thinking) and technical intelligence (doing), while emotional intelligence (being) receives less emphasis. The aim of this study was to explore a potential consequence of the paucity of emotional intelligence education by determining the level of emotional intelligence-related (EI-R) infractions in state dental board disciplinary actions and characterizing the types of those infractions. For this study, 1,100 disciplinary action reports from 21 state dental boards were reviewed, and disciplinary infractions were classified as cognitive intelligence-related (CI-R) infractions, technical intelligence-related (TI-R) infractions, and EI-R infractions. EI-R infractions were then subcategorized into emotional intelligence clusters and competencies using the Emotional and Social Competency Inventory (ESCI). The results showed that 56.6% of the infractions were EI-R. When the EI-R infractions were subcategorized, the four competencies most frequently violated involved transparency, teamwork and collaboration, organizational awareness, and accurate self-assessment. Understanding the frequency and nature of EI-R infractions may promote awareness of the need for increased attention to principles of emotional intelligence in dental education and may encourage integration of those principles across dental curricula to help students understand the impact of emotional intelligence on their future well-being and livelihood.

  10. Event-related brain responses to emotional words, pictures, and faces – a cross-domain comparison

    Science.gov (United States)

    Bayer, Mareike; Schacht, Annekathrin

    2014-01-01

    Emotion effects in event-related brain potentials (ERPs) have previously been reported for a range of visual stimuli, including emotional words, pictures, and facial expressions. Still, little is known about the actual comparability of emotion effects across these stimulus classes. The present study aimed to fill this gap by investigating emotion effects in response to words, pictures, and facial expressions using a blocked within-subject design. Furthermore, ratings of stimulus arousal and valence were collected from an independent sample of participants. Modulations of early posterior negativity (EPN) and late positive complex (LPC) were visible for all stimulus domains, but showed clear differences, particularly in valence processing. While emotion effects were limited to positive stimuli for words, they were predominant for negative stimuli in pictures and facial expressions. These findings corroborate the notion of a positivity offset for words and a negativity bias for pictures and facial expressions, which was assumed to be caused by generally lower arousal levels of written language. Interestingly, however, these assumed differences were not confirmed by arousal ratings. Instead, words were rated as overall more positive than pictures and facial expressions. Taken together, the present results point toward systematic differences in the processing of written words and pictorial stimuli of emotional content, not only in terms of a valence bias evident in ERPs, but also concerning their emotional evaluation captured by ratings of stimulus valence and arousal. PMID:25339927

  11. Seeing Red, Feeling Blue: The Impact of State Political Leaning on State Identification Rates for Emotional Disturbance

    Science.gov (United States)

    Wiley, Andrew; Siperstein, Gary

    2011-01-01

    Investigations of why students with emotional disturbance (ED) are underidentified in special education have often focused on economic factors and problems with the definition of ED. The present study focuses on variation in underidentification across states and its relationship to political ideology. State-level political, economic, and…

  12. determinant of psychophysiological state of sportsmen of high qualification with different emotional characteristics

    OpenAIRE

    Korobeynikova L.G.

    2011-01-01

    The purpose of the work was to study the determinants of the psychophysiological state of highly qualified sportsmen with different emotional characteristics. Nineteen highly skilled athletes involved in Greco-Roman wrestling took part in the experiment. The survey was carried out using the hardware-software psychodiagnostic complex "Multipsihometr-05". The emotional background of the athletes, together with features of visual perception and perceptual speed, was determined according to the method of A. Rukavishnikova.

  13. determinant of psychophysiological state of sportsmen of high qualification with different emotional characteristics

    Directory of Open Access Journals (Sweden)

    Korobeynikova L.G.

    2011-04-01

    The purpose of the work was to study the determinants of the psychophysiological state of highly qualified sportsmen with different emotional characteristics. Nineteen highly skilled athletes involved in Greco-Roman wrestling took part in the experiment. The survey was carried out using the hardware-software psychodiagnostic complex "Multipsihometr-05". The emotional background of the athletes, together with features of visual perception and perceptual speed, was determined according to the method of A. Rukavishnikova.

  14. Assessment of brain activities during an emotional stress state using fMRI

    International Nuclear Information System (INIS)

    Hayashi, Takuto; Mizuno-Matsumoto, Yuko; Kawasaki, Aika; Kato, Makoto; Murata, Tsutomu

    2011-01-01

    We investigated cerebral activation using functional magnetic resonance imaging during a mental stress state. Thirty-four healthy adults participated. Before the experiment, we assessed their stress states using the Stress Self-rating Scale and divided the participants into Stress and Non-stress groups. The experiment consisted of 6 trials. Each trial consisted of a 20-s block of emotional audio-visual stimuli (4-s stimulation x 5 slides) and a fixation point. These processes were performed 3 times continuously (Relaxed, Pleasant, and Unpleasant stimuli) in a random order. The results showed that the Non-stress group exhibited activation of the amygdala and hippocampus for the Pleasant and Unpleasant stimuli, while the Stress group exhibited activation of the hippocampus for Pleasant stimuli and of the amygdala and hippocampus for Unpleasant stimuli. These findings suggest that a mental stress state involves reduced emotional processing. Also, the responsiveness of the memory system remained during and after the emotional stress state. (author)

  15. Wavelet Packet Entropy in Speaker-Independent Emotional State Detection from Speech Signal

    Directory of Open Access Journals (Sweden)

    Mina Kadkhodaei Elyaderani

    2015-01-01

    In this paper, wavelet packet entropy is proposed for speaker-independent emotion detection from speech. After pre-processing, a wavelet packet decomposition (wavelet db3, level 4) is calculated, and the Shannon entropy at its nodes is used as a feature. In addition, prosodic features such as the first four formants, jitter (pitch deviation amplitude), and shimmer (energy variation amplitude), besides MFCC features, are applied to complete the feature vector. A Support Vector Machine (SVM) is then used to classify the vectors in a multi-class (all emotions) or two-class (each emotion versus the normal state) format. Forty-six different utterances of a single sentence from the Berlin Emotional Speech Dataset were selected, uttered by 10 speakers in sadness, happiness, fear, boredom, anger, and a normal emotional state. Experimental results show that the proposed features can improve emotional state detection accuracy in the multi-class situation. Furthermore, adding the wavelet entropy coefficients to the other features increases the accuracy of two-class detection for anger, fear, and happiness.
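    The wavelet packet entropy feature described in this record can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the authors' implementation: the db3 filter coefficients are hand-coded, circular correlation is used instead of the boundary handling a signal-processing library would apply, and names such as `wpe_features` are chosen here for the demo.

    ```python
    import numpy as np

    # Daubechies db3 decomposition low-pass filter (orthonormal, sums to sqrt(2)).
    H = np.array([0.3326705529500825, 0.8068915093110924, 0.4598775021184914,
                  -0.13501102001025458, -0.08544127388202666, 0.03522629188570953])
    # Quadrature-mirror high-pass filter: g[k] = (-1)^k * h[N-1-k].
    G = H[::-1] * (-1.0) ** np.arange(len(H))

    def analysis_step(x, f):
        """Circularly correlate x with filter f, then downsample by 2."""
        y = np.zeros(len(x))
        for k in range(len(f)):
            y += f[k] * np.roll(x, -k)
        return y[::2]

    def wavelet_packet_nodes(x, level):
        """Full wavelet packet decomposition: 2**level coefficient arrays."""
        nodes = [np.asarray(x, dtype=float)]
        for _ in range(level):
            nodes = [analysis_step(node, f) for node in nodes for f in (H, G)]
        return nodes

    def shannon_entropy(c, eps=1e-12):
        """Shannon entropy of the normalized coefficient energies of one node."""
        p = c ** 2 / (np.sum(c ** 2) + eps)
        return float(-np.sum(p * np.log(p + eps)))

    def wpe_features(x, level=4):
        """One entropy value per level-4 node -> a 16-dimensional feature vector."""
        return np.array([shannon_entropy(c) for c in wavelet_packet_nodes(x, level)])

    # Demo on a synthetic stand-in for a pre-processed speech frame.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 512, endpoint=False)
    signal = np.sin(2 * np.pi * 40 * t) + 0.3 * rng.standard_normal(512)
    features = wpe_features(signal)
    print(features.shape)  # (16,)
    ```

    In the setup the abstract describes, these 16 entropy values would be concatenated with the formant, jitter, shimmer, and MFCC features and passed to an SVM classifier.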

  16. Psilocybin biases facial recognition, goal-directed behavior, and mood state toward positive relative to negative emotions through different serotonergic subreceptors.

    Science.gov (United States)

    Kometer, Michael; Schmidt, André; Bachmann, Rosilla; Studerus, Erich; Seifritz, Erich; Vollenweider, Franz X

    2012-12-01

    Serotonin (5-HT) 1A and 2A receptors have been associated with dysfunctional emotional processing biases in mood disorders. These receptors further predominantly mediate the subjective and behavioral effects of psilocybin and might be important for its recently suggested antidepressive effects. However, the effect of psilocybin on emotional processing biases and the specific contribution of 5-HT2A receptors across different emotional domains is unknown. In a randomized, double-blind study, 17 healthy human subjects received on 4 separate days placebo, psilocybin (215 μg/kg), the preferential 5-HT2A antagonist ketanserin (50 mg), or psilocybin plus ketanserin. Mood states were assessed by self-report ratings, and behavioral and event-related potential measurements were used to quantify facial emotional recognition and goal-directed behavior toward emotional cues. Psilocybin enhanced positive mood and attenuated recognition of negative facial expression. Furthermore, psilocybin increased goal-directed behavior toward positive compared with negative cues, facilitated positive but inhibited negative sequential emotional effects, and valence-dependently attenuated the P300 component. Ketanserin alone had no effects but blocked the psilocybin-induced mood enhancement and decreased recognition of negative facial expression. This study shows that psilocybin shifts the emotional bias across various psychological domains and that activation of 5-HT2A receptors is central in mood regulation and emotional face recognition in healthy subjects. These findings may not only have implications for the pathophysiology of dysfunctional emotional biases but may also provide a framework to delineate the mechanisms underlying psilocybin's putative antidepressant effects. Copyright © 2012 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  17. THE EMOTIONAL STATE OF PARENTS IN THE STRUCTURE OF THE STAGES OF THE EXPERIENCE OF HAVING A CHILD WITH DEVELOPMENTAL DISABILITIES

    Directory of Open Access Journals (Sweden)

    BOGDANNA ANDREYKO

    2016-09-01

    The article analyses research data and scholarly approaches to the study of: problems of parents arising from their child's illness; the emotional states of parents raising a child with developmental disabilities; and the stages of emotional experience related to the birth of a sick child. The family as an integral unit has to face various situations determined by the social impact of the child's disease or impairment, as well as the parents' emotional and psychological reactions to it. Being aware of the psychological stages singled out in grief theory helps professionals to understand the reactions of the family of a child with developmental disabilities, to realise when and how it is best to intervene, to apply the theory of stages flexibly, and to account for the specific characteristics of a particular family and individual reactions to such shocks.

  18. Making Decisions under Ambiguity: Judgment Bias Tasks for Assessing Emotional State in Animals

    Science.gov (United States)

    Roelofs, Sanne; Boleij, Hetty; Nordquist, Rebecca E.; van der Staay, Franz Josef

    2016-01-01

    Judgment bias tasks (JBTs) are considered as a family of promising tools in the assessment of emotional states of animals. JBTs provide a cognitive measure of optimism and/or pessimism by recording behavioral responses to ambiguous stimuli. For instance, a negative emotional state is expected to produce a negative or pessimistic judgment of an ambiguous stimulus, whereas a positive emotional state produces a positive or optimistic judgment of the same ambiguous stimulus. Measuring an animal’s emotional state or mood is relevant in both animal welfare research and biomedical research. This is reflected in the increasing use of JBTs in both research areas. We discuss the different implementations of JBTs with animals, with a focus on their potential as an accurate measure of emotional state. JBTs have been successfully applied to a very broad range of species, using many different types of testing equipment and experimental protocols. However, further validation of this test is deemed necessary. For example, the often extensive training period required for successful judgment bias testing remains a possible factor confounding results. Also, the issue of ambiguous stimuli losing their ambiguity with repeated testing requires additional attention. Possible improvements are suggested to further develop the JBTs in both animal welfare and biomedical research. PMID:27375454

  19. Experimental methods to validate measures of emotional state and readiness for duty in critical operations

    International Nuclear Information System (INIS)

    Weston, Louise Marie

    2007-01-01

    A recent report on criticality accidents in nuclear facilities indicates that human error played a major role in a significant number of incidents with serious consequences and that some of these human errors may be related to the emotional state of the individual. A pre-shift test to detect a deleterious emotional state could reduce the occurrence of such errors in critical operations. The effectiveness of pre-shift testing is a challenge because of the need to gather predictive data in a relatively short test period and the potential occurrence of learning effects due to a requirement for frequent testing. This report reviews the different types of reliability and validity methods and the testing and statistical analysis procedures used to validate measures of emotional state. The ultimate value of a validation study depends upon the percentage of human errors in critical operations that are due to the emotional state of the individual. A review of the literature to identify the most promising predictors of emotional state for this application is highly recommended.

  20. Evaluation of cognitive load and emotional states during multidisciplinary critical care simulation sessions.

    Science.gov (United States)

    Pawar, Swapnil; Jacques, Theresa; Deshpande, Kush; Pusapati, Raju; Meguerdichian, Michael J

    2018-04-01

    Simulation in the critical care setting involves a heterogeneous group of participants with varied backgrounds and experience. Measuring the impact of simulation on emotional state and cognitive load in this setting is not often performed, and the feasibility of such measurement needs further exploration. Medical and nursing staff with varying levels of experience from a tertiary intensive care unit participated in a standardised clinical simulation scenario. The emotional state of each participant was assessed before and after completion of the scenario using a validated eight-item scale containing bipolar oppositional descriptors of emotion. The cognitive load of each participant was assessed after completion of the scenario using a validated subjective rating tool. A total of 103 medical and nursing staff participated in the study. The participants felt more relaxed after the scenario (-0.28±1.15 vs 0.14±1). The mean cognitive load for all participants was 6.67±1.41. There was no significant difference in cognitive load between medical and nursing staff (6.61±2.3 vs 6.62±1.7; P>0.05). A well-designed, complex, high-fidelity critical care simulation scenario can be evaluated to identify the relative cognitive load of the participants' experience and their emotional state. The movement of learners from a more negative emotional state to a more positive one suggests that simulation can be an effective tool for improved knowledge transfer and offers more opportunity for dynamic thinking.

  1. Attachment and couple satisfaction as predictors of expressed emotion in women facing breast cancer and their partners in the immediate post-surgery period.

    Science.gov (United States)

    Favez, Nicolas; Cairo Notari, Sarah; Antonini, Tania; Charvoz, Linda

    2017-02-01

    To investigate expressed emotion (EE) in couples facing breast cancer in the immediate post-surgery period. EE may be predictive of psychological disturbances that hinder both partners' capacities to cope with the stress of the disease. Severity of the disease, attachment tendencies, and couple satisfaction were tested as predictors of EE. The design was cross-sectional. Couples (N = 61) were interviewed 2 weeks after the women's breast surgery. Expressed emotion was assessed in women and in partners with the Five-Minute Speech Sample, with a focus on overt and covert criticisms. Self-reported EE, attachment tendencies, and couple satisfaction were assessed with questionnaires. Hierarchical regression analyses were performed to test the predictors and possible interactions between them. Both partners expressed overt and covert criticisms; women expressed more overt criticisms than did their partners. Cancer stage was inversely related to the number of overt criticisms in women and to the number of covert criticisms in partners. Regression analyses showed that in women, higher attachment anxiety and lower couple satisfaction were positive predictors of overt criticisms; in partners, a higher cancer stage was a negative predictor of overt and covert criticisms. Practitioners should pay attention to the couple relationship in breast cancer. EE is most likely to appear when the cancer stage is low, showing that even when the medical prognosis is optimal, relational and emotional disturbances may occur. Statement of contribution What is already known on this subject? The couple relationship is of paramount importance in breast cancer. Expressed emotion (EE) is related to negative individual and relational psychological outcomes in psychiatric and somatic diseases. Expressed emotion has not yet been studied in the context of breast cancer. What does this study add? Expressed emotion is present in breast cancer situations, especially when the cancer stage is low. There

  2. The impact of oxytocin administration and maternal love withdrawal on event-related potential (ERP) responses to emotional faces with performance feedback.

    Science.gov (United States)

    Huffmeijer, Renske; Alink, Lenneke R A; Tops, Mattie; Grewen, Karen M; Light, Kathleen C; Bakermans-Kranenburg, Marian J; van Ijzendoorn, Marinus H

    2013-03-01

    This is the first experimental study to use event-related potentials (ERPs) to examine the effect of oxytocin administration on the neural processing of facial stimuli in female participants. Using a double-blind, placebo-controlled, within-subjects design, we studied the effects of 16 IU of intranasal oxytocin on ERPs to pictures combining performance feedback with emotional facial expressions in 48 female undergraduate students. Participants also reported on the amount of love withdrawal they experienced from their mothers. Vertex positive potential (VPP) and late positive potential (LPP) amplitudes were more positive after oxytocin than after placebo administration. This suggests that oxytocin increased attention to the feedback stimuli (LPP) and enhanced the processing of emotional faces (VPP). Oxytocin heightened processing of the happy and disgusted faces primarily in those reporting less love withdrawal. Significant associations with LPP amplitude suggest that more maternal love withdrawal is related to the allocation of attention toward the motivationally relevant combination of negative feedback with a disgusted face. Copyright © 2012 Elsevier Inc. All rights reserved.
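A minimal sketch of how ERP component amplitudes such as the VPP and LPP described above are commonly quantified: the mean voltage over a fixed post-stimulus time window. The sampling rate, window bounds, and the synthetic epoch are illustrative assumptions, not values taken from the study.

```python
import numpy as np

fs = 500  # sampling rate in Hz (assumed)
# Synthetic single-trial epoch covering 0-800 ms after stimulus onset.
epoch = np.random.default_rng(3).normal(size=int(0.8 * fs))

def mean_amplitude(epoch, t0_ms, t1_ms, fs):
    """Mean voltage between t0_ms and t1_ms after stimulus onset."""
    return epoch[int(t0_ms / 1000 * fs):int(t1_ms / 1000 * fs)].mean()

vpp = mean_amplitude(epoch, 140, 200, fs)  # VPP window around ~170 ms (assumed)
lpp = mean_amplitude(epoch, 400, 800, fs)  # late positivity window (assumed)
print(vpp, lpp)
```

In practice these means would be computed on condition-averaged epochs per participant and then compared across the oxytocin and placebo sessions.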

  3. Adjunctive selective estrogen receptor modulator increases neural activity in the hippocampus and inferior frontal gyrus during emotional face recognition in schizophrenia.

    Science.gov (United States)

    Ji, E; Weickert, C S; Lenroot, R; Kindler, J; Skilleter, A J; Vercammen, A; White, C; Gur, R E; Weickert, T W

    2016-05-03

    Estrogen has been implicated in the development and course of schizophrenia, with most evidence suggesting a neuroprotective effect. Treatment with raloxifene, a selective estrogen receptor modulator, can reduce symptom severity, improve cognition and normalize brain activity during learning in schizophrenia. People with schizophrenia are especially impaired in the identification of negative facial emotions. The present study was designed to determine the extent to which adjunctive raloxifene treatment would alter abnormal neural activity during angry facial emotion recognition in schizophrenia. Twenty people with schizophrenia (12 men, 8 women) participated in a 13-week, randomized, double-blind, placebo-controlled, crossover trial of adjunctive raloxifene treatment (120 mg per day orally) and performed a facial emotion recognition task during functional magnetic resonance imaging after each treatment phase. Two-sample t-tests in regions of interest selected a priori were performed to assess activation differences between raloxifene and placebo conditions during the recognition of angry faces. Adjunctive raloxifene significantly increased activation in the right hippocampus and left inferior frontal gyrus compared with the placebo condition (family-wise error corrected) during facial emotion recognition in schizophrenia. These findings support the hypothesis that estrogen plays a modifying role in schizophrenia and show that adjunctive raloxifene treatment may reverse abnormal neural activity during facial emotion recognition, which is relevant to impaired social functioning in men and women with schizophrenia.
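The ROI analysis described above compares activation between treatment conditions with two-sample t-tests. A hedged sketch using SciPy, with made-up per-subject contrast estimates; a paired test is also shown, since the crossover design would support it.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical per-subject mean activation (contrast estimates) extracted
# from an a priori ROI (e.g. right hippocampus) under each treatment phase.
# These numbers are simulated, not taken from the study.
raloxifene = rng.normal(loc=0.8, scale=0.4, size=20)
placebo = rng.normal(loc=0.5, scale=0.4, size=20)

# The abstract reports two-sample t-tests between conditions; a paired test
# would also fit the within-subject crossover design, so both are shown.
t2, p2 = stats.ttest_ind(raloxifene, placebo)
tp, pp = stats.ttest_rel(raloxifene, placebo)
print(f"two-sample: t={t2:.2f} p={p2:.3f}; paired: t={tp:.2f} p={pp:.3f}")
```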

  4. Contributions of emotional state and attention to the processing of syntactic agreement errors: Evidence from P600

    NARCIS (Netherlands)

    Verhees, M.W.F.T.; Chwilla, D.J.; Tromp, J.; Vissers, C.T.W.M.

    2015-01-01

    The classic account of language is that language processing occurs in isolation from other cognitive systems, like perception, motor action, and emotion. The central theme of this paper is the relationship between a participant's emotional state and language comprehension. Does emotional context …

  5. Influence of Emotional States on the Functioning of Perceptual Sphere and Characteristics of the Personality

    Directory of Open Access Journals (Sweden)

    Polyakova Irina Vadimovna

    2015-09-01

    The article deals with the psychological interplay between the formation of perceptual skills and personal qualities. The purpose of the study was to determine the characteristics of this relationship during reproduction of a given sample in different emotional states. It is hypothesized that there is a connection between personal qualities such as spontaneous aggressiveness, depression, irritability, and emotional lability and the functioning of the sensory-perceptual sphere during reproduction of a given sample. Fifty-five students of Smolensk State University took part in the study. The instruments consisted of the FPI and ITO techniques, the R. Plutchik technique, and measurement of fine motor precision in different emotional states of the subject. In the experimental part of the work, a research tool created specifically for this purpose measured reproduction errors for the subject's right and left hands, which were then correlated with the questionnaire results. Comparing the pressure indicators on the measuring levers for each hand clarified the hands' specific contributions to skill formation and their strong correlation with psychological features of the person. The authors conclude that a change of emotional state transforms perception of the sample; that in a state of emotional arousal, right-hand errors in reproducing the sample increase more rapidly than left-hand errors, compared with the same task in a state the participants assessed as normal working conditions; that changes in emotional state affect the fidelity of reproduction; and that examinees do not fully appreciate their errors in reproducing the given standards.

  6. Designing Emotionally Expressive Robots

    DEFF Research Database (Denmark)

    Tsiourti, Christiana; Weiss, Astrid; Wac, Katarzyna

    2017-01-01

    Socially assistive agents, be they virtual avatars or robots, need to engage in social interactions with humans and express their internal emotional states, goals, and desires. In this work, we conducted a comparative study to investigate how humans perceive emotional cues expressed by humanoid robots through five communication modalities (face, head, body, voice, locomotion) and examined whether the degree of a robot's human-like embodiment affects this perception. In an online survey, we asked people to identify emotions communicated by Pepper – a highly human-like robot – and Hobbit – a robot … for robots.

  7. Computer-Assisted Face Processing Instruction Improves Emotion Recognition, Mentalizing, and Social Skills in Students with ASD

    Science.gov (United States)

    Rice, Linda Marie; Wall, Carla Anne; Fogel, Adam; Shic, Frederick

    2015-01-01

    This study examined the extent to which a computer-based social skills intervention called "FaceSay"™ was associated with improvements in affect recognition, mentalizing, and social skills of school-aged children with Autism Spectrum Disorder (ASD). "FaceSay"™ offers students simulated practice with eye gaze, joint attention,…

  8. The Perception of Four Basic Emotions in Human and Nonhuman Faces by Children with Autism and Other Developmental Disabilities

    Science.gov (United States)

    Gross, Thomas F.

    2004-01-01

    Children with autism, mental retardation, or language disorders, and children in a clinical control group, were shown photographs of human female, orangutan, and canine (boxer) faces expressing happiness, sadness, anger, surprise, and a neutral expression. For each species of faces, children were asked to identify the happy, sad, angry,…

  9. Cognitive appraisal of environmental stimuli induces emotion-like states in fish.

    Science.gov (United States)

    Cerqueira, M; Millot, S; Castanheira, M F; Félix, A S; Silva, T; Oliveira, G A; Oliveira, C C; Martins, C I M; Oliveira, R F

    2017-10-13

    The occurrence of emotions in non-human animals has been the focus of debate over the years. Recently, an interest in expanding this debate to non-tetrapod vertebrates and to invertebrates has emerged. Within vertebrates, the study of emotion in teleosts is particularly interesting since they represent a divergent evolutionary radiation from that of tetrapods, and thus they provide an insight into the evolution of the biological mechanisms of emotion. We report that sea bream exposed to stimuli that vary according to valence (positive, negative) and salience (predictable, unpredictable) exhibit different behavioural, physiological and neuromolecular states. Since, according to the dimensional theory of emotion, valence and salience define a two-dimensional affective space, our data can be interpreted as evidence for the occurrence of distinctive affective states in fish corresponding to each of the four quadrants of the core affective space. Moreover, the fact that the same stimuli presented in a predictable vs. unpredictable way elicited different behavioural, physiological and neuromolecular states suggests that stimulus appraisal by the individual, rather than an intrinsic characteristic of the stimulus, triggered the observed responses. Therefore, our data support the occurrence of emotion-like states in fish that are regulated by the individual's perception of environmental stimuli.
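The dimensional account above treats valence and salience (predictability) as the two axes of a core affective space. A toy sketch of the resulting quadrant mapping; the quadrant labels are hypothetical illustrations, not terms from the study.

```python
def core_affect_quadrant(valence: str, predictability: str) -> str:
    """Map a stimulus to a quadrant of the 2-D core affective space
    (valence x salience). Quadrant labels are illustrative only."""
    if valence not in ("positive", "negative"):
        raise ValueError("valence must be 'positive' or 'negative'")
    if predictability not in ("predictable", "unpredictable"):
        raise ValueError("predictability must be 'predictable' or 'unpredictable'")
    quadrants = {
        ("positive", "predictable"): "Q1: calm, content-like state",
        ("positive", "unpredictable"): "Q2: excited-like state",
        ("negative", "unpredictable"): "Q3: fearful, distress-like state",
        ("negative", "predictable"): "Q4: low-arousal negative state",
    }
    return quadrants[(valence, predictability)]

print(core_affect_quadrant("positive", "unpredictable"))
```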

  10. A grounded theory of young tennis players' use of music to manipulate emotional state.

    Science.gov (United States)

    Bishop, Daniel T; Karageorghis, Costas I; Loizou, Georgios

    2007-10-01

    The main objectives of this study were (a) to elucidate young tennis players' use of music to manipulate emotional states, and (b) to present a model grounded in present data to illustrate this phenomenon and to stimulate further research. Anecdotal evidence suggests that music listening is used regularly by elite athletes as a preperformance strategy, but only limited empirical evidence corroborates such use. Young tennis players (N = 14) were selected purposively for interview and diary data collection. Results indicated that participants consciously selected music to elicit various emotional states; frequently reported consequences of music listening included improved mood, increased arousal, and visual and auditory imagery. The choice of music tracks and the impact of music listening were mediated by a number of factors, including extramusical associations, inspirational lyrics, music properties, and desired emotional state. Implications for the future investigation of preperformance music are discussed.

  11. Multimodal emotional state recognition using sequence-dependent deep hierarchical features.

    Science.gov (United States)

    Barros, Pablo; Jirak, Doreen; Weber, Cornelius; Wermter, Stefan

    2015-12-01

    Emotional state recognition has become an important topic for human-robot interaction in recent years. By determining emotion expressions, robots can identify important variables of human behavior and use these to communicate in a more human-like fashion, thereby extending the interaction possibilities. Human emotions are multimodal and spontaneous, which makes them hard for robots to recognize. Each modality has its own restrictions and constraints which, together with the non-structured behavior of spontaneous expressions, create several difficulties for the approaches in the literature, which rely on explicit feature extraction techniques and manual modality fusion. Our model uses a hierarchical feature representation to deal with spontaneous emotions and learns how to integrate multiple modalities for non-verbal emotion recognition, making it suitable for use in an HRI scenario. Our experiments show that a significant improvement in recognition accuracy is achieved when we use hierarchical features and multimodal information, and our model improves on the accuracy of state-of-the-art approaches, from the 82.5% reported in the literature to 91.3%, on a benchmark dataset of spontaneous emotion expressions. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
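A hedged sketch of the general idea, hierarchical per-modality features followed by learned late fusion, using random projections in place of trained network layers. The dimensions, layer counts, and four-class output are assumptions, not details of the published model.

```python
import numpy as np

rng = np.random.default_rng(0)

def hierarchical_features(x, layers):
    """Pass a raw modality input through a stack of linear + ReLU layers,
    a stand-in for the learned hierarchical features in the paper."""
    h = x
    for W in layers:
        h = np.maximum(0.0, h @ W)  # ReLU non-linearity
    return h

# Two modalities (e.g. face and body motion); feature sizes are made up.
face = rng.normal(size=64)
body = rng.normal(size=32)
face_layers = [rng.normal(size=(64, 32)), rng.normal(size=(32, 16))]
body_layers = [rng.normal(size=(32, 16)), rng.normal(size=(16, 16))]

# Late fusion: concatenate each modality's top-level features and feed
# them to a shared classifier layer (here a random linear map + softmax).
fused = np.concatenate([hierarchical_features(face, face_layers),
                        hierarchical_features(body, body_layers)])
W_out = rng.normal(size=(fused.size, 4))  # 4 emotion classes (assumed)
logits = fused @ W_out
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs.argmax())  # index of the predicted emotion class
```

In the real model, the layer weights and the fusion classifier would be trained jointly, which is what lets the network learn *how* to weight each modality rather than fusing them by hand.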

  12. The Contact State Monitoring for Seal End Faces Based on Acoustic Emission Detection

    Directory of Open Access Journals (Sweden)

    Xiaohui Li

    2016-01-01

    Monitoring the contact state of seal end faces can provide early warning of seal failure. In acoustic emission (AE) detection for mechanical seals, the main difficulties are reducing the background noise and classifying the dispersed features. To solve these problems and achieve higher detection rates, a new approach based on a genetic particle filter with autoregression (AR-GPF) and a hypersphere support vector machine (HSSVM) is presented. First, an AR model is used to build the dynamic state space (DSS) of the AE signal, and the GPF is used for signal filtering. Then, multiple features are extracted, and a classification model based on the HSSVM is constructed for state recognition. In this approach …
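A simplified sketch of the filtering stage described above: an AR(2) model defines the dynamic state space, and a bootstrap particle filter (a plain version of the genetic particle filter, without its genetic selection operators) denoises a simulated AE-like signal. All coefficients and noise levels are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed AR(2) dynamics standing in for the AE signal's state space.
a1, a2 = 0.6, 0.3   # AR coefficients (assumed)
q, r = 0.1, 0.5     # process / measurement noise std (assumed)

# Simulate a clean AR(2) signal x and a noisy observation y of it.
n = 200
x = np.zeros(n)
for t in range(2, n):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + rng.normal(scale=q)
y = x + rng.normal(scale=r, size=n)

# Bootstrap particle filter over the AR(2) state (x[t-1], x[t-2]).
n_p = 500
particles = np.zeros((n_p, 2))
est = np.zeros(n)
for t in range(n):
    # Propagate each particle through the AR(2) model.
    pred = a1 * particles[:, 0] + a2 * particles[:, 1] + rng.normal(scale=q, size=n_p)
    # Weight by the Gaussian likelihood of the current observation.
    w = np.exp(-0.5 * ((y[t] - pred) / r) ** 2)
    w /= w.sum()
    est[t] = np.sum(w * pred)               # weighted state estimate
    idx = rng.choice(n_p, size=n_p, p=w)    # multinomial resampling
    particles = np.column_stack([pred[idx], particles[idx, 0]])

# The filtered estimate should track the clean signal better than raw y.
print("filtered MSE:", np.mean((est - x) ** 2))
print("raw MSE:     ", np.mean((y - x) ** 2))
```

The AR-GPF of the paper would replace the plain multinomial resampling with genetic operators to fight particle degeneracy; the extracted features would then go to the HSSVM classifier.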