WorldWideScience

Sample records for perceiving emotional faces

  1. The Perception of Time While Perceiving Dynamic Emotional Faces

    Directory of Open Access Journals (Sweden)

    Wang On Li

    2015-08-01

    Full Text Available Emotion plays an essential role in the perception of time such that time is perceived to fly when events are enjoyable, while unenjoyable moments are perceived to drag. Previous studies have reported a time-drag effect when participants are presented with emotional facial expressions, regardless of the emotion presented. This effect can hardly be explained by induced emotion, given the heterogeneous nature of emotional expressions. We conducted two experiments (n = 44 and n = 39) to examine the cognitive mechanism underlying this effect by presenting dynamic sequences of emotional expressions to participants. Each sequence started with a particular expression and then morphed into another. Presenting dynamic facial expressions allows the time-drag effect of homogeneous pairs of emotional expressions sharing similar valence and arousal to be compared with that of heterogeneous pairs. Sequences of seven durations (400, 600, 800, 1,000, 1,200, 1,400, and 1,600 ms) were presented to participants, who were asked to judge whether each sequence was closer to 400 ms or 1,600 ms in a two-alternative forced-choice task. The data were then collated by condition and fitted with cumulative Gaussian curves to estimate the point of subjective equivalence, indicating the perceived duration of 1,000 ms. Consistent with previous reports, a feeling of time dragging was induced regardless of the sequence presented, such that a 1,000 ms sequence was perceived as lasting longer than 1,000 ms. In addition, dynamic facial expressions exerted a greater effect on perceived time drag than static expressions. The effect was most prominent when the dynamics involved an angry face or a change in valence. The significance of this sensitivity is discussed in terms of emotion perception and its evolutionary implications for our attention mechanism.
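    As a rough illustration of the analysis described above, the following minimal Python sketch (not the authors' code; the response proportions are invented placeholders) fits a cumulative Gaussian psychometric function to 2AFC bisection data to estimate the point of subjective equivalence (PSE):

```python
# Minimal sketch: estimating the PSE from the proportion of "closer to 1,600 ms"
# judgements at each probe duration. Data values are illustrative only.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])   # probe durations (ms)
p_long = np.array([0.05, 0.15, 0.40, 0.65, 0.85, 0.95, 0.98])   # proportion judged "long"

def cumulative_gaussian(x, mu, sigma):
    """Psychometric function: probability of a 'long' judgement."""
    return norm.cdf(x, loc=mu, scale=sigma)

(pse, spread), _ = curve_fit(cumulative_gaussian, durations, p_long, p0=[1000, 200])
print(f"PSE = {pse:.0f} ms (a PSE below 1,000 ms indicates perceived time drag)")
```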

  2. Modulation of the composite face effect by unintended emotion cues.

    Science.gov (United States)

    Gray, Katie L H; Murphy, Jennifer; Marsh, Jade E; Cook, Richard

    2017-04-01

    When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers perceptually 'fuse' the two halves together. The illusory distortion induced by task-irrelevant ('distractor') halves hinders participants' judgements about task-relevant ('target') halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, observers frequently perceive emotion in ostensibly neutral faces, contrary to the intentions of experimenters. This study sought to determine whether this 'perceived emotion' influences the composite-face effect. In our first experiment, we confirmed that the composite effect grew stronger as the strength of distractor emotion increased. Critically, effects of distractor emotion were induced by weak emotion intensities, and were incidental insofar as emotion cues hindered image matching, not emotion labelling per se. In Experiment 2, we found a correlation between the presence of perceived emotion in a set of ostensibly neutral distractor regions sourced from commonly used face databases, and the strength of illusory distortion they induced. In Experiment 3, participants completed a sequential matching composite task in which half of the distractor regions had been rated high and half low for perceived emotion. Significantly stronger composite effects were induced by the high-emotion distractor halves. These convergent results suggest that perceived emotion increases the strength of the composite-face effect induced by supposedly emotionless faces. These findings have important implications for the study of holistic face processing in typical and atypical populations.

  3. The complex duration perception of emotional faces: Effects of face direction

    Directory of Open Access Journals (Sweden)

    Katrin Martina Kliegl

    2015-03-01

    Full Text Available The perceived duration of emotional face stimuli strongly depends on the expressed emotion. However, emotional faces also differ with regard to a number of other features, such as gaze, face direction, or sex. Usually, these features have been controlled for by using only pictures of female models with direct gaze and frontal face orientation. Doi and Shinohara (2009) reported that an overestimation of angry faces could only be found when the model's gaze was oriented towards the observer. We aimed to replicate this effect for face direction. Moreover, we explored the effect of face direction on the perceived duration of sad faces. Controlling for the sex of the face model and of the participant, female and male participants rated the duration of neutral, angry and sad face stimuli of both sexes photographed from different perspectives in a bisection task. In line with current findings, we report a significant overestimation of angry compared to neutral face stimuli that was modulated by face direction. Moreover, the perceived duration of sad face stimuli did not differ from that of neutral faces and was not influenced by face direction. Furthermore, we found that faces of the opposite sex appeared to last longer than those of the same sex. This outcome is discussed with regard to stimulus parameters such as induced arousal, social relevance and an evolutionary context.

  4. Perceived emotion genuineness: normative ratings for popular facial expression stimuli and the development of perceived-as-genuine and perceived-as-fake sets.

    Science.gov (United States)

    Dawel, Amy; Wright, Luke; Irons, Jessica; Dumbleton, Rachael; Palermo, Romina; O'Kearney, Richard; McKone, Elinor

    2017-08-01

    In everyday social interactions, people's facial expressions sometimes reflect genuine emotion (e.g., anger in response to a misbehaving child) and sometimes do not (e.g., smiling for a school photo). There is increasing theoretical interest in this distinction, but little is known about perceived emotion genuineness for existing facial expression databases. We present a new method for rating perceived genuineness using a neutral-midpoint scale (-7 = completely fake; 0 = don't know; +7 = completely genuine) that, unlike previous methods, provides data on both relative and absolute perceptions. Normative ratings from typically developing adults for five emotions (anger, disgust, fear, sadness, and happiness) provide three key contributions. First, the widely used Pictures of Facial Affect (PoFA; i.e., "the Ekman faces") and the Radboud Faces Database (RaFD) are typically perceived as not showing genuine emotion. Also, in the only published set for which the actual emotional states of the displayers are known (via self-report; the McLellan faces), percepts of emotion genuineness often do not match actual emotion genuineness. Second, we provide genuine/fake norms for 558 faces from several sources (PoFA, RaFD, KDEF, Gur, FacePlace, McLellan, News media), including a list of 143 stimuli that are event-elicited (rather than posed) and, congruently, perceived as reflecting genuine emotion. Third, using the norms we develop sets of perceived-as-genuine (from event-elicited sources) and perceived-as-fake (from posed sources) stimuli, matched on sex, viewpoint, eye-gaze direction, and rated intensity. We also outline the many types of research questions that these norms and stimulus sets could be used to answer.

  5. Body Weight Can Change How Your Emotions Are Perceived.

    Directory of Open Access Journals (Sweden)

    Yujung Oh

    Full Text Available Accurately interpreting others' emotions through facial expressions has important adaptive value for social interactions. However, due to the stereotypical social perception of overweight individuals as carefree, humorous, and light-hearted, the body weight of those with whom we interact may systematically influence our emotion judgments even though it has no relevance to the expressed emotion itself. In this experimental study, we examined the role of body weight in faces on the affective perception of facial expressions. We hypothesized that the weight perceived in a face would bias the assessment of an emotional expression, with overweight faces generally more likely to be perceived as having more positive and less negative expressions than healthy weight faces. Using two-alternative forced-choice perceptual decision tasks, participants were asked to sort the emotional expressions of overweight and healthy weight facial stimuli, gradually morphed across six emotional intensity levels, into one of two categories: "neutral vs. happy" (Experiment 1) or "neutral vs. sad" (Experiment 2). As predicted, our results demonstrated that overweight faces were more likely to be categorized as happy (i.e., a lower happy decision threshold) and less likely to be categorized as sad (i.e., a higher sad decision threshold) compared to healthy weight faces with the same levels of emotional intensity. The neutral-sad decision threshold shift was negatively correlated with participants' own fear of becoming fat; that is, those without a fear of becoming fat perceived overweight faces as sad more strongly than those with a higher fear did. These findings demonstrate that the weight of a face systematically influences how its emotional expression is interpreted, suggesting that being overweight may make emotional expressions appear happier and less sad than they really are.

  6. The contribution of emotional empathy to approachability judgements assigned to emotional faces is context specific

    Directory of Open Access Journals (Sweden)

    Megan L Willis

    2015-08-01

    Full Text Available Previous research on approachability judgements has indicated that facial expressions modulate how these judgements are made, but the relationship between emotional empathy and context in this decision-making process has not yet been examined. This study examined the contribution of emotional empathy to approachability judgements assigned to emotional faces in different contexts. One hundred and twenty female participants completed the Questionnaire Measure of Emotional Empathy. Participants provided approachability judgements for faces displaying angry, disgusted, fearful, happy, neutral and sad expressions, in three different contexts: when evaluating whether they would approach another individual (1) to receive help or (2) to give help, or (3) when no contextual information was provided. In addition, participants were also required to rate perceived threat and emotional intensity, and to label the facial expressions. Emotional empathy significantly predicted approachability ratings for specific emotions in each context, over and above the contribution of perceived threat and intensity, which were associated with emotional empathy. Higher emotional empathy predicted less willingness to approach people with angry and disgusted faces to receive help, and a greater willingness to approach people with happy faces to receive help. Higher emotional empathy also predicted a greater willingness to approach people with sad faces to offer help, and more willingness to approach people with happy faces when no contextual information was provided. These results highlight the important contribution of individual differences in emotional empathy in predicting how approachability judgements are assigned to facial expressions in context.

  7. Differential emotion attribution to neutral faces of own and other races.

    Science.gov (United States)

    Hu, Chao S; Wang, Qiandong; Han, Tong; Weare, Ethan; Fu, Genyue

    2017-02-01

    Past research has demonstrated differential recognition of emotion on faces of different races. This paper reports the first study to explore differential emotion attribution to neutral faces of different races. Chinese and Caucasian adults viewed a series of Chinese and Caucasian neutral faces and judged their outward facial expression: neutral, positive, or negative. The results showed that both Chinese and Caucasian viewers perceived more Chinese faces than Caucasian faces as neutral. Nevertheless, Chinese viewers attributed positive emotion to Caucasian faces more than to Chinese faces, whereas Caucasian viewers attributed negative emotion to Caucasian faces more than to Chinese faces. Moreover, Chinese viewers attributed negative and neutral emotion to the faces of both races with no significant difference in frequency, whereas Caucasian viewers mostly attributed neutral emotion to the faces. These differences between Chinese and Caucasian viewers may be due to differences in visual experience, culture, racial stereotypes, or expectations about the experiment. We also used eye tracking among the Chinese participants to explore the relationship between face-processing strategy and emotion attribution to neutral faces. The results showed that emotion attribution and face race interacted significantly in their effects on face-processing measures such as the proportion of fixations on the eyes and saccade amplitude. Additionally, pupil size was larger when processing Caucasian faces than when processing Chinese faces.

  8. Serotonergic modulation of face-emotion recognition

    Directory of Open Access Journals (Sweden)

    C.M. Del-Ben

    2008-04-01

    Full Text Available Facial expressions of basic emotions have been widely used to investigate the neural substrates of emotion processing, but little is known about the exact meaning of subjective changes provoked by perceiving facial expressions. Our assumption was that fearful faces would be related to the processing of potential threats, whereas angry faces would be related to the processing of proximal threats. Experimental studies have suggested that serotonin modulates the brain processes underlying defensive responses to environmental threats, facilitating risk assessment behavior elicited by potential threats and inhibiting fight or flight responses to proximal threats. In order to test these predictions about the relationship between fearful and angry faces and defensive behaviors, we carried out a review of the literature about the effects of pharmacological probes that affect 5-HT-mediated neurotransmission on the perception of emotional faces. The hypothesis that angry faces would be processed as a proximal threat and that, as a consequence, their recognition would be impaired by an increase in 5-HT function was not supported by the results reviewed. In contrast, most of the studies that evaluated the behavioral effects of serotonin challenges showed that increased 5-HT neurotransmission facilitates the recognition of fearful faces, whereas its decrease impairs the same performance. These results agree with the hypothesis that fearful faces are processed as potential threats and that 5-HT enhances this brain processing.

  9. Investigating emotional top down modulation of ambiguous faces by single pulse TMS on early visual cortices

    Directory of Open Access Journals (Sweden)

    Zachary Adam Yaple

    2016-06-01

    Full Text Available Top-down processing is a mechanism in which memory, context and expectation are used to perceive stimuli. In this study we investigated how emotional content, induced by music mood, influences the perception of happy and sad emoticons. Using single-pulse TMS we stimulated the right occipital face area (rOFA), primary visual cortex (V1) and the vertex while subjects performed a face-detection task and listened to happy and sad music. At baseline, incongruent audio-visual pairings decreased performance, demonstrating that the perception of ambiguous faces depends on emotional context. However, face-identification performance decreased during rOFA stimulation regardless of emotional content, and no effects were found for vertex (Cz) or V1 stimulation. These results suggest that the rOFA is important for processing faces regardless of emotion, whereas stimulation of early visual cortex had no effect. Our findings suggest that early visual cortex activity may not integrate emotional auditory information with visual information during top-down emotional modulation of faces.

  10. Social Media Use and Perceived Emotional Support Among US Young Adults

    Science.gov (United States)

    Shensa, Ariel; Sidani, Jaime E.; Lin, Liu yi; Bowman, Nicholas; Primack, Brian A.

    2015-01-01

    Low emotional support is associated with poor health outcomes. Engagement with face-to-face social networks is one way of increasing emotional support. However, it is not yet known whether engagement with proliferating electronic social networks is similarly associated with increased emotional support. Thus, the purpose of this study was to assess associations between social media use and perceived emotional support in a large, nationally representative sample. In October 2014, we collected data from 1,796 U.S. adults ages 19-32. We assessed social media use using both total time spent and frequency of visits to each of the 11 most popular social media platforms. Our dependent variable was perceived emotional support as measured by the brief Patient-Reported Outcomes Measurement Information System (PROMIS) emotional support scale. A multivariable model including all sociodemographic covariates and accounting for survey weights demonstrated that, compared with the lowest quartile of time on social media, being in the highest quartile (spending two or more hours per day) was significantly associated with decreased odds of having higher perceived emotional support (AOR = 0.62, 95% CI = 0.40, 0.94). However, compared with those in the lowest quartile, being in the highest quartile of frequency of social media use was not significantly associated with perceived emotional support (AOR = 0.70, 95% CI = 0.45, 1.09). In conclusion, while the cross-sectional nature of these data hinders inference regarding directionality, it seems that heavy users of social media may actually feel less, not more, emotional support. PMID:26613936
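    As a rough illustration of the weighted multivariable model reported above (adjusted odds ratios with 95% CIs), here is a minimal Python sketch. All data, variable names and covariates are invented, and treating survey weights as GLM frequency weights is a simplification of a full survey-design analysis:

```python
# Hypothetical sketch of a weighted logistic model yielding an adjusted odds ratio (AOR).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
# Synthetic stand-in data; column names only mirror the study's design.
df = pd.DataFrame({
    "high_support": rng.integers(0, 2, n),   # 1 = higher perceived emotional support
    "time_q4": rng.integers(0, 2, n),        # 1 = top quartile of daily social media time
    "age": rng.integers(19, 33, n),
    "sex": rng.choice(["female", "male"], n),
    "svy_wt": rng.uniform(0.5, 2.0, n),      # survey weight
})

fit = smf.glm(
    "high_support ~ time_q4 + age + sex",
    data=df,
    family=sm.families.Binomial(),
    freq_weights=np.asarray(df["svy_wt"]),   # simplification: survey weights as frequency weights
).fit()

aor = np.exp(fit.params["time_q4"])                      # adjusted odds ratio for the exposure
ci_low, ci_high = np.exp(fit.conf_int().loc["time_q4"])  # 95% confidence interval
print(f"AOR = {aor:.2f}, 95% CI = ({ci_low:.2f}, {ci_high:.2f})")
```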

  11. Neurons in the human amygdala selective for perceived emotion

    Science.gov (United States)

    Wang, Shuo; Tudusciuc, Oana; Mamelak, Adam N.; Ross, Ian B.; Adolphs, Ralph; Rutishauser, Ueli

    2014-01-01

    The human amygdala plays a key role in recognizing facial emotions, and neurons in the monkey and human amygdala respond to the emotional expression of faces. However, it remains unknown whether these responses are driven primarily by properties of the stimulus or by the perceptual judgments of the perceiver. We investigated these questions by recording from over 200 single neurons in the amygdalae of 7 neurosurgical patients with implanted depth electrodes. We presented degraded fear and happy faces and asked subjects to discriminate their emotion by button press. During trials where subjects responded correctly, we found neurons that distinguished fear vs. happy emotions as expressed by the displayed faces. During incorrect trials, these neurons indicated the patients' subjective judgment. Additional analysis revealed that, on average, all neuronal responses were modulated most by increases or decreases in response to happy faces, and driven predominantly by judgments about the eye region of the face stimuli. Applying the same analyses, we showed that hippocampal neurons, unlike amygdala neurons, encoded only the displayed emotion and not the subjective judgment. Our results suggest that the amygdala specifically encodes the subjective judgment of emotional faces, but that it plays less of a role in simply encoding aspects of the image array. The conscious percept of the emotion shown in a face may thus arise from interactions between the amygdala and its connections within a distributed cortical network, a scheme also consistent with the long response latencies observed in human amygdala recordings. PMID:24982200

  12. Perceived intimacy of expressed emotion.

    Science.gov (United States)

    Howell, A; Conway, M

    1990-08-01

    Research on norms for emotional expression and self-disclosure provided the basis for two hypotheses concerning the perceived intimacy of emotional self-disclosure. The first hypothesis was that the perceived intimacy of negative emotional disclosure would be greater than that of positive emotional disclosure; the second was that disclosures of more intense emotional states would be perceived as more intimate than disclosures of less intense emotional states for both negative and positive disclosures. Both hypotheses received support when male students in Canada rated the perceived intimacy of self-disclosures that were equated for topic and that covered a comprehensive sample of emotions and a range of emotional intensities. The effects were observed across all the topics of disclosure examined.

  13. Emotion Words: Adding Face Value.

    Science.gov (United States)

    Fugate, Jennifer M B; Gendron, Maria; Nakashima, Satoshi F; Barrett, Lisa Feldman

    2017-06-12

    Despite a growing number of studies suggesting that emotion words affect perceptual judgments of emotional stimuli, little is known about how emotion words affect perceptual memory for emotional faces. In Experiments 1 and 2 we tested how emotion words (compared with control words) affected participants' abilities to select a target emotional face from among distractor faces. Participants were generally more likely to false alarm to distractor emotional faces when primed with an emotion word congruent with the face (compared with a control word). Moreover, participants showed both decreased sensitivity (d') to discriminate between target and distractor faces, as well as altered response biases (c; more likely to answer "yes") when primed with an emotion word (compared with a control word). In Experiment 3 we showed that emotion words had more of an effect on perceptual memory judgments when the structural information in the target face was limited, as well as when participants were only able to categorize the face with a partially congruent emotion word. The overall results are consistent with the idea that emotion words affect the encoding of emotional faces in perceptual memory. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
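    The sensitivity and bias measures mentioned above (d' and the criterion c) can be computed from hit and false-alarm counts as in the following minimal Python sketch; the counts are illustrative, not data from the study:

```python
# Signal-detection measures: sensitivity (d') and response bias (c).
from scipy.stats import norm

def dprime_and_c(hits, misses, false_alarms, correct_rejections):
    # Log-linear correction avoids infinite z-scores when a rate is 0 or 1
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa             # discriminability of target vs. distractor faces
    criterion = -0.5 * (z_hit + z_fa)  # negative values = liberal bias (answering "yes" more often)
    return d_prime, criterion

print(dprime_and_c(hits=40, misses=10, false_alarms=20, correct_rejections=30))
```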

  14. A note on age differences in mood-congruent vs. mood-incongruent emotion processing in faces.

    Science.gov (United States)

    Voelkle, Manuel C; Ebner, Natalie C; Lindenberger, Ulman; Riediger, Michaela

    2014-01-01

    This article addresses four interrelated research questions: (1) Does experienced mood affect emotion perception in faces, and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20-31 years; middle-aged: 44-55 years; older adults: 70-81 years) were asked to provide multidimensional emotion ratings of a total of 1,026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle et al., 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when in a positive mood and less likely to do so when in a negative mood. This did not apply to younger adults. The temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  15. How do medium naturalness and personality traits shape academic achievement and perceived learning? An experimental study of face-to-face and synchronous e-learning

    Directory of Open Access Journals (Sweden)

    Ina Blau

    2017-07-01

    Full Text Available This controlled experiment examined how academic achievement and the cognitive, emotional and social aspects of perceived learning are affected by the level of medium naturalness (face-to-face, one-way and two-way videoconferencing) and by learners’ personality traits (extroversion–introversion and emotional stability–neuroticism). The Media Naturalness Theory explains the degree of medium naturalness by comparing its characteristics to face-to-face communication, considered to be the most natural form of communication. A total of 76 participants were randomly assigned to three experimental conditions: face-to-face, one-way and two-way videoconferencing. E-learning conditions were conducted through Zoom videoconferencing, which enables natural and spontaneous communication. Findings shed light on the trade-off involved in media naturalness: one-way videoconferencing, the less natural learning condition, enhanced the cognitive aspect of perceived learning but compromised the emotional and social aspects. Regarding the impact of personality, neurotic students tended to enjoy and succeed more in face-to-face learning, whereas emotionally stable students enjoyed and succeeded in all of the learning conditions. Extroverts tended to enjoy more natural learning environments but had lower achievement in those conditions. In accordance with the ‘poor get richer’ principle, introverts enjoyed environments with a low level of medium naturalness; however, they remained more focused and had higher achievement in face-to-face learning.

  16. A note on age differences in mood-congruent versus mood-incongruent emotion processing in faces

    Directory of Open Access Journals (Sweden)

    Manuel C. Voelkle

    2014-06-01

    Full Text Available This article addresses four interrelated research questions: (1) Does experienced mood affect emotion perception in faces and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20–31 years; middle-aged: 44–55 years; older adults: 70–81 years) were asked to provide multidimensional emotion ratings of a total of 1,026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle, Ebner, Lindenberger, & Riediger, 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when being in a positive mood and less likely to do so when being in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  17. Adult age-differences in subjective impression of emotional faces are reflected in emotion-related attention and memory tasks

    Directory of Open Access Journals (Sweden)

    Joakim Svärd

    2014-05-01

    Full Text Available Although younger and older adults appear to attend to and remember emotional faces differently, less is known about age-related differences in the subjective emotional impression (arousal, potency, and valence) of emotional faces and how these differences, in turn, are reflected in age differences in various emotional tasks. In the current study, we used the same facial emotional stimuli (angry and happy faces) in four tasks: emotional rating, attention, categorical perception, and visual short-term memory (VSTM). The aim of this study was to investigate effects of age on the subjective emotional impression of angry and happy faces and to examine whether any age differences were mirrored in measures of emotional behavior (attention, categorical perception, and memory). In addition, regression analyses were used to further study impression-behavior associations. Forty younger adults (range 20-30 years) and thirty-nine older adults (range 65-75 years) participated in the experiment. The emotional rating task showed that older adults perceived less arousal, potency, and valence than younger adults and that the difference was more pronounced for angry than happy faces. Similarly, the results of the attention and memory tasks demonstrated interaction effects between emotion and age, and age differences on these measures were larger for angry than for happy faces. Regression analyses confirmed that in both age groups, higher potency ratings predicted both visual search and visual short-term memory efficiency. Future studies should consider the possibility that age differences in the subjective emotional impression of facial emotional stimuli may explain age differences in attention to and memory of such stimuli.

  18. Lateralized hybrid faces: evidence of a valence-specific bias in the processing of implicit emotions.

    Science.gov (United States)

    Prete, Giulia; Laeng, Bruno; Tommasi, Luca

    2014-01-01

    It is well known that hemispheric asymmetries exist for both the analyses of low-level visual information (such as spatial frequency) and high-level visual information (such as emotional expressions). In this study, we assessed which of the above factors underlies perceptual laterality effects with "hybrid faces": a type of stimulus that allows testing for unaware processing of emotional expressions, when the emotion is displayed in the low-frequency information while an image of the same face with a neutral expression is superimposed to it. Despite hybrid faces being perceived as neutral, the emotional information modulates observers' social judgements. In the present study, participants were asked to assess friendliness of hybrid faces displayed tachistoscopically, either centrally or laterally to fixation. We found a clear influence of the hidden emotions also with lateral presentations. Happy faces were rated as more friendly and angry faces as less friendly with respect to neutral faces. In general, hybrid faces were evaluated as less friendly when they were presented in the left visual field/right hemisphere than in the right visual field/left hemisphere. The results extend the validity of the valence hypothesis in the specific domain of unaware (subcortical) emotion processing.

  19. Face-body integration of intense emotional expressions of victory and defeat.

    Directory of Open Access Journals (Sweden)

    Lili Wang

    Full Text Available Human facial expressions can be recognized rapidly and effortlessly. However, for intense emotions from real life, positive and negative facial expressions are difficult to discriminate, and the judgment of facial expressions is biased towards simultaneously perceived body expressions. This study employed event-related potentials (ERPs) to investigate the neural dynamics involved in the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds, with either matched or mismatched emotional expressions in faces and bodies. Behavioral results showed that congruent emotional information of face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent compared to congruent stimuli. Also, a main effect of body valence on the P1 was observed, with enhanced amplitudes for stimuli with losing compared to winning bodies. The main effect of body expression was also observed in the N170 and N2, with winning bodies producing larger N170/N2 amplitudes. In the later stage, a significant interaction of congruence by body valence was found on the P3 component. Winning bodies elicited larger P3 amplitudes than losing bodies when face and body conveyed congruent emotional signals. Going beyond what is known from prototypical facial and body expressions, the results of this study help us understand the complexity of emotion evaluation and categorization outside the laboratory.

  20. Method for Face-Emotion Retrieval Using A Cartoon Emotional Expression Approach

    Science.gov (United States)

    Kostov, Vlaho; Yanagisawa, Hideyoshi; Johansson, Martin; Fukuda, Shuichi

    A simple method for extracting emotion from a human face, as a form of non-verbal communication, was developed to cope with and optimize mobile communication in a globalized and diversified society. A cartoon face based model was developed and used to evaluate the emotional content of real faces. After a pilot survey, basic rules were defined and student subjects were asked to express emotion using the cartoon face. Their face samples were then analyzed using principal component analysis and the Mahalanobis distance method. Feature parameters considered to be related to emotions were extracted, and new cartoon faces based on these parameters were generated. The subjects evaluated the emotion of these cartoon faces again, and we confirmed that these parameters were suitable. To confirm how these parameters could be applied to real faces, we asked subjects to express the same emotions, which were then captured electronically. Simple image processing techniques were also developed to extract these features from real faces, and we then compared them with the cartoon face parameters. We demonstrate via the cartoon face that emotions can be expressed with very small amounts of information; as a result, real and cartoon faces correspond to each other. It is also shown that emotion could be extracted from still and dynamic real face images using these cartoon-based features.
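    As a rough sketch of the kind of analysis named above (principal component analysis followed by Mahalanobis-distance scoring), the following Python example uses invented feature vectors; it is not the authors' pipeline:

```python
# Project face feature vectors into a PCA space, then score a new sample
# against an emotion class with the Mahalanobis distance. Data are placeholders.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
happy_features = rng.normal(size=(50, 12))  # placeholder feature vectors (e.g., mouth/eyebrow parameters)
new_face = rng.normal(size=12)              # placeholder features for a face to classify

pca = PCA(n_components=4).fit(happy_features)
happy_pc = pca.transform(happy_features)
new_pc = pca.transform(new_face.reshape(1, -1))[0]

mean = happy_pc.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(happy_pc, rowvar=False))
diff = new_pc - mean
mahalanobis = np.sqrt(diff @ cov_inv @ diff)  # smaller = closer to the "happy" class
print(f"Mahalanobis distance to the 'happy' class: {mahalanobis:.2f}")
```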

  1. Young Adults with Autism Spectrum Disorder Show Early Atypical Neural Activity during Emotional Face Processing

    Directory of Open Access Journals (Sweden)

    Rachel C. Leung

    2018-02-01

    Full Text Available Social cognition is impaired in autism spectrum disorder (ASD). The ability to perceive and interpret affect is integral to successful social functioning and has an extended developmental course. However, the neural mechanisms underlying emotional face processing in ASD are unclear. Using magnetoencephalography (MEG), the present study explored neural activation during implicit emotional face processing in young adults with and without ASD. Twenty-six young adults with ASD and 26 healthy controls were recruited. Participants indicated the location of a scrambled pattern (target) that was presented alongside a happy or angry face. Emotion-related activation sources for each emotion were estimated using the Empirical Bayes Beamformer (pcorr ≤ 0.001) in Statistical Parametric Mapping 12 (SPM12). Emotional faces elicited elevated fusiform, amygdala and anterior insula activity and reduced anterior cingulate cortex (ACC) activity in adults with ASD relative to controls. Within-group comparisons revealed that angry vs. happy faces elicited distinct neural activity in typically developing adults; there was no such distinction in young adults with ASD. Our data suggest that difficulties in affect processing in ASD reflect atypical recruitment of traditional emotional processing areas. These early differences may contribute to difficulties in deriving social reward from faces, ascribing salience to faces, and an immature threat processing system, which collectively could result in deficits in emotional face processing.

  2. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    Directory of Open Access Journals (Sweden)

    Martin Wegrzyn

    Full Text Available Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.

  3. Mapping the emotional face. How individual face parts contribute to successful emotion recognition

    Science.gov (United States)

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921

  4. Neural correlates of top-down processing in emotion perception: an ERP study of emotional faces in white noise versus noise-alone stimuli.

    Science.gov (United States)

    Lee, Kyu-Yong; Lee, Tae-Ho; Yoon, So-Jeong; Cho, Yang Seok; Choi, June-Seek; Kim, Hyun Taek

    2010-06-14

    In the present study, we investigated the neural correlates underlying the perception of emotion in response to facial stimuli, in order to elucidate the extent to which emotional perception is affected by top-down processes. Subjects performed a forced two-choice emotion discrimination task on ambiguous visual stimuli consisting of emotional faces embedded in different levels of visual white noise, including white noise-alone stimuli. ERP recordings and behavioral responses were analyzed according to four response categories: hit, miss, false alarm and correct rejection. We observed enlarged EPN and LPP amplitudes when subjects reported seeing fearful faces, and a typical emotional EPN response in the white noise-alone conditions even though no fearful faces were presented. These two ERP components, whose modulation is characteristic of emotional processing, reflected the type of emotion each individual subjectively perceived. The results suggest that top-down modulation might be indispensable for emotional perception, which consists of two distinct stages of stimulus processing in the brain. (c) 2010 Elsevier B.V. All rights reserved.

  5. Emotional tears facilitate the recognition of sadness and the perceived need for social support.

    Science.gov (United States)

    Balsters, Martijn J H; Krahmer, Emiel J; Swerts, Marc G J; Vingerhoets, Ad J J M

    2013-02-12

    The tearing effect refers to the relevance of tears as an important visual cue adding meaning to human facial expression. However, little is known about how people process these visual cues and about their mediating role in emotion perception and person judgment. We therefore conducted two experiments in which we measured the influence of tears on the identification of sadness and on the perceived need for social support at an early perceptual level. In both experiments, participants were exposed to sad and neutral faces, with the face stimuli presented for 50 milliseconds. In Experiment 1, tears were digitally added to sad faces in one condition. Participants demonstrated significantly faster recognition of sad faces with tears compared to those without tears. In Experiment 2, tears were added to neutral faces as well. Participants had to indicate to what extent the displayed individuals were in need of social support. Participants reported a greater perceived need for social support for both sad and neutral faces with tears than for those without tears. This study thus demonstrates that emotional tears serve as important visual cues at an early (pre-attentive) level.

  6. Pitching Emotions: The Interpersonal Effects of Emotions in Professional Baseball

    Directory of Open Access Journals (Sweden)

    Arik Cheshin

    2016-02-01

    Full Text Available Sports games are inherently emotional situations, but surprisingly little is known about the social consequences of these emotions. We examined the interpersonal effects of emotional expressions in professional baseball. Specifically, we investigated whether pitchers' facial displays influence how pitches are assessed and responded to. Using footage from MLB World Series finals, we isolated incidents where the pitcher's face was visible before a pitch. A pre-study indicated that participants consistently perceived anger, happiness, and worry in pitchers' facial displays. An independent sample then predicted pitch characteristics and batter responses based on the same perceived emotional displays. Participants expected pitchers perceived as happy to throw more accurate balls, pitchers perceived as angry to throw faster and more difficult balls, and pitchers perceived as worried to throw slower and less accurate balls. Batters were expected to approach (swing) when faced with a pitcher perceived as happy and to avoid (no swing) when faced with a pitcher perceived as worried. Whereas previous research focused on using emotional expressions as information regarding past and current situations, our work suggests that people also use perceived emotional expressions to predict future behavior. Our results attest to the impact perceived emotional expressions can have on professional sports.

  7. Pitching Emotions: The Interpersonal Effects of Emotions in Professional Baseball.

    Science.gov (United States)

    Cheshin, Arik; Heerdink, Marc W; Kossakowski, Jolanda J; Van Kleef, Gerben A

    2016-01-01

    Sports games are inherently emotional situations, but surprisingly little is known about the social consequences of these emotions. We examined the interpersonal effects of emotional expressions in professional baseball. Specifically, we investigated whether pitchers' facial displays influence how pitches are assessed and responded to. Using footage from the Major League Baseball World Series finals, we isolated incidents where the pitcher's face was visible before a pitch. A pre-study indicated that participants consistently perceived anger, happiness, and worry in pitchers' facial displays. An independent sample then predicted pitch characteristics and batter responses based on the same perceived emotional displays. Participants expected pitchers perceived as happy to throw more accurate balls, pitchers perceived as angry to throw faster and more difficult balls, and pitchers perceived as worried to throw slower and less accurate balls. Batters were expected to approach (swing) when faced with a pitcher perceived as happy and to avoid (no swing) when faced with a pitcher perceived as worried. Whereas previous research focused on using emotional expressions as information regarding past and current situations, our work suggests that people also use perceived emotional expressions to predict future behavior. Our results attest to the impact perceived emotional expressions can have on professional sports.

  8. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    Science.gov (United States)

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621

  9. Interference among the Processing of Facial Emotion, Face Race, and Face Gender.

    Science.gov (United States)

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender).

  10. Perceiving Facial and Vocal Expressions of Emotion in Individuals with Williams Syndrome

    Science.gov (United States)

    Plesa-Skwerer, Daniela; Faja, Susan; Schofield, Casey; Verbalis, Alyssa; Tager-Flusberg, Helen

    2006-01-01

    People with Williams syndrome are extremely sociable, empathic, and expressive in communication. Some researchers suggest they may be especially sensitive to perceiving emotional expressions. We administered the Faces and Paralanguage subtests of the Diagnostic Analysis of Nonverbal Accuracy Scale (DANVA2), a standardized measure of emotion…

  11. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    Science.gov (United States)

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of the transient, intensely peaked expressions of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, which indicated the validity of the unconscious perception effect found in Experiment 2A.

  12. The effect of intranasal oxytocin on perceiving and understanding emotion on the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT).

    Science.gov (United States)

    Cardoso, Christopher; Ellenbogen, Mark A; Linnen, Anne-Marie

    2014-02-01

    Evidence suggests that intranasal oxytocin enhances the perception of emotion in facial expressions during standard emotion identification tasks. However, it is not clear whether this effect is desirable in people who do not show deficits in emotion perception. That is, a heightened perception of emotion in faces could lead to "oversensitivity" to the emotions of others in nonclinical participants. The goal of this study was to assess the effects of intranasal oxytocin on emotion perception using ecologically valid social and nonsocial visual tasks. Eighty-two participants (42 women) self-administered a 24 IU dose of intranasal oxytocin or a placebo in a double-blind, randomized experiment and then completed the perceiving and understanding emotion components of the Mayer-Salovey-Caruso Emotional Intelligence Test. In this test, emotion identification accuracy is based on agreement with a normative sample. As expected, participants administered intranasal oxytocin rated emotion in facial stimuli as expressing greater emotional intensity than those given a placebo. Consequently, accurate identification of emotion in faces, based on agreement with a normative sample, was impaired in the oxytocin group relative to placebo. No such effect was observed for tests using nonsocial stimuli. The results are consistent with the hypothesis that intranasal oxytocin enhances the salience of social stimuli in the environment, but not nonsocial stimuli. The present findings support a growing literature showing that the effects of intranasal oxytocin on social cognition can be negative under certain circumstances, in this case promoting "oversensitivity" to emotion in faces in healthy people. PsycINFO Database Record (c) 2014 APA, all rights reserved.
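    The MSCEIT scoring rule referred to above (accuracy as agreement with a normative sample) can be illustrated with a minimal consensus-scoring sketch in Python; the items, options, and proportions below are invented, not MSCEIT content:

```python
# Illustrative consensus scoring: each response is credited by the share of
# the normative sample that chose the same option.
norm_props = [
    {"low": 0.10, "moderate": 0.30, "high": 0.60},  # item 1: normative endorsement rates per option
    {"low": 0.70, "moderate": 0.20, "high": 0.10},  # item 2
]

def consensus_score(responses):
    """Mean normative agreement across items; higher = closer to the norm sample."""
    return sum(item[resp] for item, resp in zip(norm_props, responses)) / len(responses)

print(consensus_score(["high", "low"]))  # -> 0.65
```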

  13. Perceived face size in healthy adults.

    Science.gov (United States)

    D'Amour, Sarah; Harris, Laurence R

    2017-01-01

    Perceptual body size distortions have traditionally been studied using subjective, qualitative measures that assess only one type of body representation: the conscious body image. Previous research on perceived body size has typically focused on measuring distortions of the entire body and has tended to overlook the face. Here, we present a novel psychophysical method for determining perceived body size that taps into the implicit body representation. Using a two-alternative forced choice (2AFC) task, participants were sequentially shown two life-size images of their own face, viewed upright, upside down, or tilted 90°. In one interval, the width or length dimension was varied, while the other interval contained an undistorted image. Participants reported which image most closely matched their own face. An adaptive staircase adjusted the distorted image to home in on the distortion level at which the distorted and undistorted images were equally likely to be judged as matching the perceived face. When viewed upright or upside down, face width was overestimated and length underestimated, whereas perception was accurate for the on-side views. These results provide the first psychophysically robust measurements of how accurately healthy participants perceive the size of their face, revealing distortions of the implicit body representation that are independent of the conscious body image.
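    The adaptive staircase procedure described above can be illustrated with a minimal 1-up/1-down sketch in Python; the simulated observer, step size, and starting value are invented, not taken from the study:

```python
# 1-up/1-down adaptive staircase for a 2AFC task, converging on the distortion
# level at which the distorted and accurate images are chosen equally often.
import random

def get_response(distortion):
    """Placeholder observer: more likely to pick the accurate image as distortion grows."""
    return random.random() < 0.5 + 0.4 * min(abs(distortion) / 30.0, 1.0)

def run_staircase(start=20.0, step=2.0, n_trials=60):
    distortion, history = start, []
    for _ in range(n_trials):
        chose_accurate = get_response(distortion)
        # 1-up/1-down rule: converges on the 50% point (subjective equality)
        distortion += -step if chose_accurate else step
        history.append(distortion)
    return sum(history[-20:]) / 20  # estimate from the last 20 trials

print(f"Estimated distortion at subjective equality: {run_staircase():.1f}%")
```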

  14. Emotion elicitor or emotion messenger? Subliminal priming reveals two faces of facial expressions.

    Science.gov (United States)

    Ruys, Kirsten I; Stapel, Diederik A

    2008-06-01

    Facial emotional expressions can serve both as emotional stimuli and as communicative signals. The research reported here was conducted to illustrate how responses to both roles of facial emotional expressions unfold over time. As an emotion elicitor, a facial emotional expression (e.g., a disgusted face) activates a response that is similar to responses to other emotional stimuli of the same valence (e.g., a dirty, nonflushed toilet). As an emotion messenger, the same facial expression (e.g., a disgusted face) serves as a communicative signal by also activating the knowledge that the sender is experiencing a specific emotion (e.g., the sender feels disgusted). By varying the duration of exposure to disgusted, fearful, angry, and neutral faces in two subliminal-priming studies, we demonstrated that responses to faces as emotion elicitors occur prior to responses to faces as emotion messengers, and that both types of responses may unfold unconsciously.

  15. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits.

    Science.gov (United States)

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving-body point-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to recognize: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better by static faces and fear by PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, and (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  17. From specificity to sensitivity: affective states modulate visual working memory for emotional expressive faces.

    Science.gov (United States)

    Maran, Thomas; Sachse, Pierre; Furtner, Marco

    2015-01-01

    Previous findings suggest that visual working memory (VWM) preferentially remembers angry looking faces. However, the meaning of facial actions is construed in relation to context. To date, there are no studies investigating the role of perceiver-based context when processing emotional cues in VWM. To explore the influence of affective context on VWM for faces, we conducted two experiments using both a VWM task for emotionally expressive faces and a mood induction procedure. Affective context was manipulated by unpleasant (Experiment 1) and pleasant (Experiment 2) IAPS pictures in order to induce an affect high in motivational intensity (defensive or appetitive, respectively) compared to a low arousal control condition. Results indicated specifically increased sensitivity of VWM for angry looking faces in the neutral condition. Enhanced VWM for angry faces was prevented by inducing affects of high motivational intensity. In both experiments, affective states led to a switch from specific enhancement of angry expressions in VWM to an equally sensitive representation of all emotional expressions. Our findings demonstrate that emotional expressions are of different behavioral relevance for the receiver depending on the affective context, supporting a functional organization of VWM along with flexible resource allocation. In VWM, stimulus processing adjusts to situational requirements and transitions from a specifically prioritizing default mode in predictable environments to a sensitive, hypervigilant mode in exposure to emotional events.
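
    In change-detection-style working memory tasks, "sensitivity" of this kind is often indexed with d', computed from hit and false-alarm rates separately for each expression condition. The sketch below shows one standard way to do this on hypothetical counts; the actual measure and correction used in the study may differ.

      from scipy.stats import norm

      def d_prime(hits, misses, false_alarms, correct_rejections):
          """Sensitivity index d', with a log-linear correction so that
          rates of exactly 0 or 1 do not yield infinite z-scores."""
          hit_rate = (hits + 0.5) / (hits + misses + 1.0)
          fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
          return norm.ppf(hit_rate) - norm.ppf(fa_rate)

      # Hypothetical counts per expression condition: (hits, misses, FAs, CRs).
      conditions = {
          "angry":   (42, 18, 10, 50),
          "happy":   (35, 25, 12, 48),
          "neutral": (33, 27, 14, 46),
      }
      for label, counts in conditions.items():
          print(f"{label:8s} d' = {d_prime(*counts):.2f}")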

  18. Face and emotion expression processing and the serotonin transporter polymorphism 5-HTTLPR/rs25531.

    Science.gov (United States)

    Hildebrandt, A; Kiy, A; Reuter, M; Sommer, W; Wilhelm, O

    2016-06-01

    Face cognition, including face identity and facial expression processing, is a crucial component of socio-emotional abilities, characterizing humans as highest developed social beings. However, for these trait domains molecular genetic studies investigating gene-behavior associations based on well-founded phenotype definitions are still rare. We examined the relationship between 5-HTTLPR/rs25531 polymorphisms - related to serotonin-reuptake - and the ability to perceive and recognize faces and emotional expressions in human faces. For this aim we conducted structural equation modeling on data from 230 young adults, obtained by using a comprehensive, multivariate task battery with maximal effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphisms with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception. Carriers of two long (L) alleles outperformed carriers of one or two S alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating discriminant validity of the relationships. We discuss the implications and possible neural mechanisms underlying these novel findings. © 2016 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  19. Grounding Context in Face Processing: Color, Emotion and Gender

    Directory of Open Access Journals (Sweden)

    Sandrine eGil

    2015-03-01

    In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (versus green, mixed red/green and achromatic) background, known to be valenced, on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task of facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was completed by collecting subjective ratings for each colored background in the form of five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers, the effect was modulated by both the nature of the ambiguous emotion and the decoder's gender. Overall, our findings offer evidence that color and gender have a common valence-based dimension.

  20. Emotional Labor, Face and Guan xi

    Institute of Scientific and Technical Information of China (English)

    Tianwenling

    2017-01-01

    Emotional Labor, Face and Guan xi are all relevant to performance, appearance, and emotional feelings, which are essential elements in the workplace. In other words, not only front-line workers but all employees in an organization are faced with the three

  2. Happy faces are preferred regardless of familiarity--sad faces are preferred only when familiar.

    Science.gov (United States)

    Liao, Hsin-I; Shimojo, Shinsuke; Yeh, Su-Ling

    2013-06-01

    Familiarity leads to preference (e.g., the mere exposure effect), yet it remains unknown whether it is objective familiarity, that is, repetitive exposure, or subjective familiarity that contributes to preference. In addition, it is unexplored whether and how different emotions influence familiarity-related preference. The authors investigated whether happy or sad faces are preferred or perceived as more familiar and whether this subjective familiarity judgment correlates with preference for different emotional faces. An emotional face - happy or sad - was paired with a neutral face, and participants rated the relative preference and familiarity of each of the paired faces. For preference judgment, happy faces were preferred and sad faces were less preferred, compared with neutral faces. For familiarity judgment, happy faces did not show any bias, but sad faces were perceived as less familiar than neutral faces. Item-by-item correlational analyses showed that preference for sad faces - but not happy faces - positively correlates with familiarity. These results suggest a direct link between positive emotion and preference, and argue at least partly against a common cause for familiarity and preference. Instead, facial expression of different emotional valence modulates the link between familiarity and preference.

  3. Task-irrelevant emotion facilitates face discrimination learning.

    Science.gov (United States)

    Lorenzino, Martina; Caudek, Corrado

    2015-03-01

    We understand poorly how the ability to discriminate faces from one another is shaped by visual experience. The purpose of the present study is to determine whether face discrimination learning can be facilitated by facial emotions. To answer this question, we used a task-irrelevant perceptual learning paradigm because it closely mimics the learning processes that, in daily life, occur without a conscious intention to learn and without an attentional focus on specific facial features. We measured face discrimination thresholds before and after training. During the training phase (4 days), participants performed a contrast discrimination task on face images. They were not informed that we introduced (task-irrelevant) subtle variations in the face images from trial to trial. For the Identity group, the task-irrelevant features were variations along a morphing continuum of facial identity. For the Emotion group, the task-irrelevant features were variations along an emotional expression morphing continuum. The Control group did not undergo contrast discrimination learning and only performed the pre-training and post-training tests, with the same temporal gap between them as the other two groups. Results indicate that face discrimination improved, but only for the Emotion group. Participants in the Emotion group, moreover, showed face discrimination improvements also for stimulus variations along the facial identity dimension, even if these (task-irrelevant) stimulus features had not been presented during training. The present results highlight the importance of emotions for face discrimination learning. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Recognition of Face and Emotional Facial Expressions in Autism

    Directory of Open Access Journals (Sweden)

    Muhammed Tayyib Kadak

    2013-03-01

    Autism is a genetically transmitted neurodevelopmental disorder characterized by severe and persistent deficits in many areas of interpersonal relations, such as communication, social interaction and emotional responsiveness. Patients with autism show deficits in face recognition, eye contact and recognition of emotional expressions. Both face recognition and the recognition of facial emotion depend on face processing. Structural and functional impairments in the fusiform gyrus, amygdala, superior temporal sulcus and other brain regions lead to deficits in recognizing faces and facial emotion. Studies therefore suggest that face processing deficits result in problems in the areas of social interaction and emotion in autism. Studies have revealed that children with autism have problems recognizing facial expressions and rely on the mouth region more than the eye region. It has also been shown that autistic patients interpret ambiguous expressions as negative emotion. Deficits at various stages of face processing, such as gaze detection, face identity processing and recognition of emotional expressions, have been documented in autism. Social interaction impairments in autism spectrum disorders originate from face processing deficits arising during infancy, childhood and adolescence. Face recognition and the recognition of facial emotion could be shaped either automatically, by orienting towards faces after birth, or by "learning" processes across developmental periods, such as identity and emotion processing. This article reviews the neurobiological basis of face processing and the recognition of emotional facial expressions during normal development and in autism.

  5. Oxytocin and social pretreatment have similar effects on processing of negative emotional faces in healthy adult males

    Directory of Open Access Journals (Sweden)

    Anna eKis

    2013-08-01

    Oxytocin has been shown to affect several aspects of human social cognition, including facial emotion processing. There is also evidence that social stimuli (such as eye contact) can effectively modulate endogenous oxytocin levels. In the present study we directly tested whether intranasal oxytocin administration and pre-treatment with social stimuli had similar effects on face processing at the behavioural level. Subjects (N=52 healthy adult males) were presented with a set of faces with expressions of different valence (negative, neutral, positive) following different types of pretreatment (oxytocin - OT or placebo - PL, and social interaction - Soc or no social interaction - NSoc; N=13 in each) and were asked to rate all faces for perceived emotion and trustworthiness. On the next day, subjects' recognition memory was tested on a set of neutral faces, and they again rated each face for trustworthiness and emotion. Subjects in both the OT and the Soc pretreatment groups (as compared to the PL and NSoc groups) gave higher emotion and trustworthiness scores for faces with negative emotional expression. Moreover, 24 h later, subjects in the OT and Soc groups (unlike in the control groups) gave lower trustworthiness scores for previously negative faces than for faces previously seen as emotionally neutral or positive. In sum, these results provide the first direct evidence of the similar effects of intranasal oxytocin administration and social stimulation on the perception of negative facial emotions as well as on the delayed recall of negative emotional information.

  6. Emotionally anesthetized: media violence induces neural changes during emotional face processing.

    Science.gov (United States)

    Stockdale, Laura A; Morrison, Robert G; Kmiecik, Matthew J; Garbarino, James; Silton, Rebecca L

    2015-10-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others' emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby require fewer cognitive resources to inhibit behavior. © The Author (2015). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
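
    Component effects like these are typically quantified as the mean amplitude of the event-related potential within a fixed time window after stimulus onset, compared across film conditions. The sketch below shows that step on a simulated single-electrode epochs array; the window boundaries, sampling rate, and data are assumptions for illustration rather than the parameters used in the study.

      import numpy as np

      rng = np.random.default_rng(0)
      sfreq = 250                                   # assumed sampling rate (Hz)
      times = np.arange(-0.2, 0.8, 1 / sfreq)       # epoch from -200 to 800 ms

      # Simulated epoched EEG for one electrode: (n_trials, n_times) per condition.
      epochs = {"violent": rng.normal(0, 5, (60, times.size)),
                "nonviolent": rng.normal(0, 5, (60, times.size))}

      # Approximate component windows in seconds (illustrative only).
      windows = {"P100": (0.08, 0.12), "N170": (0.13, 0.20), "P200": (0.18, 0.25)}

      def mean_amplitude(data, times, tmin, tmax):
          """Average voltage across trials and samples within a time window."""
          mask = (times >= tmin) & (times <= tmax)
          return data[:, mask].mean()

      for component, (tmin, tmax) in windows.items():
          amps = {cond: round(mean_amplitude(d, times, tmin, tmax), 2)
                  for cond, d in epochs.items()}
          print(component, amps)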

  7. Perceived antecedents of emotional reactions in inter-ethnic relations.

    Science.gov (United States)

    Dijker, A J; Koomen, W; van den Heuvel, H; Frijda, N H

    1996-06-01

    It is argued that the aspects of intergroup relations that potentially can arouse emotions in the perceiver are likely to become central and motivationally relevant elements of group stereotypes. Asking participants to report on the perceived antecedents of their emotional reactions to in-group and out-group members should therefore be an especially useful method to reveal the content of stereotypes. Native Dutch participants reported both the frequencies with which different emotions were felt in different intergroup relations and the perceived causes of these emotions. Analysis of self-reported antecedents of emotional reactions revealed that (a) despite a general in-group favourability bias, both the in-group and the two out-groups employed arouse different kinds of negative and positive emotions; and (b) differences in emotional reactions to the two out-groups are related to salient differences in perceived antecedents between these groups. Theoretical and practical implications of the present emphasis on the cognitive foundation of emotion in intergroup relations are discussed.

  8. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    OpenAIRE

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People are able to simultaneously process multiple dimensions of facial properties. Facial processing models are based on the processing of facial properties. This paper examined the processing of facial emotion, face race and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interfered with face race in all the tasks. The interaction of face race and face gend...

  10. Identification of emotions in mixed disgusted-happy faces as a function of depressive symptom severity.

    Science.gov (United States)

    Sanchez, Alvaro; Romero, Nuria; Maurage, Pierre; De Raedt, Rudi

    2017-12-01

    Interpersonal difficulties are common in depression, but their underlying mechanisms are not yet fully understood. The role of depression in the identification of mixed emotional signals with a direct interpersonal value remains unclear. The present study aimed to clarify this question. A sample of 39 individuals reporting a broad range of depression levels completed an emotion identification task where they viewed faces expressing three emotional categories (100% disgusted and 100% happy faces, as well as their morphed 50% disgusted - 50% happy exemplars). Participants were asked to identify the corresponding depicted emotion as "clearly disgusted", "mixed", or "clearly happy". Higher depression levels were associated with lower identification of positive emotions in 50% disgusted - 50% happy faces. The study was conducted with an analogue sample reporting individual differences in subclinical depression levels. Further research must replicate these findings in a clinical sample and clarify whether differential emotional identification patterns emerge in depression for different mixed negative-positive emotions (sad-happy vs. disgusted-happy). Depression may account for a lower bias to perceive positive states when ambiguous states from others include subtle signals of social threat (i.e., disgust), leading to an under-perception of positive social signals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. When a face type is perceived as threatening: Using general recognition theory to understand biased categorization of Afrocentric faces.

    Science.gov (United States)

    Kleider-Offutt, Heather M; Bond, Alesha D; Williams, Sarah E; Bohil, Corey J

    2018-03-07

    Prior research indicates that stereotypical Black faces (e.g., wide nose, full lips: Afrocentric) are often associated with crime and violence. The current study investigated whether stereotypical faces may bias the interpretation of facial expression to seem threatening. Stimuli were prerated by face type (stereotypical, nonstereotypical) and expression (neutral, threatening). Later in a forced-choice task, different participants categorized face stimuli as stereotypical or not and threatening or not. Regardless of prerated expression, stereotypical faces were judged as more threatening than were nonstereotypical faces. These findings were supported using computational models based on general recognition theory (GRT), indicating that decision boundaries were more biased toward the threatening response for stereotypical faces than for nonstereotypical faces. GRT analysis also indicated that perception of face stereotypicality and emotional expression are dependent, both across categories and within individual categories. Higher perceived stereotypicality predicts higher perception of threat, and, conversely, higher ratings of threat predict higher perception of stereotypicality. Implications for racial face-type bias influencing perception and decision-making in a variety of social and professional contexts are discussed.
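
    The decision-bound result can be pictured with a stripped-down, one-dimensional version of the GRT idea: even if two face types generate identical distributions of perceived threat, placing the "threatening" criterion lower (more liberal) for one type produces more threat responses for it. The simulation below is only a schematic of that bias with invented parameters; a full GRT analysis fits two-dimensional perceptual distributions and decision bounds to the categorization data, which is not attempted here.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 20_000

      # Identical threat-percept distributions for both face types, so any
      # difference in responding must come from the decision bound.
      threat_percepts = {
          "stereotypical": rng.normal(0.0, 1.0, n),
          "nonstereotypical": rng.normal(0.0, 1.0, n),
      }

      # Hypothetical criteria: a bound shifted toward "threatening" (lower)
      # for stereotypical faces mimics the reported response bias.
      criteria = {"stereotypical": -0.3, "nonstereotypical": +0.3}

      for face_type, percepts in threat_percepts.items():
          p_threat = (percepts > criteria[face_type]).mean()
          print(f"{face_type:17s} P('threatening') = {p_threat:.2f}")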

  12. When does subliminal affective image priming influence the ability of schizophrenic patients to perceive face emotions?

    Science.gov (United States)

    Vaina, Lucia Maria; Rana, Kunjan D; Cotos, Ionela; Li-Yang, Chen; Huang, Melissa A; Podea, Delia

    2014-12-24

    Deficits in face emotion perception are among the most pervasive aspects of schizophrenia impairments, which strongly affect interpersonal communication and social skills. Schizophrenic patients (PSZ) and healthy control subjects (HCS) performed 2 psychophysical tasks. One, the SAFFIMAP test, was designed to determine the impact of subliminally presented affective or neutral images on the accuracy of face-expression (angry or neutral) perception. In the second test, FEP, subjects saw pictures of facial expressions and were asked to rate them as angry, happy, or neutral. The following clinical scales were used to determine the acute symptoms in PSZ: Positive and Negative Syndrome (PANSS), Young Mania Rating (YMRS), Hamilton Depression (HAM-D), and Hamilton Anxiety (HAM-A). On the SAFFIMAP test, different from the HCS group, the PSZ group tended to categorize the neutral expression of test faces as angry and their response to the test-face expression was not influenced by the affective content of the primes. In PSZ, the PANSS-positive score was significantly correlated with correct perception of angry faces for aggressive or pleasant primes. YMRS scores were strongly correlated with PSZ's tendency to recognize angry face expressions when the prime was a pleasant or a neutral image. The HAM-D score was positively correlated with categorizing the test-faces as neutral, regardless of the affective content of the prime or of the test-face expression (angry or neutral). Despite its exploratory nature, this study provides the first evidence that conscious perception and categorization of facial emotions (neutral or angry) in PSZ are directly affected by the positive or negative symptoms of the disease as defined by their individual scores on the clinical diagnostic scales.

  13. Use of context in emotion perception: The role of top-down control, cue type, and perceiver's age.

    Science.gov (United States)

    Ngo, Nhi; Isaacowitz, Derek M

    2015-06-01

    Although context is crucial to emotion perception, there are various factors that can modulate contextual influence. The current research investigated how cue type, top-down control, and the perceiver's age influence attention to context in facial emotion perception. In 2 experiments, younger and older adults identified facial expressions contextualized by other faces, isolated objects, and scenes. In the first experiment, participants were instructed to ignore face, object, and scene contexts. Face context was found to influence perception the least, whereas scene context produced the most contextual effect. Older adults were more influenced by context than younger adults, but both age groups were similarly influenced by different types of contextual cues, even when they were instructed to ignore the context. In the second experiment, when explicitly instructed that the context had no meaningful relationship to the target, younger and older adults both were less influenced by context than when they were instructed that the context was relevant to the target. Results from both studies indicate that contextual influence on emotion perception is not constant, but can vary based on the type of contextual cue, cue relevance, and the perceiver's age. (c) 2015 APA, all rights reserved).

  14. Enhanced amygdala reactivity to emotional faces in adults reporting childhood emotional maltreatment

    Science.gov (United States)

    van Tol, Marie-José; Demenescu, Liliana R.; van der Wee, Nic J. A.; Veltman, Dick J.; Aleman, André; van Buchem, Mark A.; Spinhoven, Philip; Penninx, Brenda W. J. H.; Elzinga, Bernet M.

    2013-01-01

    In the context of chronic childhood emotional maltreatment (CEM; emotional abuse and/or neglect), adequately responding to facial expressions is an important skill. Over time, however, this adaptive response may lead to a persistent vigilance for emotional facial expressions. The amygdala and the medial prefrontal cortex (mPFC) are key regions in face processing. However, the neurobiological correlates of face processing in adults reporting CEM are yet unknown. We examined amygdala and mPFC reactivity to emotional faces (Angry, Fearful, Sad, Happy, Neutral) vs scrambled faces in healthy controls and unmedicated patients with depression and/or anxiety disorders reporting CEM before the age of 16 years (n = 60), and controls and patients who report no childhood abuse (n = 75). We found that CEM was associated with enhanced bilateral amygdala reactivity to emotional faces in general, and independent of psychiatric status. Furthermore, we found no support for differential mPFC functioning, suggesting that amygdala hyper-responsivity to emotional facial perception in adults reporting CEM may be independent from top–down influences of the mPFC. These findings may be key in understanding the increased emotional sensitivity and interpersonal difficulties that have been reported in individuals with a history of CEM. PMID:22258799

  15. Task-dependent neural bases of perceiving emotionally expressive targets

    Directory of Open Access Journals (Sweden)

    Jamil eZaki

    2012-08-01

    Social cognition is fundamentally interpersonal: individuals' behavior and dispositions critically affect their interaction partners' information processing. However, cognitive neuroscience studies, partially because of methodological constraints, have remained largely perceiver-centric: focusing on the abilities, motivations, and goals of social perceivers while largely ignoring interpersonal effects. Here, we address this knowledge gap by examining the neural bases of perceiving emotionally expressive and inexpressive social targets. Sixteen perceivers were scanned using fMRI while they watched targets discussing emotional autobiographical events. Perceivers continuously rated each target's emotional state or eye-gaze direction. The effects of targets' emotional expressivity on perceivers' brain activity depended on task set: when perceivers explicitly attended to targets' emotions, expressivity predicted activity in neural structures (including medial prefrontal and posterior cingulate cortex) associated with drawing inferences about mental states. When perceivers instead attended to targets' eye-gaze, target expressivity predicted activity in regions (including somatosensory cortex, fusiform gyrus, and motor cortex) associated with monitoring sensorimotor states and biological motion. These findings suggest that expressive targets affect information processing in a manner that depends on perceivers' goals. More broadly, these data provide an early step towards understanding the neural bases of interpersonal social cognition.

  16. How Do Humans Perceive Emotion?

    Institute of Scientific and Technical Information of China (English)

    LI Wen

    2017-01-01

    Emotion carries crucial qualities of the human condition, representing one of the major challenges in artificial intelligence. Research in psychology and neuroscience in the past two to three decades has generated rich insights into the processes underlying human emotion. Cognition and emotion represent the two main pillars of the human psyche and human intelligence. While the human cognitive system and cognitive brain have inspired and informed computer science and artificial intelligence, the future is ripe for the human emotion system to be integrated into artificial intelligence and robotic systems. Here, we review behavioral and neural findings in human emotion perception, including facial emotion perception, olfactory emotion perception, multimodal emotion perception, and the time course of emotion perception. It is our hope that knowledge of how humans perceive emotion will help bring artificial intelligence strides closer to human intelligence.

  17. Perceived utility of emotion: the structure and construct validity of the Perceived Affect Utility Scale in a cross-ethnic sample.

    Science.gov (United States)

    Chow, Philip I; Berenbaum, Howard

    2012-01-01

    This study introduces a new measure of the perceived utility of emotion, which is the degree to which emotions are perceived to be useful in achieving goals. In this study, we administered this new measure, the Perceived Affect Utility Scale (PAUSe), to a sample of 142 European American and 156 East Asian American college students. Confirmatory factor analyses provided support for a new, culturally informed parsing of emotion and for perceived utility of emotion to be distinguishable from ideal affect, a related but separate construct. Next, we explored the potential importance of perceived utility of emotion in cultural research. Through path analyses, we found that: (a) culturally relevant variables (e.g., independence) played a mediating role in the link between ethnic group and perceived utility of emotion; and (b) perceived utility of emotion played a mediating role in the link between culturally relevant variables and ideal affect. In particular, perceived utility of self-centered emotions (e.g., pride) was found to be associated with independence and ideal affect of those same emotions. In contrast, perceived utility of other-centered emotions (e.g., appreciation) was found to be associated with interdependence, dutifulness/self-discipline, and ideal affect of those same emotions. Implications for perceived utility of emotion in understanding cultural factors are discussed.

  19. Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    Science.gov (United States)

    Rigoulot, Simon; Pell, Marc D.

    2012-01-01

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions. PMID:22303454
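
    Analyses of this kind reduce to tallying the number and total duration of fixations on each face, split by whether the face matches the prosody and by time window. The pandas sketch below shows that aggregation on a tiny, made-up fixation table; the column names and values are illustrative assumptions rather than the authors' data.

      import pandas as pd

      # Hypothetical fixation table: one row per fixation on a face in the array.
      fixations = pd.DataFrame({
          "trial":        [1, 1, 1, 2, 2, 2],
          "onset_ms":     [300, 1400, 2700, 200, 1600, 2600],
          "duration_ms":  [250, 400, 380, 220, 310, 500],
          "face_matches_prosody": [True, True, False, False, True, True],
      })

      # Assign each fixation to one of the three analysis windows.
      bins = [0, 1250, 2500, 5000]
      labels = ["0-1250 ms", "1250-2500 ms", "2500-5000 ms"]
      fixations["window"] = pd.cut(fixations["onset_ms"], bins=bins, labels=labels)

      # Frequency and total duration of looks by window and prosody congruence.
      summary = (fixations
                 .groupby(["window", "face_matches_prosody"], observed=True)["duration_ms"]
                 .agg(n_looks="count", total_duration="sum"))
      print(summary)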

  20. Alcoholism and dampened temporal limbic activation to emotional faces.

    Science.gov (United States)

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O'Reilly, Cara E; Howard, Julie A; Sawyer, Kayle; Harris, Gordon J

    2009-11-01

    Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be a part of the wide array of behavioral problems including disinhibition, concurring with previously documented interpersonal difficulties in this population. Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally or socially challenging situations.

  1. Emotion recognition training using composite faces generalises across identities but not all emotions.

    Science.gov (United States)

    Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S

    2017-08-01

    Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where the same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities. However, our results indicated only partial generalisation across emotions. These findings suggest effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.

  2. Psychosocial effects of perceived emotional synchrony in collective gatherings.

    Science.gov (United States)

    Páez, Dario; Rimé, Bernard; Basabe, Nekane; Wlodarczyk, Anna; Zumeta, Larraitz

    2015-05-01

    In a classic theory, Durkheim (1912) predicted that because of the social sharing of emotion they generate, collective gatherings bring participants to a stage of collective effervescence in which they experience a sense of union with others and a feeling of empowerment accompanied by positive affect. This would lead them to leave the collective situation with a renewed sense of confidence in life and in social institutions. A century after Durkheim's predictions of these effects, though, they remained untested as a whole. This article reports 4 studies, 2 correlational, 1 semilongitudinal, and 1 experimental, assessing the positive effects of participation in either positively valenced (folkloric marches) or negatively valenced (protest demonstrations) collective gatherings. Results confirmed that collective gatherings consistently strengthened collective identity, identity fusion, and social integration, as well as enhancing personal and collective self-esteem and efficacy, positive affect, and positive social beliefs among participants. In line with a central tenet of the theory, emotional communion, or perceived emotional synchrony with others mediated these effects. Higher perceived emotional synchrony was associated with stronger emotional reactions, stronger social support, and higher endorsement of social beliefs and values. Participation in symbolic collective gatherings also particularly reinforced identity fusion when perceived emotional synchrony was high. The respective contributions of perceived emotional synchrony and flow, or optimal experience, were also assessed. Whereas perceived emotional synchrony emerged as strongly related to the various social outcomes, flow was observed to be related first to collective efficacy and self-esteem, and thus, to encompass mainly empowerment effects. (c) 2015 APA, all rights reserved).

  3. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    Science.gov (United States)

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

    Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  4. 5-HTTLPR differentially predicts brain network responses to emotional faces

    DEFF Research Database (Denmark)

    Fisher, Patrick M; Grady, Cheryl L; Madsen, Martin K

    2015-01-01

    The effects of the 5-HTTLPR polymorphism on neural responses to emotionally salient faces have been studied extensively, focusing on amygdala reactivity and amygdala-prefrontal interactions. Despite compelling evidence that emotional face paradigms engage a distributed network of brain regions...... to fearful faces was significantly greater in S' carriers compared to LA LA individuals. These findings provide novel evidence for emotion-specific 5-HTTLPR effects on the response of a distributed set of brain regions including areas responsive to emotionally salient stimuli and critical components...... involved in emotion, cognitive and visual processing, less is known about 5-HTTLPR effects on broader network responses. To address this, we evaluated 5-HTTLPR differences in the whole-brain response to an emotional faces paradigm including neutral, angry and fearful faces using functional magnetic...

  5. The relationship between perceived family atmosphere and emotional stability of adolescents

    OpenAIRE

    KRÁČMAROVÁ, Martina

    2015-01-01

    The bachelor thesis called "The relationship between perceived family atmosphere and emotional stability of adolescents" deals with adolescence, especially in relation to the perceived family atmosphere and the emotional stability of the adolescent. The main goal of my work is to describe how the perceived family atmosphere and events in the family influence the emotions of adolescents. First, I clarify the specifics of the adolescent period, the importance of relationships, family atmosphere and emotional a...

  6. The Effect of Self-Referential Expectation on Emotional Face Processing.

    Directory of Open Access Journals (Sweden)

    Mel McKendrick

    The role of self-relevance has been somewhat neglected in static face processing paradigms but may be important in understanding how emotional faces impact on attention, cognition and affect. The aim of the current study was to investigate the effect of self-relevant primes on processing emotional composite faces. Sentence primes created an expectation of the emotion of the face before sad, happy, neutral or composite face photos were viewed. Eye movements were recorded and subsequent responses measured the cognitive and affective impact of the emotion expressed. Results indicated that primes did not guide attention, but impacted on judgments of valence intensity and self-esteem ratings. Negative self-relevant primes led to the most negative self-esteem ratings, although the effect of the prime was qualified by salient facial features. Self-relevant expectations about the emotion of a face and subsequent attention to a face that is congruent with these expectations strengthened the affective impact of viewing the face.

  8. Women's greater ability to perceive happy facial emotion automatically: gender differences in affective priming.

    Science.gov (United States)

    Donges, Uta-Susan; Kersting, Anette; Suslow, Thomas

    2012-01-01

    There is evidence that women are better in recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It was observed that masked emotional facial expression has an affect congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expression. In our priming experiment sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces which had to be evaluated. 81 young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expression elicited valence congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability to perceive and respond to positive facial emotion at an automatic processing level compared to men. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.
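
    The size of an affective priming effect in a design like this is usually expressed as the difference in mean evaluations of the masked neutral targets after emotional versus neutral (or absent) primes, compared between groups. The sketch below computes that contrast for simulated ratings; the numbers, column names, and group difference are invented for illustration.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(2)
      n = 40   # trials per prime condition and gender (hypothetical)

      data = pd.DataFrame({
          "gender": np.repeat(["women", "men"], n * 2),
          "prime":  np.tile(np.repeat(["happy", "neutral"], n), 2),
          "rating": np.concatenate([
              rng.normal(5.6, 1.0, n), rng.normal(5.0, 1.0, n),   # women
              rng.normal(5.2, 1.0, n), rng.normal(5.0, 1.0, n),   # men
          ]),
      })

      means = data.groupby(["gender", "prime"])["rating"].mean().unstack()
      means["priming_effect"] = means["happy"] - means["neutral"]
      print(means.round(2))   # larger effect = stronger affective priming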

  9. Emotional Faces Capture Spatial Attention in 5-Year-Old Children

    Directory of Open Access Journals (Sweden)

    Kit K. Elam

    2010-10-01

    Emotional facial expressions are important social cues that convey salient affective information. Infants, younger children, and adults all appear to orient spatial attention to emotional faces with a particularly strong bias to fearful faces. Yet in young children it is unclear whether or not both happy and fearful faces capture attention. Given that the processing of emotional faces is believed by some to serve an evolutionarily adaptive purpose, attentional biases to both fearful and happy expressions would be expected in younger children. However, the extent to which this ability is present in young children and whether or not this ability is genetically mediated is untested. Therefore, the aims of the current study were to assess the spatial-attentional properties of emotional faces in young children, with a preliminary test of whether this effect was influenced by genetics. Five-year-old twin pairs performed a dot-probe task. The results suggest that children preferentially direct spatial attention to emotional faces, particularly right visual field faces. The results provide support for the notion that the direction of spatial attention to emotional faces serves an evolutionarily adaptive function and may be mediated by genetic mechanisms.
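
    In dot-probe paradigms, the attentional bias toward an emotional face is conventionally scored as the mean reaction time when the probe replaces the neutral face minus the mean reaction time when it replaces the emotional face, with positive values indicating capture by the emotional face. The sketch below computes such bias scores on simulated reaction times; all values are illustrative, not data from the study.

      import numpy as np

      rng = np.random.default_rng(3)

      def bias_score(rt_probe_at_neutral, rt_probe_at_emotional):
          """Dot-probe bias: slower responses when the probe appears at the
          neutral face's location imply attention was held by the emotional face."""
          return np.mean(rt_probe_at_neutral) - np.mean(rt_probe_at_emotional)

      # Hypothetical reaction times (ms) for one child, per expression condition.
      for emotion, shift in [("fearful", 25), ("happy", 15)]:
          rt_emotional = rng.normal(520, 60, 48)
          rt_neutral = rng.normal(520 + shift, 60, 48)
          print(f"{emotion:8s} bias = {bias_score(rt_neutral, rt_emotional):+.1f} ms")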

  10. An Adult Developmental Approach to Perceived Facial Attractiveness and Distinctiveness

    Directory of Open Access Journals (Sweden)

    Natalie C. Ebner

    2018-05-01

    Attractiveness and distinctiveness constitute facial features with high biological and social relevance. Bringing a developmental perspective to research on social-cognitive face perception, we used a large set of faces taken from the FACES Lifespan Database to examine effects of face and perceiver characteristics on subjective evaluations of attractiveness and distinctiveness in young (20–31 years), middle-aged (44–55 years), and older (70–81 years) men and women. We report novel findings supporting variations by face and perceiver age, in interaction with gender and emotion: although older and middle-aged compared to young perceivers generally rated faces of all ages as more attractive, young perceivers gave relatively higher attractiveness ratings to young compared to middle-aged and older faces. Controlling for variations in attractiveness, older compared to young faces were viewed as more distinctive by young and middle-aged perceivers. Age affected attractiveness more negatively for female than male faces. Furthermore, happy faces were rated as most attractive, while disgusted faces were rated as least attractive, particularly so by middle-aged and older perceivers and for young and female faces. Perceivers largely agreed on distinctiveness ratings for neutral and happy emotions, but older and middle-aged compared to young perceivers rated faces displaying negative emotions as more distinctive. These findings underscore the importance of a lifespan perspective on perception of facial characteristics and suggest possible effects of age on goal-directed perception, social motivation, and in-group bias. This publication makes available picture-specific normative data for experimental stimulus selection.

  11. Facial emotion recognition, face scan paths, and face perception in children with neurofibromatosis type 1.

    Science.gov (United States)

    Lewis, Amelia K; Porter, Melanie A; Williams, Tracey A; Bzishvili, Samantha; North, Kathryn N; Payne, Jonathan M

    2017-05-01

    This study aimed to investigate face scan paths and face perception abilities in children with Neurofibromatosis Type 1 (NF1) and how these might relate to emotion recognition abilities in this population. The authors investigated facial emotion recognition, face scan paths, and face perception in 29 children with NF1 compared to 29 chronological age-matched typically developing controls. Correlations between facial emotion recognition, face scan paths, and face perception in children with NF1 were examined. Children with NF1 displayed significantly poorer recognition of fearful expressions compared to controls, as well as a nonsignificant trend toward poorer recognition of anger. Although there was no significant difference between groups in time spent viewing individual core facial features (eyes, nose, mouth, and nonfeature regions), children with NF1 spent significantly less time than controls viewing the face as a whole. Children with NF1 also displayed significantly poorer face perception abilities than typically developing controls. Facial emotion recognition deficits were not significantly associated with aberrant face scan paths or face perception abilities in the NF1 group. These results suggest that impairments in the perception, identification, and interpretation of information from faces are important aspects of the social-cognitive phenotype of NF1. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  12. Perceived emotion suppression and culture: Effects on psychological well-being.

    Science.gov (United States)

    Kwon, Heewon; Kim, Young-Hoon

    2018-04-03

    Whether the negative effects of emotion suppression on psychological well-being are applicable cross-culturally is a long-debated topic. The present study attempted to shed light on this debate, focusing on the effects of perceived emotion suppression and examining the psychological processes leading from perceived emotion suppression to lower psychological well-being. We used a scale manipulation to lead 196 American and 213 Chinese participants to perceive themselves as having suppressed their emotions to a greater or lesser extent and then measured their life satisfaction. As expected, both the American and Chinese participants reported lower life satisfaction in the high-suppression condition than in the low-suppression condition; this negative effect was mediated by positive affect and moderated by self-esteem. Specifically, perceived high emotion suppression decreased positive affect, which in turn led to lower well-being. This effect was observed only for those with low self-esteem, but the patterns and mechanisms were consistent cross-culturally. © 2018 International Union of Psychological Science.

  13. Perceived Emotional Intelligence, Self-Esteem and Life Satisfaction in Adolescents

    Directory of Open Access Journals (Sweden)

    Lourdes Rey

    2011-07-01

    Full Text Available The present study examined the relationship between perceived emotional intelligence, self-esteem and life satisfaction in a sample of 316 Spanish adolescents (179 females and 137 males), ranging in age from 14 to 18. Demographic information was collected, along with data through the use of three self-report measures: the Trait Meta-Mood Scale, the Rosenberg Self-Esteem Scale and the Satisfaction with Life Scale. As expected, perceived emotional dimensions, particularly mood clarity and repair, showed positive associations with life satisfaction. Self-esteem also correlated significantly and positively with levels of adolescents' satisfaction with life. More interestingly, results of structural equation modelling indicated that mood clarity and emotional repair had a significant direct and indirect link (via self-esteem) with life satisfaction in adolescents. The present study contributes to an emerging understanding of the processes underlying the link between perceived emotional intelligence and life satisfaction. Our findings encourage moving beyond the examination of the direct association between perceived emotional intelligence and life satisfaction and focusing on the role of potential mechanisms, such as self-esteem, involved in this link in adolescents. Implications of the present findings for future research are discussed, as well as potential interventions for increasing subjective well-being in adolescents.
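    An indirect ("via self-esteem") link of the kind reported above is commonly estimated by combining two regressions and bootstrapping the product of the paths. The sketch below is only a minimal illustration under assumed variable names (clarity, self_esteem, life_sat) and simulated data; it is not the authors' structural equation model.

        # Minimal bootstrap sketch of an indirect (mediation) effect,
        # e.g., mood clarity -> self-esteem -> life satisfaction.
        # All data and variable names here are illustrative assumptions.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 316  # sample size reported in the abstract
        clarity = rng.normal(size=n)
        self_esteem = 0.5 * clarity + rng.normal(size=n)
        life_sat = 0.4 * self_esteem + 0.2 * clarity + rng.normal(size=n)
        df = pd.DataFrame({"clarity": clarity,
                           "self_esteem": self_esteem,
                           "life_sat": life_sat})

        def indirect_effect(d: pd.DataFrame) -> float:
            # Path a: predictor -> mediator
            a = sm.OLS(d["self_esteem"], sm.add_constant(d["clarity"])).fit().params["clarity"]
            # Path b: mediator -> outcome, controlling for the predictor
            b = sm.OLS(d["life_sat"], sm.add_constant(d[["clarity", "self_esteem"]])).fit().params["self_esteem"]
            return a * b

        boot = [indirect_effect(df.sample(n, replace=True)) for _ in range(2000)]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"indirect effect = {indirect_effect(df):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")

    A percentile confidence interval that excludes zero is the usual criterion for a significant indirect effect in this kind of analysis.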

  14. Men appear more lateralized when noticing emotion in male faces.

    Science.gov (United States)

    Rahman, Qazi; Anchassi, Tarek

    2012-02-01

    Empirical tests of the "right hemisphere dominance" versus "valence" theories of emotion processing are confounded by known sex differences in lateralization. Moreover, information about the sex of the person posing an emotion might be processed differently by men and women because of an adaptive male bias to notice expressions of threat and vigilance in other male faces. The purpose of this study was to investigate whether sex of poser and emotion displayed influenced lateralization in men and women by analyzing "laterality quotient" scores on a test which depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression. We found that men (N = 50) were significantly more lateralized for emotions indicative of vigilance and threat (happy, sad, angry, and surprised) in male faces relative to female faces and compared to women (N = 44). These data indicate that sex differences in functional cerebral lateralization for facial emotion may be specific to the emotion presented and the sex of face presenting it. PsycINFO Database Record (c) 2012 APA, all rights reserved

  15. PERCEIVED EMOTIONAL INTELLIGENCE AS A MODERATOR VARIABLE BETWEEN CYBERVICTIMIZATION AND ITS EMOTIONAL IMPACT

    Directory of Open Access Journals (Sweden)

    Paz Elipe

    2015-04-01

    Full Text Available The negative effects of traditional bullying and, more recently, cyberbullying on victims are well documented, and abundant empirical evidence exists for them. Cybervictimization affects areas such as academic performance, social integration and self-esteem, and causes emotions ranging from anger and sadness to more complex problems such as depression. However, not all victims are equally affected, and the differences seem to be due to certain situational and personal characteristics. The objective of this study is to analyze the relationship between perceived emotional intelligence and the emotional impact of cybervictimization. We hypothesize that emotional intelligence, which has previously been found to play a role in traditional bullying and cyberbullying, may also affect the emotional impact of cyberbullying. The participants in our study were 636 university students from two universities in the south of Spain. Three self-report questionnaires were used: the European Cyberbullying Intervention Project Questionnaire, the Cyberbullying Emotional Impact Scale, and the Trait Meta-Mood Scale-24. Structural Equation Models were used to test the relationships between the analyzed variables. The results support the idea that perceived emotional intelligence, by way of a moderator effect, affects the relationship between cybervictimization and emotional impact. Taken together, cybervictimization and perceived emotional intelligence explain much of the variance observed in the emotional impact in general and in the negative dimensions of that impact in particular. Attention and Repair were found to be inversely related to Annoyance and Dejection, and positively related to Invigoration. Clarity has the opposite pattern: a positive relationship with Annoyance and Dejection and an inverse relationship with Invigoration. Various hypothetical explanations of these patterns are discussed.

  16. Short-term memory for emotional faces in dysphoria.

    Science.gov (United States)

    Noreen, Saima; Ridout, Nathan

    2010-07-01

    The study aimed to determine if the memory bias for negative faces previously demonstrated in depression and dysphoria generalises from long- to short-term memory. A total of 29 dysphoric (DP) and 22 non-dysphoric (ND) participants were presented with a series of faces and asked to identify the emotion portrayed (happiness, sadness, anger, or neutral affect). Following a delay, four faces were presented (the original plus three distractors) and participants were asked to identify the target face. Half of the trials assessed memory for facial emotion, and the remaining trials examined memory for facial identity. At encoding, no group differences were apparent. At memory testing, relative to ND participants, DP participants exhibited impaired memory for all types of facial emotion and for facial identity when the faces featured happiness, anger, or neutral affect, but not sadness. DP participants exhibited impaired identity memory for happy faces relative to angry, sad, and neutral, whereas ND participants exhibited enhanced facial identity memory when faces were angry. In general, memory for faces was not related to performance at encoding. However, in DP participants only, memory for sad faces was related to sadness recognition at encoding. The results suggest that the negative memory bias for faces in dysphoria does not generalise from long- to short-term memory.

  17. Perceived vulnerability as a common basis of moral emotions.

    Science.gov (United States)

    Dijker, Anton J M

    2010-06-01

    It is theorized that many moral emotions are triggered when a mechanism for (parental) care is activated by perceived vulnerability, and changes in the care object's well-being are subsequently evaluated and causally attributed. Participants reported different moral emotions (tenderness, concern, sympathy, guilt, and moral anger) in relation to different photographs of males and females widely differing in age. Using variation between emotion objects, it was shown that emotional reactions were highly intercorrelated and strongly related to perceived vulnerability and aroused protective tendency; with children and elderly arousing the strongest, and adult males the weakest, emotions. Moreover, these intercorrelations largely disappeared when vulnerability and protective tendency were statistically controlled. Theoretical implications are discussed.

  18. BESST (Bochum Emotional Stimulus Set)--a pilot validation study of a stimulus set containing emotional bodies and faces from frontal and averted views.

    Science.gov (United States)

    Thoma, Patrizia; Soria Bauser, Denise; Suchan, Boris

    2013-08-30

    This article introduces the freely available Bochum Emotional Stimulus Set (BESST), which contains pictures of bodies and faces depicting either a neutral expression or one of the six basic emotions (happiness, sadness, fear, anger, disgust, and surprise), presented from two different perspectives (0° frontal view vs. camera averted by 45° to the left). The set comprises 565 frontal view and 564 averted view pictures of real-life bodies with masked facial expressions and 560 frontal and 560 averted view faces which were synthetically created using the FaceGen 3.5 Modeller. All stimuli were validated in terms of categorization accuracy and the perceived naturalness of the expression. Additionally, each facial stimulus was morphed into three age versions (20/40/60 years). The results show high recognition of the intended facial expressions, even under speeded forced-choice conditions, as is typical of common experimental settings. The average naturalness ratings for the stimuli range between medium and high. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  19. Neurophysiological evidence (ERPs) for hemispheric processing of facial expressions of emotions: Evidence from whole face and chimeric face stimuli.

    Science.gov (United States)

    Damaskinou, Nikoleta; Watling, Dawn

    2018-05-01

    This study was designed to investigate the patterns of electrophysiological responses of early emotional processing at frontocentral sites in adults and to explore whether adults' activation patterns show hemispheric lateralization for facial emotion processing. Thirty-five adults viewed full face and chimeric face stimuli. After viewing two faces, sequentially, participants were asked to decide which of the two faces was more emotive. The findings from the standard faces and the chimeric faces suggest that emotion processing is present during the early phases of face processing in the frontocentral sites. In particular, sad emotional faces are processed differently than neutral and happy (including happy chimeras) faces in these early phases of processing. Further, there were differences in the electrode amplitudes over the left and right hemisphere, particularly in the early temporal window. This research provides supporting evidence that the chimeric face test is a test of emotion processing that elicits right hemispheric processing.

  20. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.

    Science.gov (United States)

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 are more affected by the emotional value of the music administered in the emotional go/no-go task, and that this bias is also apparent in responses to the non-target emotional face. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.
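    A linear mixed-effects model of the kind mentioned here (N170 amplitude predicted by musicianship and the music's emotional value, with a random intercept per participant) can be sketched as follows; the column names and simulated data are assumptions for illustration, not the study's dataset.

        # Rough sketch of a linear mixed-effects model on simulated
        # "N170 amplitude" data, with a random intercept per subject.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        subjects = np.repeat(np.arange(24), 30)        # 24 participants, 30 trials each
        musician = (subjects < 12).astype(int)         # 12 musicians, 12 non-musicians
        emotion = rng.choice(["low", "high"], size=subjects.size)  # emotional salience of the music
        amplitude = (-4.0 - 1.5 * musician * (emotion == "high")
                     + rng.normal(0, 1.5, size=subjects.size))     # simulated N170 amplitude (microvolts)

        data = pd.DataFrame({"subject": subjects, "musician": musician,
                             "emotion": emotion, "amplitude": amplitude})

        model = smf.mixedlm("amplitude ~ musician * emotion", data, groups=data["subject"])
        print(model.fit().summary())

    The musician-by-emotion interaction term in such a model is what would capture the reported finding that the music's emotional value affects musicians' N170 more strongly.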

  1. Processing of emotional faces in social phobia

    Directory of Open Access Journals (Sweden)

    Nicole Kristjansen Rosenberg

    2011-02-01

    Full Text Available Previous research has found that individuals with social phobia differ from controls in their processing of emotional faces. For instance, people with social phobia show increased attention to briefly presented threatening faces. However, when exposure times are increased, the direction of this attentional bias is more unclear. Studies investigating eye movements have found both increased as well as decreased attention to threatening faces in socially anxious participants. The current study investigated eye movements to emotional faces in eight patients with social phobia and 34 controls. Three different tasks with different exposure durations were used, which allowed for an investigation of the time course of attention. At the early time interval, patients showed a complex pattern of both vigilance and avoidance of threatening faces. At the longest time interval, patients avoided the eyes of sad, disgust, and neutral faces more than controls, whereas there were no group differences for angry faces.

  2. Detecting and Categorizing Fleeting Emotions in Faces

    Science.gov (United States)

    Sweeny, Timothy D.; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A.

    2013-01-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d′ analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PMID:22866885
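    The d′ analysis mentioned here is a standard signal-detection measure. A minimal sketch, using the usual z(hits) − z(false alarms) formula with made-up rates (the paper's exact computation for its 2-interval task may differ):

        # Minimal sketch of a d' (sensitivity) computation; values are made up.
        # Uses the common z(hits) - z(false alarms) form; the paper's exact
        # correction for the 2-interval design may differ.
        from scipy.stats import norm

        def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
            """Signal-detection sensitivity: z(hits) - z(false alarms)."""
            return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

        # e.g., treating "angry" responses to angry faces as hits and
        # "angry" responses to happy faces as false alarms
        print(f"d' = {d_prime(0.82, 0.25):.2f}")  # approximately 1.59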

  3. Emotional facial expressions reduce neural adaptation to face identity.

    Science.gov (United States)

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

    In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems sustained with emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record the evoked steady-state visual potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to the same when compared with different face identities. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.

  4. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias

    Directory of Open Access Journals (Sweden)

    Sara Invitto

    2017-08-01

    Full Text Available Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 are more affected by the emotional value of the music administered in the emotional go/no-go task, and that this bias is also apparent in responses to the non-target emotional face. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.

  5. DAT by perceived MC interaction on human prefrontal activity and connectivity during emotion processing.

    Science.gov (United States)

    Taurisano, Paolo; Blasi, Giuseppe; Romano, Raffaella; Sambataro, Fabio; Fazio, Leonardo; Gelao, Barbara; Ursini, Gianluca; Lo Bianco, Luciana; Di Giorgio, Annabella; Ferrante, Francesca; Papazacharias, Apostolos; Porcelli, Annamaria; Sinibaldi, Lorenzo; Popolizio, Teresa; Bertolino, Alessandro

    2013-12-01

    Maternal care (MC) and dopamine modulate brain activity during emotion processing in inferior frontal gyrus (IFG), striatum and amygdala. Reuptake of dopamine from the synapse is performed by the dopamine transporter (DAT), whose abundance is predicted by variation in its gene (DAT 3'VNTR; 10 > 9-repeat alleles). Here, we investigated the interaction between perceived MC and DAT 3'VNTR genotype on brain activity during processing of aversive facial emotional stimuli. Sixty-one healthy subjects were genotyped for DAT 3'VNTR and categorized in low and high MC individuals. They underwent functional magnetic resonance imaging while performing a task requiring gender discrimination of facial stimuli with angry, fearful or neutral expressions. An interaction between facial expression, DAT genotype and MC was found in left IFG, such that low MC and homozygosity for the 10-repeat allele are associated with greater activity during processing of fearful faces. This greater activity was also inversely correlated with a measure of emotion control as scored with the Big Five Questionnaire. Moreover, MC and DAT genotype described a double dissociation on functional connectivity between IFG and amygdala. These findings suggest that perceived early parental bonding may interact with DAT 3'VNTR genotype in modulating brain activity during emotionally relevant inputs.

  6. Your emotion or mine: Labeling feelings alters emotional face perception - An ERP study on automatic and intentional affect labeling

    Directory of Open Access Journals (Sweden)

    Cornelia Herbert

    2013-07-01

    Full Text Available Empirical evidence suggests that words are powerful regulators of emotion processing. Although a number of studies have used words as contextual cues for emotion processing, the role of what is being labeled by the words (i.e. one’s own emotion as compared to the emotion expressed by the sender) is poorly understood. The present study reports results from two experiments which used ERP methodology to evaluate the impact of emotional faces and self- versus sender-related emotional pronoun-noun pairs (e.g. my fear vs. his fear) as cues for emotional face processing. The influence of self- and sender-related cues on the processing of fearful, angry and happy faces was investigated in two contexts: an automatic (experiment 1) and intentional affect labeling task (experiment 2), along with control conditions of passive face processing. ERP patterns varied as a function of the label’s reference (self vs. sender) and the intentionality of the labelling task (experiment 1 vs. experiment 2). In experiment 1, self-related labels increased the motivational relevance of the emotional faces in the time-window of the EPN component. Processing of sender-related labels improved emotion recognition specifically for fearful faces in the N170 time-window. Spontaneous processing of affective labels modulated later stages of face processing as well. Amplitudes of the late positive potential (LPP) were reduced for fearful, happy, and angry faces relative to the control condition of passive viewing. During intentional regulation (experiment 2), amplitudes of the LPP were enhanced for emotional faces when subjects used the self-related emotion labels to label their own emotion during face processing, and they rated the faces as higher in arousal than the emotional faces that had been presented in the label sender’s emotion condition or the passive viewing condition. The present results argue in favor of a differentiated view of language-as-context for emotion processing.

  7. Detecting and categorizing fleeting emotions in faces.

    Science.gov (United States)

    Sweeny, Timothy D; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A

    2013-02-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d' analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  8. Perceived differences between chimpanzee (Pan troglodytes) and human (Homo sapiens) facial expressions are related to emotional interpretation.

    Science.gov (United States)

    Waller, Bridget M; Bard, Kim A; Vick, Sarah-Jane; Smith Pasqualini, Marcia C

    2007-11-01

    Human face perception is a finely tuned, specialized process. When comparing faces between species, therefore, it is essential to consider how people make these observational judgments. Comparing facial expressions may be particularly problematic, given that people tend to consider them categorically as emotional signals, which may affect how accurately specific details are processed. The bared-teeth display (BT), observed in most primates, has been proposed as a homologue of the human smile (J. A. R. A. M. van Hooff, 1972). In this study, judgments of similarity between BT displays of chimpanzees (Pan troglodytes) and human smiles varied in relation to perceived emotional valence. When a chimpanzee BT was interpreted as fearful, observers tended to underestimate the magnitude of the relationship between certain features (the extent of lip corner raise) and human smiles. These judgments may reflect the combined effects of categorical emotional perception, configural face processing, and perceptual organization in mental imagery and may demonstrate the advantages of using standardized observational methods in comparative facial expression research. Copyright 2007 APA.

  9. Understanding emotional transitions: the interpersonal consequences of changing emotions in negotiations.

    Science.gov (United States)

    Filipowicz, Allan; Barsade, Sigal; Melwani, Shimul

    2011-09-01

    Research on the interpersonal functions of emotions has focused primarily on steady-state emotion rather than on emotional transitions, the movement between emotion states. The authors examined the influence of emotional transitions on social interactions and found that emotional transitions led to consistently different outcomes than their corresponding steady-state emotions. Across 2 computer-mediated negotiations and a face-to-face negotiation, participants negotiating with partners who displayed a "becoming angry" (happy to angry) emotional transition accepted worse negotiation outcomes yet formed better relational impressions of their partners than participants negotiating with partners who displayed steady-state anger. This relationship was mediated through 2 mechanisms: attributional and emotional contagion processes. The "becoming happy" (angry to happy) emotional transition as compared with steady-state happiness was not significantly related to differences in negotiation outcomes but was significantly related to differences in relational impressions, where perceivers of the "becoming happy" emotional transition gave their partners lower relational impression ratings than perceivers of steady-state happiness. PsycINFO Database Record (c) 2011 APA, all rights reserved.

  10. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    Science.gov (United States)

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

    The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter was assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. State-dependent alteration in face emotion recognition in depression.

    Science.gov (United States)

    Anderson, Ian M; Shippen, Clare; Juhasz, Gabriella; Chase, Diana; Thomas, Emma; Downey, Darragh; Toth, Zoltan G; Lloyd-Williams, Kathryn; Elliott, Rebecca; Deakin, J F William

    2011-04-01

    Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data after receiving a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.

  12. Similar representations of emotions across faces and voices.

    Science.gov (United States)

    Kuhn, Lisa Katharina; Wydell, Taeko; Lavan, Nadine; McGettigan, Carolyn; Garrido, Lúcia

    2017-09-01

    [Correction Notice: An Erratum for this article was reported in Vol 17(6) of Emotion (see record 2017-18585-001). In the article, the copyright attribution was incorrectly listed and the Creative Commons CC-BY license disclaimer was incorrectly omitted from the author note. The correct copyright is "© 2017 The Author(s)" and the omitted disclaimer is below. All versions of this article have been corrected. "This article has been published under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Copyright for this article is retained by the author(s). Author(s) grant(s) the American Psychological Association the exclusive right to publish the article and identify itself as the original publisher."] Emotions are a vital component of social communication, carried across a range of modalities and via different perceptual signals such as specific muscle contractions in the face and in the upper respiratory system. Previous studies have found that emotion recognition impairments after brain damage depend on the modality of presentation: recognition from faces may be impaired whereas recognition from voices remains preserved, and vice versa. On the other hand, there is also evidence for shared neural activation during emotion processing in both modalities. In a behavioral study, we investigated whether there are shared representations in the recognition of emotions from faces and voices. We used a within-subjects design in which participants rated the intensity of facial expressions and nonverbal vocalizations for each of the 6 basic emotion labels. For each participant and each modality, we then computed a representation matrix with the intensity ratings of each emotion. These matrices allowed us to examine the patterns of confusions between emotions and to characterize the representations

  13. Emotion categorization does not depend on explicit face categorization

    NARCIS (Netherlands)

    Seirafi, M.; de Weerd, P.; de Gelder, B.

    2013-01-01

    Face perception and emotion recognition have been extensively studied in the past decade; however, the relation between them is still poorly understood. A traditional view is that successful emotional categorization requires categorization of the stimulus as a ‘face', at least at the basic level.

  14. Digitizing the moving face: asymmetries of emotion and gender

    Directory of Open Access Journals (Sweden)

    Ashish Desai

    2009-04-01

    Full Text Available In a previous study with dextral males, Richardson and Bowers (1999) digitized real-time video signals and found movement asymmetries over the left lower face for emotional, but not non-emotional, expressions. These findings correspond to observations, based on subjective ratings of static pictures, that the left side of the face is more intensely expressive than the right (Sackeim, 1978). From a neuropsychological perspective, one possible interpretation of these findings is that emotional priming of the right hemisphere of the brain results in more muscular activity over the contralateral left than the ipsilateral right side of the lower face. The purpose of the present study was to use computer-imaging methodology to determine whether there were gender differences in movement asymmetries across the face. We hypothesized that females would show less evidence of facial movement asymmetries during the expression of emotion. This hypothesis was based on findings of gender differences in the degree to which specific cognitive functions may be lateralized in the brain (i.e., females less lateralized than males). Forty-eight normal dextral college students (25 females, 23 males) were videotaped while they displayed voluntary emotional expressions. A quantitative measure of movement change (called entropy) was computed by subtracting the values of corresponding pixel intensities between adjacent frames and summing their differences. The upper and lower hemiface regions were examined separately due to differences in the cortical innervation of facial muscles in the upper (bilateral) versus lower (contralateral) face. Repeated measures ANOVAs were used to analyze the amount of overall facial movement and facial asymmetries. Certain emotions were associated with significantly greater overall facial movement than others (fear > (angry = sad) > neutral). Both males and females showed this same pattern, with no gender differences in the total amount of facial
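    The movement measure described in this record is computationally simple; the following is a rough illustration only, in which the array shapes, the grayscale assumption, and the mid-height hemiface split are my assumptions rather than the authors' exact pipeline.

        # Minimal sketch of a frame-difference movement measure: absolute
        # pixel-intensity differences between adjacent video frames are summed,
        # here also split into upper and lower hemiface regions.
        import numpy as np

        def movement_entropy(frames: np.ndarray) -> float:
            """frames: (n_frames, height, width) grayscale intensities."""
            diffs = np.abs(np.diff(frames.astype(float), axis=0))
            return float(diffs.sum())

        def hemiface_entropy(frames: np.ndarray) -> tuple[float, float]:
            """Split each frame at mid-height and return (upper, lower) movement totals."""
            mid = frames.shape[1] // 2
            return movement_entropy(frames[:, :mid, :]), movement_entropy(frames[:, mid:, :])

        # Example with random stand-in "video" data (90 frames of 240 x 180 pixels)
        video = np.random.default_rng(2).integers(0, 256, size=(90, 240, 180))
        upper, lower = hemiface_entropy(video)
        print(upper, lower)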

  15. Effect of Perceived Politics and Perceived Support on Bullying and Emotional Exhaustion: The Moderating Role of Type A Personality.

    Science.gov (United States)

    Naseer, Saima; Raja, Usman; Donia, Magda Bezerra Leite

    2016-07-03

    Recognizing that bullying can occur in varying degrees of severity, the current study suggests the importance of individual traits in perceptions of being a target of bullying and the ensuing emotional exhaustion. The present study extends the work environment hypothesis and trait activation theory by jointly investigating (a) the mediating role of workplace bullying in linking perceived organization politics and perceived organization support with emotional exhaustion and (b) the moderating role of the Type A behavioral pattern in this mediation. Using a field sample of 262 employees working in different organizations in Pakistan, this study tested a moderated mediation model. Results were consistent with the hypothesized model, in that workplace bullying mediated the relationship of perceived organization politics and perceived organization support with emotional exhaustion. Type A behavior moderated the perceived politics-bullying, perceived support-bullying, and bullying-emotional exhaustion relationships. The mediation of bullying varied with levels of Type A behavior in these relationships.
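    The moderation component of a moderated-mediation model like the one described is commonly tested as an interaction term in a regression. The sketch below uses hypothetical variable names (politics, type_a, bullying) and simulated data; it is not the authors' analysis.

        # Hypothetical sketch: does Type A behaviour moderate the
        # politics -> bullying path? A significant politics:type_a
        # coefficient would indicate moderation. Data are simulated.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n = 262  # sample size reported in the abstract
        politics = rng.normal(size=n)
        type_a = rng.normal(size=n)
        bullying = 0.3 * politics + 0.2 * type_a + 0.25 * politics * type_a + rng.normal(size=n)
        df = pd.DataFrame({"politics": politics, "type_a": type_a, "bullying": bullying})

        fit = smf.ols("bullying ~ politics * type_a", data=df).fit()
        print(fit.summary().tables[1])  # inspect the politics:type_a row

    In a full moderated-mediation analysis this interaction test would be combined with the indirect path and typically bootstrapped; the sketch shows only the moderation step.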

  16. Disrupted neural processing of emotional faces in psychopathy.

    Science.gov (United States)

    Contreras-Rodríguez, Oren; Pujol, Jesus; Batalla, Iolanda; Harrison, Ben J; Bosque, Javier; Ibern-Regàs, Immaculada; Hernández-Ribas, Rosa; Soriano-Mas, Carles; Deus, Joan; López-Solà, Marina; Pifarré, Josep; Menchón, José M; Cardoner, Narcís

    2014-04-01

    Psychopaths show a reduced ability to recognize emotion facial expressions, which may disturb interpersonal relationship development and successful social adaptation. Behavioral hypotheses point toward an association between emotion recognition deficits in psychopathy and amygdala dysfunction. Our prediction was that amygdala dysfunction would combine deficient activation with disturbances in functional connectivity with cortical regions of the face-processing network. Twenty-two psychopaths and 22 control subjects were assessed and functional magnetic resonance maps were generated to identify both brain activation and task-induced functional connectivity using psychophysiological interaction analysis during an emotional face-matching task. Results showed significant amygdala activation in control subjects only, but differences between study groups did not reach statistical significance. In contrast, psychopaths showed significantly increased activation in visual and prefrontal areas, with this latter activation being associated with psychopaths' affective-interpersonal disturbances. Psychophysiological interaction analyses revealed a reciprocal reduction in functional connectivity between the left amygdala and visual and prefrontal cortices. Our results suggest that emotional stimulation may evoke a relevant cortical response in psychopaths, but a disruption in the processing of emotional faces exists involving the reciprocal functional interaction between the amygdala and neocortex, consistent with the notion of a failure to integrate emotion into cognition in psychopathic individuals.

  17. Emotional facial expressions differentially influence predictions and performance for face recognition.

    Science.gov (United States)

    Nomi, Jason S; Rhodes, Matthew G; Cleary, Anne M

    2013-01-01

    This study examined how participants' predictions of future memory performance are influenced by emotional facial expressions. Participants made judgements of learning (JOLs) predicting the likelihood that they would correctly identify a face displaying a happy, angry, or neutral emotional expression in a future two-alternative forced-choice recognition test of identity (i.e., recognition that a person's face was seen before). JOLs were higher for studied faces with happy and angry emotional expressions than for neutral faces. However, neutral test faces with studied neutral expressions had significantly higher identity recognition rates than neutral test faces studied with happy or angry expressions. Thus, these data are the first to demonstrate that people believe happy and angry emotional expressions will lead to better identity recognition in the future relative to neutral expressions. This occurred despite the fact that neutral expressions elicited better identity recognition than happy and angry expressions. These findings contribute to the growing literature examining the interaction of cognition and emotion.

  18. State anxiety and emotional face recognition in healthy volunteers

    OpenAIRE

    Attwood, Angela S.; Easey, Kayleigh E.; Dalili, Michael N.; Skinner, Andrew L.; Woods, Andy; Crick, Lana; Ilett, Elizabeth; Penton-Voak, Ian S.; Munafò, Marcus R.

    2017-01-01

    High trait anxiety has been associated with detriments in emotional face processing. By contrast, relatively little is known about the effects of state anxiety on emotional face processing. We investigated the effects of state anxiety on recognition of emotional expressions (anger, sadness, surprise, disgust, fear and happiness) experimentally, using the 7.5% carbon dioxide (CO2) model to induce state anxiety, and in a large observational study. The experimental studies indicated reduced glob...

  19. Confidence in emotion perception in point-light displays varies with the ability to perceive own emotions.

    Science.gov (United States)

    Lorey, Britta; Kaletsch, Morten; Pilgramm, Sebastian; Bischoff, Matthias; Kindermann, Stefan; Sauerbier, Isabell; Stark, Rudolf; Zentgraf, Karen; Munzert, Jörn

    2012-01-01

    One central issue in social cognitive neuroscience is that perceiving emotions in others relates to activating the same emotion in oneself. In this study we sought to examine how the ability to perceive one's own emotions, assessed with the Toronto Alexithymia Scale, related to both the ability to perceive emotions depicted in point-light displays and the confidence in these perceptions. Participants observed video scenes of human interactions, rated the depicted valence, and judged their confidence in this rating. Results showed that people with higher alexithymia scores were significantly less confident about their decisions, but did not differ from people with lower alexithymia scores in the valence of their ratings. Furthermore, no modulating effect of social context on the effect of higher alexithymia scores was found. It is concluded that the stimuli used are suitable for investigating the kinematic aspect of emotion perception and may separate people with high and low alexithymia scores via differences in confidence. However, a general difference in emotion perception was not detected in the present setting.

  20. Confidence in emotion perception in point-light displays varies with the ability to perceive own emotions.

    Directory of Open Access Journals (Sweden)

    Britta Lorey

    Full Text Available One central issue in social cognitive neuroscience is that perceiving emotions in others relates to activating the same emotion in oneself. In this study we sought to examine how the ability to perceive one's own emotions, assessed with the Toronto Alexithymia Scale, related to both the ability to perceive emotions depicted in point-light displays and the confidence in these perceptions. Participants observed video scenes of human interactions, rated the depicted valence, and judged their confidence in this rating. Results showed that people with higher alexithymia scores were significantly less confident about their decisions, but did not differ from people with lower alexithymia scores in the valence of their ratings. Furthermore, no modulating effect of social context on the effect of higher alexithymia scores was found. It is concluded that the stimuli used are suitable for investigating the kinematic aspect of emotion perception and may separate people with high and low alexithymia scores via differences in confidence. However, a general difference in emotion perception was not detected in the present setting.

  1. Social anhedonia is associated with neural abnormalities during face emotion processing.

    Science.gov (United States)

    Germine, Laura T; Garrido, Lucia; Bruce, Lori; Hooker, Christine

    2011-10-01

    Human beings are social organisms with an intrinsic desire to seek and participate in social interactions. Social anhedonia is a personality trait characterized by a reduced desire for social affiliation and reduced pleasure derived from interpersonal interactions. Abnormally high levels of social anhedonia prospectively predict the development of schizophrenia and contribute to poorer outcomes for schizophrenia patients. Despite the strong association between social anhedonia and schizophrenia, the neural mechanisms that underlie individual differences in social anhedonia have not been studied and are thus poorly understood. Deficits in face emotion recognition are related to poorer social outcomes in schizophrenia, and it has been suggested that face emotion recognition deficits may be a behavioral marker for schizophrenia liability. In the current study, we used functional magnetic resonance imaging (fMRI) to see whether there are differences in the brain networks underlying basic face emotion processing in a community sample of individuals low vs. high in social anhedonia. We isolated the neural mechanisms related to face emotion processing by comparing face emotion discrimination with four other baseline conditions (identity discrimination of emotional faces, identity discrimination of neutral faces, object discrimination, and pattern discrimination). Results showed a group (high/low social anhedonia) × condition (emotion discrimination/control condition) interaction in the anterior portion of the rostral medial prefrontal cortex, right superior temporal gyrus, and left somatosensory cortex. As predicted, high (relative to low) social anhedonia participants showed less neural activity in face emotion processing regions during emotion discrimination as compared to each control condition. The findings suggest that social anhedonia is associated with abnormalities in networks responsible for basic processes associated with social cognition, and provide a

  2. Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa.

    Science.gov (United States)

    Vesker, Michael; Bahn, Daniela; Kauschke, Christina; Tschense, Monika; Degé, Franziska; Schwarzer, Gudrun

    2018-01-01

    In order to assess how the perception of audible speech and facial expressions influence one another for the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-year-olds), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar categorization task, but with faces acting as visual primes, and emotion words acting as auditory targets. The results of Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words versus positive words, and that 6-year-old children are particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Meanwhile, the results of Experiment 2 did not show any congruency effects for priming by facial expressions. Thus, audible emotion words seem to exert an influence on the emotional categorization of faces, while faces do not seem to influence the categorization of emotion words in a significant way.

  3. KDEF-PT: Valence, Emotional Intensity, Familiarity and Attractiveness Ratings of Angry, Neutral, and Happy Faces.

    Science.gov (United States)

    Garrido, Margarida V; Prada, Marília

    2017-01-01

    The Karolinska Directed Emotional Faces (KDEF) is one of the most widely used human facial expression databases. Almost a decade after the original validation study (Goeleven et al., 2008), we present subjective rating norms for a sub-set of 210 pictures which depict 70 models (half female), each displaying angry, happy, and neutral facial expressions. Our main goals were to provide an additional and updated validation to this database, using a sample from a different nationality (N = 155 Portuguese students, M = 23.73 years old, SD = 7.24) and to extend the number of subjective dimensions used to evaluate each image. Specifically, participants reported emotional labeling (forced-choice task) and evaluated the emotional intensity and valence of the expression, as well as the attractiveness and familiarity of the model (7-point rating scales). Overall, results show that happy faces obtained the highest ratings across evaluative dimensions and emotion labeling accuracy. Female (vs. male) models were perceived as more attractive, familiar and positive. The sex of the model also moderated the accuracy of emotional labeling and ratings of different facial expressions. Each picture of the set was categorized as low, moderate, or high for each dimension. Normative data for each stimulus (hits proportion, means, standard deviations, and confidence intervals per evaluative dimension) is available as supplementary material (available at https://osf.io/fvc4m/).
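    Normative tables of the kind described (hit proportion, mean, standard deviation, and confidence interval per picture and evaluative dimension) can be compiled with a simple group-by. The column names, toy data, and the 1.96 normal-approximation interval below are assumptions for illustration, not the authors' exact procedure.

        # Sketch: per-stimulus normative statistics from raw ratings.
        # Column names and the tiny example table are hypothetical.
        import numpy as np
        import pandas as pd

        ratings = pd.DataFrame({
            "picture": ["F01_happy", "F01_happy", "M02_angry", "M02_angry"],
            "valence": [6, 7, 2, 3],              # 7-point rating scale
            "label_correct": [1, 1, 1, 0],        # forced-choice labeling hit (1) or miss (0)
        })

        def ci95_halfwidth(x: pd.Series) -> float:
            # Normal-approximation 95% CI half-width around the mean
            return 1.96 * x.std(ddof=1) / np.sqrt(len(x))

        norms = ratings.groupby("picture").agg(
            hits=("label_correct", "mean"),
            valence_mean=("valence", "mean"),
            valence_sd=("valence", "std"),
            valence_ci=("valence", ci95_halfwidth),
        )
        print(norms)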

  4. KDEF-PT: Valence, Emotional Intensity, Familiarity and Attractiveness Ratings of Angry, Neutral, and Happy Faces

    Directory of Open Access Journals (Sweden)

    Margarida V. Garrido

    2017-12-01

    Full Text Available The Karolinska Directed Emotional Faces (KDEF) is one of the most widely used human facial expression databases. Almost a decade after the original validation study (Goeleven et al., 2008), we present subjective rating norms for a sub-set of 210 pictures which depict 70 models (half female), each displaying angry, happy, and neutral facial expressions. Our main goals were to provide an additional and updated validation to this database, using a sample from a different nationality (N = 155 Portuguese students, M = 23.73 years old, SD = 7.24) and to extend the number of subjective dimensions used to evaluate each image. Specifically, participants reported emotional labeling (forced-choice task) and evaluated the emotional intensity and valence of the expression, as well as the attractiveness and familiarity of the model (7-point rating scales). Overall, results show that happy faces obtained the highest ratings across evaluative dimensions and emotion labeling accuracy. Female (vs. male) models were perceived as more attractive, familiar and positive. The sex of the model also moderated the accuracy of emotional labeling and ratings of different facial expressions. Each picture of the set was categorized as low, moderate, or high for each dimension. Normative data for each stimulus (hits proportion, means, standard deviations, and confidence intervals per evaluative dimension) is available as supplementary material (available at https://osf.io/fvc4m/).

  5. Willingness to express emotion depends upon perceiving partner care.

    Science.gov (United States)

    Von Culin, Katherine R; Hirsch, Jennifer L; Clark, Margaret S

    2017-06-01

    Two studies document that people are more willing to express emotions that reveal vulnerabilities to partners when they perceive those partners to be more communally responsive to them. In Study 1, participants rated the communal strength they thought various partners felt toward them and their own willingness to express happiness, sadness and anxiety to each partner. Individuals who generally perceive high communal strength from their partners were also generally most willing to express emotion to partners. Independently, participants were more willing to express emotion to particular partners whom they perceived felt more communal strength toward them. In Study 2, members of romantic couples independently reported their own felt communal strength toward one another, perceptions of their partners' felt communal strength toward them, and willingness to express emotions (happiness, sadness, anxiety, disgust, anger, hurt and guilt) to each other. The communal strength partners reported feeling toward the participants predicted the participants' willingness to express emotion to those partners. This link was mediated by participants' perceptions of the partner's communal strength toward them which, itself, was a joint function of accurate perceptions of the communal strength partners had reported feeling toward them and projections of their own felt communal strength for their partners onto those partners.

  6. Emotional regulation, perceived social alienation and well-being of ...

    African Journals Online (AJOL)

    Emotional regulation, perceived social alienation and well-being of students in a Nigerian University: implications for assessment and coaching of emotional regulation. A O Ojedokun. Abstract. No Abstract. African Journal for the psychological studies of social issue Vol. 10 (1&2) 2007: pp. 76-90.

  7. Emotional faces and the default mode network.

    Science.gov (United States)

    Sreenivas, S; Boehm, S G; Linden, D E J

    2012-01-11

    The default-mode network (DMN) of the human brain has become a central topic of cognitive neuroscience research. Although alterations in its resting state activity and in its recruitment during tasks have been reported for several mental and neurodegenerative disorders, its role in emotion processing has received relatively little attention. We investigated brain responses to different categories of emotional faces with functional magnetic resonance imaging (fMRI) and found deactivation in ventromedial prefrontal cortex (VMPFC), posterior cingulate gyrus (PC) and cuneus. This deactivation was modulated by emotional category and was less prominent for happy than for sad faces. These deactivated areas along the midline conformed to areas of the DMN. We also observed emotion-dependent deactivation of the left middle frontal gyrus, which is not a classical component of the DMN. Conversely, several areas in a fronto-parietal network commonly linked with attention were differentially activated by emotion categories. Functional connectivity patterns, as obtained by correlation of activation levels, also varied between emotions. VMPFC, PC or cuneus served as hubs between the DMN-type areas and the fronto-parietal network. These data support recent suggestions that the DMN is not a unitary system but differentiates according to task and even type of stimulus. The emotion-specific differential pattern of DMN deactivation may be explored further in patients with mood disorder, where the quest for biological markers of emotional biases is still ongoing. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  8. Implicit conditioning of faces via the social regulation of emotion: ERP evidence of early attentional biases for security conditioned faces.

    Science.gov (United States)

    Beckes, Lane; Coan, James A; Morris, James P

    2013-08-01

    Not much is known about the neural and psychological processes that promote the initial conditions necessary for positive social bonding. This study explores one method of conditioned bonding utilizing dynamics related to the social regulation of emotion and attachment theory. This form of conditioning involves repeated presentations of negative stimuli followed by images of warm, smiling faces. L. Beckes, J. Simpson, and A. Erickson (2010) found that this conditioning procedure results in positive associations with the faces measured via a lexical decision task, suggesting they are perceived as comforting. This study found that the P1 ERP was similarly modified by this conditioning procedure and the P1 amplitude predicted lexical decision times to insecure words primed by the faces. The findings have implications for understanding how the brain detects supportive people, the flexibility and modifiability of early ERP components, and social bonding more broadly. Copyright © 2013 Society for Psychophysiological Research.

  9. Achievement goals and emotions: The mediational roles of perceived progress, control, and value.

    Science.gov (United States)

    Hall, Nathan C; Sampasivam, Lavanya; Muis, Krista R; Ranellucci, John

    2016-06-01

    The link between achievement goals and achievement emotions is well established; however, research exploring potential mediators of this relationship is lacking. The control-value theory of achievement emotions (Pekrun, 2006, Educational Psychology Review, 18, 315) posits that perceptions of control and value mediate the relationship between achievement goals and achievement emotions, whereas the bidirectional theory of affect (Linnenbrink & Pintrich, 2002, Educational Psychologist, 37, 69) proposes that perceived progress mediates this relationship. The present study empirically evaluated three hypothesized mediators of the effects of achievement goals on learning-related emotions as proposed in the control-value theory and the bidirectional theory of affect. Undergraduate students (N = 273) from humanities, social science, and STEM disciplines participated. Participants completed web-based questionnaires evaluating academic achievement goals, perceptions of control, perceived task value, and achievement emotions. Results provided empirical support primarily for perceived progress as a mediator of mastery-approach goal effects on positive emotions (enjoyment, hope), showing indirect effects of mastery- and performance-approach goals on outcome-related emotions (hope, anxiety) via perceived control. Indirect effects of mastery- and performance-approach goals were further observed on anxiety via perceived value, with higher value levels predicting greater anxiety. Study findings partially support Linnenbrink and Pintrich's (2002, Educational Psychologist, 37, 69) bidirectional theory of affect while underscoring the potential for indirect effects of goals on emotions through perceived control as proposed by Pekrun (2006, Educational Psychology Review, 18, 315). © 2016 The British Psychological Society.
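    The mediation logic tested here (an indirect effect of achievement goals on emotions through perceived progress, control, or value) is typically quantified as the product of the X→M and M→Y regression paths, with a bootstrap confidence interval. The sketch below shows that generic computation on synthetic data; the variable names (goal, progress, enjoyment) are hypothetical, and this is not the authors' analysis.

```python
# Generic bootstrap test of an indirect effect (X -> M -> Y).
# Synthetic data; variable names are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 273
goal = rng.normal(size=n)                   # e.g., mastery-approach goal endorsement
progress = 0.5 * goal + rng.normal(size=n)  # hypothetical mediator
enjoyment = 0.4 * progress + 0.1 * goal + rng.normal(size=n)

def ols(design, y):
    """Least-squares coefficients for a design matrix with an intercept column."""
    return np.linalg.lstsq(design, y, rcond=None)[0]

def indirect(idx):
    a = ols(np.column_stack([np.ones(len(idx)), goal[idx]]), progress[idx])[1]
    b = ols(np.column_stack([np.ones(len(idx)), goal[idx], progress[idx]]),
            enjoyment[idx])[2]
    return a * b  # indirect (mediated) effect = a-path * b-path

point = indirect(np.arange(n))
boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {point:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```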

  10. Emotional style, health and perceived quality of life in pregnant women

    Directory of Open Access Journals (Sweden)

    Leticia Guarino

    2013-08-01

    Full Text Available The aim of the present study is to determine the possible relationship among emotional style (rumination and emotional inhibition) and the perceived health and quality of life of pregnant women. To do so, a sample of 94 Venezuelan women in their first trimester of pregnancy completed questionnaires measuring the studied variables: Rumination, Emotional Inhibition, Global Health and perceived Quality of Life. Results support previous findings regarding the positive association between a negative emotional style and deterioration of health status, while bringing new evidence of an inverse relationship between these individual differences and quality of life in this particular group, which has been little studied in its psychosocial dimension.

  11. Neuropsychology of facial expressions. The role of consciousness in processing emotional faces

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2012-04-01

    Full Text Available Neuropsychological studies have underlined the presence of distinct brain correlates dedicated to analyzing facial expressions of emotion. Some cerebral circuits appear to be specific for emotional face comprehension as a function of conscious vs. unconscious processing of emotional information. Moreover, the emotional content of faces (i.e., positive vs. negative; more or less arousing) may activate specific cortical networks. Among other findings, recent studies have described the contribution of the two hemispheres to face comprehension as a function of the type of emotion (mainly the distinction between positive and negative) and of the specific task (comprehending vs. producing facial expressions). Specifically, an overview of ERP (event-related potential) analyses is proposed in order to understand how a face may be processed by an observer and how the observer can make the face a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces under conscious vs. unconscious processing.

  12. Categorical Perception of emotional faces is not affected by aging

    Directory of Open Access Journals (Sweden)

    Mandy Rossignol

    2009-11-01

    Full Text Available Effects of normal aging on categorical perception (CP) of facial emotional expressions were investigated. One hundred healthy participants (20 to 70 years old; five age groups) had to identify morphed expressions ranging from neutrality to happiness, sadness and fear. We analysed percentages and latencies of correct recognition for non-morphed emotional expressions, percentages and latencies of emotional recognition for morphed faces, the locus of the boundaries along the different continua, and the number of intrusions. The results showed that unmorphed happy and fearful faces were better processed than unmorphed sad and neutral faces. For morphed faces, CP was confirmed, as latencies increased as a function of the distance between the displayed morph and the original unmorphed photograph. The locus of categorical boundaries was not affected by age. Aging altered neither the accuracy of recognition for original pictures, nor the emotional recognition of morphed faces, nor the rate of intrusions. However, response latencies increased with age for both unmorphed and morphed pictures. In conclusion, CP of facial expressions appears to be spared in aging.
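    The "locus of the boundaries along the different continua" referred to above is standardly estimated by fitting a sigmoid (for example a cumulative Gaussian) to identification proportions along the morph continuum and reading off the 50% point. The sketch below uses made-up data and is only an illustration of that step, not the authors' analysis.

```python
# Estimating a categorical boundary on a morph continuum by fitting a
# cumulative Gaussian to identification proportions (made-up data).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

morph = np.linspace(0.0, 1.0, 9)   # 0 = neutral, 1 = full emotion (hypothetical scale)
p_emotion = np.array([0.02, 0.05, 0.10, 0.30, 0.55, 0.80, 0.93, 0.97, 0.99])

def psychometric(x, mu, sigma):
    """Cumulative Gaussian: mu is the category boundary, sigma its steepness."""
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(psychometric, morph, p_emotion, p0=[0.5, 0.1])
print(f"category boundary (50% point): {mu:.2f}, slope parameter: {sigma:.2f}")
```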

  13. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    Science.gov (United States)

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top+happy bottom), incongruent composite configurations (e.g., angry top+happy bottom), and isolated features (e.g. happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part but ignored information in the other non-target part. Results indicate that the recognition of happy and anger expressions is neither strictly holistic nor analytic. Both routes were involved, but with a different role for analytic and holistic information depending on the emotion type, and different weights of local features between happy and anger expressions. Dissociable neural pathways were engaged depending on emotional face configurations. In particular, regions within the face processing network differed in their sensitivity to holistic expression information, which predominantly activated fusiform, inferior occipital areas and amygdala when internal features were congruent (i.e. template matching), whereas more local analysis of independent features preferentially engaged STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and pulvinar when seen in isolated parts. Collectively, these findings suggest that facial emotion recognition recruits separate, but interactive dorsal and ventral routes within the face processing networks, whose engagement may be shaped by

  14. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Science.gov (United States)

    Clayson, Peter E; Larson, Michael J

    2013-01-01

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.
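    Conflict adaptation on response times is conventionally computed as the congruency effect (incongruent minus congruent RT) after congruent trials minus the same effect after incongruent trials. The sketch below shows that computation on synthetic trial data; it is not the authors' pipeline.

```python
# Congruency and conflict-adaptation effects on reaction times (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
n_trials = 400
congruent = rng.integers(0, 2, n_trials).astype(bool)
rt = 550 + 40 * (~congruent) + rng.normal(0, 60, n_trials)  # incongruent = slower

cur, prev, rt_cur = congruent[1:], congruent[:-1], rt[1:]

def congruency_effect(mask):
    """Incongruent minus congruent mean RT within the selected trials."""
    return rt_cur[mask & ~cur].mean() - rt_cur[mask & cur].mean()

ce_after_c = congruency_effect(prev)    # congruency effect after congruent trials
ce_after_i = congruency_effect(~prev)   # congruency effect after incongruent trials
print(f"congruency effect after congruent trials:   {ce_after_c:5.1f} ms")
print(f"congruency effect after incongruent trials: {ce_after_i:5.1f} ms")
print(f"conflict adaptation (difference):           {ce_after_c - ce_after_i:5.1f} ms")
```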

  15. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Directory of Open Access Journals (Sweden)

    Peter E Clayson

    Full Text Available The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.

  16. Emotional faces influence evaluation of natural and transformed food.

    Science.gov (United States)

    Manippa, Valerio; Padulo, Caterina; Brancucci, Alfredo

    2018-07-01

    Previous evidence has shown a direct relationship between feeding behavior and emotions. Despite that, no studies have focused on the influence of emotional faces on food processing. In our study, participants were presented with 72 pairs of visual stimuli composed of a neutral, happy, or disgusted face (5000 ms duration in Experiment 1, adaptation; 150 ms in Experiment 2, priming) followed by a food stimulus (1500 ms). Food stimuli were grouped into pleasant foods, further divided into natural and transformed, and unpleasant rotten foods. The task consisted of judging the food valence (as 'pleasant' or 'unpleasant') by keypress. Results showed a different pattern of response depending on the transformation level of the food. In general, natural foods were evaluated more rapidly than transformed foods, possibly because of their simplicity and perceived healthiness. In addition, transformed foods yielded responses incongruent with the preceding emotional face, whereas natural foods yielded congruent responses. These effects were independent of the duration of the emotional face (i.e., adaptation or priming paradigm) and may depend on the salience of pleasant food stimuli.

  17. The Perceived Invalidation of Emotion Scale (PIES): Development and psychometric properties of a novel measure of current emotion invalidation.

    Science.gov (United States)

    Zielinski, Melissa J; Veilleux, Jennifer C

    2018-05-24

    Emotion invalidation is theoretically and empirically associated with mental and physical health problems. However, existing measures of invalidation focus on past (e.g., childhood) invalidation and/or do not specifically emphasize invalidation of emotion. In this article, the authors articulate a clarified operational definition of emotion invalidation and use that definition as the foundation for development of a new measure of current perceived emotion invalidation across a series of five studies. Study 1 was a qualitative investigation of people's experiences with emotional invalidation from which we generated items. An initial item pool was vetted by expert reviewers in Study 2 and examined via exploratory factor analysis in Study 3 within both college student and online samples. The scale was reduced to 10 items via confirmatory factor analysis in Study 4, resulting in a brief but psychometrically promising measure, the Perceived Invalidation of Emotion Scale (PIES). A short-term longitudinal investigation (Study 5) revealed that PIES scores had strong test-retest reliability, and that greater perceived emotion invalidation was associated with greater emotion dysregulation, borderline features and symptoms of emotional distress. In addition, the PIES predicted changes in relational health and psychological health over a 1-month period. The current set of studies thus presents a psychometrically promising and practical measure of perceived emotion invalidation that can provide a foundation for future research in this burgeoning area. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
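    Internal consistency and test-retest reliability of a brief scale like the PIES are standard psychometric quantities. The sketch below computes Cronbach's alpha and a test-retest correlation on synthetic item responses, purely to illustrate the statistics named above; it uses neither the PIES items nor the authors' data.

```python
# Cronbach's alpha and test-retest correlation for a 10-item scale
# (synthetic responses; illustration only).
import numpy as np

rng = np.random.default_rng(3)
n_respondents, n_items = 200, 10
trait = rng.normal(size=(n_respondents, 1))                        # latent score
time1 = trait + rng.normal(0, 0.8, size=(n_respondents, n_items))  # assessment 1
time2 = trait + rng.normal(0, 0.8, size=(n_respondents, n_items))  # assessment 2

def cronbach_alpha(items):
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

retest_r = np.corrcoef(time1.sum(axis=1), time2.sum(axis=1))[0, 1]
print(f"Cronbach's alpha (time 1): {cronbach_alpha(time1):.2f}")
print(f"test-retest r:             {retest_r:.2f}")
```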

  18. A leftward bias however you look at it: Revisiting the emotional chimeric face task as a tool for measuring emotion lateralization.

    Science.gov (United States)

    R Innes, Bobby; Burt, D Michael; Birch, Yan K; Hausmann, Markus

    2015-12-28

    Left hemiface biases observed within the Emotional Chimeric Face Task (ECFT) support emotional face perception models whereby all expressions are preferentially processed by the right hemisphere. However, previous research using this task has not considered that the visible midline between hemifaces might engage atypical facial emotion processing strategies in upright or inverted conditions, nor controlled for left visual field (thus right hemispheric) visuospatial attention biases. This study used novel emotional chimeric faces (blended at the midline) to examine laterality biases for all basic emotions. Left hemiface biases were demonstrated across all emotional expressions and were reduced, but not reversed, for inverted faces. The ECFT bias in upright faces was significantly increased in participants with a large attention bias. These results support the theory that left hemiface biases reflect a genuine bias in emotional face processing, and this bias can interact with attention processes similarly localized in the right hemisphere.

  19. Emotion perception accuracy and bias in face-to-face versus cyberbullying.

    Science.gov (United States)

    Ciucci, Enrica; Baroncelli, Andrea; Nowicki, Stephen

    2014-01-01

    The authors investigated the association of traditional and cyber forms of bullying and victimization with emotion perception accuracy and emotion perception bias. Four basic emotions were considered (i.e., happiness, sadness, anger, and fear); 526 middle school students (280 females; M age = 12.58 years, SD = 1.16 years) were recruited, and emotionality was controlled. Results indicated no significant findings for girls. Boys with higher levels of traditional bullying did not show any deficit in perception accuracy of emotions, but they were prone to identify happiness and fear in faces when a different emotion was expressed; in addition, male cyberbullying was related to greater accuracy in recognizing fear. In terms of the victims, cyber victims had a global problem in recognizing emotions and a specific problem in processing anger and fear. It was concluded that emotion perception accuracy and bias were associated with bullying and victimization for boys not only in traditional settings but also in the electronic ones. Implications of these findings for possible intervention are discussed.

  20. Perceived social support and emotional exhaustion in HIV/AIDS ...

    African Journals Online (AJOL)

    Counsellors have been identified as a group of professionals at elevated risk of burnout in general and emotional exhaustion in particular. Considering the nature of the illness, ... the quality of the services they provide. Key words: Emotional exhaustion, perceived social support, burnout syndrome, demographic variables.

  1. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery.

    Science.gov (United States)

    Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T

    2012-12-01

    Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize

  2. Are neutral faces of children really emotionally neutral?

    OpenAIRE

    小松, 佐穂子; 箱田, 裕司; Komatsu, Sahoko; Hakoda, Yuji

    2012-01-01

    In this study, we investigated whether people recognize emotions from neutral faces of children (11 to 13 years old). We took facial images of 53 male and 54 female Japanese children who had been asked to keep a neutral facial expression. Then, we conducted an experiment in which 43 participants (19 to 34 years old) rated the strength of four emotions (happiness, surprise, sadness, and anger) for the facial images, using a 7-point scale. We found that (a) they rated both male and female face...

  3. Emotional competence relating to perceived stress and burnout in Spanish teachers: a mediator model

    OpenAIRE

    Lourdes Rey; Natalio Extremera; Mario Pena

    2016-01-01

    This study examined direct associations between emotional competence, perceived stress and burnout in 489 Spanish teachers. In addition, a model in which perceived stress mediated pathways linking emotional competence to teacher burnout symptoms was also examined. Results showed that emotional competence and stress were significantly correlated with teacher burnout symptoms in the expected direction. Moreover, mediational analysis indicated that perceived stress partly mediated the relationsh...

  4. Interactions between facial emotion and identity in face processing: evidence based on redundancy gains.

    Science.gov (United States)

    Yankouskaya, Alla; Booth, David A; Humphreys, Glyn

    2012-11-01

    Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.

  5. A face a mother could love: depression-related maternal neural responses to infant emotion faces.

    Science.gov (United States)

    Laurent, Heidemarie K; Ablow, Jennifer C

    2013-01-01

    Depressed mothers show negatively biased responses to their infants' emotional bids, perhaps due to faulty processing of infant cues. This study is the first to examine depression-related differences in mothers' neural response to their own infant's emotion faces, considering both effects of perinatal depression history and current depressive symptoms. Primiparous mothers (n = 22), half of whom had a history of major depressive episodes (with one episode occurring during pregnancy and/or postpartum), were exposed to images of their own and unfamiliar infants' joy and distress faces during functional neuroimaging. Group differences (depression vs. no-depression) and continuous effects of current depressive symptoms were tested in relation to neural response to own infant emotion faces. Compared to mothers with no psychiatric diagnoses, those with depression showed blunted responses to their own infant's distress faces in the dorsal anterior cingulate cortex. Mothers with higher levels of current symptomatology showed reduced responses to their own infant's joy faces in the orbitofrontal cortex and insula. Current symptomatology also predicted lower responses to own infant joy-distress in left-sided prefrontal and insula/striatal regions. These deficits in self-regulatory and motivational response circuits may help explain parenting difficulties in depressed mothers.

  6. Perceived duration of emotional events: evidence for a positivity effect in older adults.

    Science.gov (United States)

    Nicol, Jeffrey R; Tanner, Jessica; Clarke, Kelly

    2013-01-01

    BACKGROUND/STUDY CONTEXT: Arousal and negative affect modulate the effect of emotion on the subjective experience of the passage of time. Given that older adults are less aroused by negative emotional stimuli, and report lower levels of negative affect, compared with younger adults, the present study examined whether the effect of emotion on time perception differed in older and younger adults. Participants performed a temporal bisection task for emotional (i.e., angry, sad, happy) and neutral facial expressions presented at varying temporal intervals. Older adults perceived the duration of both positive and threatening events as longer than that of neutral events, whereas younger adults perceived only threatening events as longer than neutral events. The results, which are partially consistent with the positivity effect of aging postulated by socioemotional selectivity theory, are the first to show how the effect of emotion on perceived duration operates in older adults, and they support previous research indicating that only threatening events prolong perceived duration in younger adults.
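    In a temporal bisection task, the key summary statistic is the bisection point, i.e. the duration judged "long" on half of the trials. The sketch below reads it off made-up proportion-"long" data by linear interpolation; the anchor durations and proportions are hypothetical, and this is not the authors' analysis.

```python
# Bisection point (duration judged "long" on 50% of trials) by interpolation.
# Made-up durations and proportions; illustration only.
import numpy as np

durations = np.array([200, 300, 400, 500, 600, 700, 800])      # ms (hypothetical)
p_long = np.array([0.04, 0.12, 0.30, 0.55, 0.78, 0.91, 0.97])  # proportion "long"

# p_long is monotonically increasing, so we can interpolate duration at p = 0.5.
bisection_point = np.interp(0.5, p_long, durations)
print(f"bisection point ~= {bisection_point:.0f} ms")
```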

  7. Parental autonomy granting and child perceived control: effects on the everyday emotional experience of anxious youth.

    Science.gov (United States)

    Benoit Allen, Kristy; Silk, Jennifer S; Meller, Suzanne; Tan, Patricia Z; Ladouceur, Cecile D; Sheeber, Lisa B; Forbes, Erika E; Dahl, Ronald E; Siegle, Greg J; McMakin, Dana L; Ryan, Neal D

    2016-07-01

    Childhood anxiety is associated with low levels of parental autonomy granting and child perceived control, elevated child emotional reactivity and deficits in child emotion regulation. In early childhood, low levels of parental autonomy granting are thought to decrease child perceived control, which in turn leads to increases in child negative emotion. Later in development, perceived control may become a more stable, trait-like characteristic that amplifies the relationship between parental autonomy granting and child negative emotion. The purpose of this study was to test mediation and moderation models linking parental autonomy granting and child perceived control with child emotional reactivity and emotion regulation in anxious youth. Clinically anxious youth (N = 106) and their primary caregivers were assessed prior to beginning treatment. Children were administered a structured diagnostic interview and participated in a parent-child interaction task that was behaviorally coded for parental autonomy granting. Children completed an ecological momentary assessment protocol during which they reported on perceived control, emotional reactivity (anxiety and physiological arousal) and emotion regulation strategy use in response to daily negative life events. The relationship between parental autonomy granting and both child emotional reactivity and emotion regulation strategy use was moderated by child perceived control: the highest levels of self-reported physiological responding and the lowest levels of acceptance in response to negative events occurred in children low in perceived control with parents high in autonomy granting. Evidence for a mediational model was not found. In addition, child perceived control over negative life events was related to less anxious reactivity and greater use of both problem solving and cognitive restructuring as emotion regulation strategies. Both parental autonomy granting and child perceived control play important roles in the

  8. The role of family expressed emotion and perceived social support in predicting addiction relapse.

    Science.gov (United States)

    Atadokht, Akbar; Hajloo, Nader; Karimi, Masoud; Narimani, Mohammad

    2015-03-01

    Emotional conditions governing the family and patients' perceived social support play important roles in the treatment or relapse process of the chronic disease. The current study aimed to investigate the role of family expressed emotion and perceived social support in prediction of addiction relapse. The descriptive-correlation method was used in the current study. The study population consisted of the individuals referred to the addiction treatment centers in Ardabil from October 2013 to January 2014. The subjects (n = 80) were randomly selected using cluster sampling method. To collect data, expressed emotion test by Cole and Kazaryan, and Multidimensional Scale of Perceived Social Support (MSPSS) were used, and the obtained data was analyzed using the Pearson's correlation coefficient and multiple regression analyses. Results showed a positive relationship between family expressed emotions and the frequency of relapse (r = 0.26, P = 0.011) and a significant negative relationship between perceived social support and the frequency of relapse (r = -0.34, P = 0.001). Multiple regression analysis also showed that perceived social support from family and the family expressed emotions significantly explained 12% of the total variance of relapse frequency. These results have implications for addicted people, their families and professionals working in addiction centers to use the emotional potential of families especially their expressed emotions and the perceived social support of addicts to increase the success rate of addiction treatment.
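    The regression step reported above, two predictors jointly explaining a share of the variance in relapse frequency, has the generic form sketched below. The data and coefficients are synthetic placeholders, not the study's data.

```python
# Two-predictor multiple regression and its R-squared (synthetic data).
import numpy as np

rng = np.random.default_rng(4)
n = 80
expressed_emotion = rng.normal(size=n)
perceived_support = rng.normal(size=n)
relapse_freq = 0.3 * expressed_emotion - 0.4 * perceived_support + rng.normal(size=n)

# Design matrix with intercept, then ordinary least squares.
X = np.column_stack([np.ones(n), expressed_emotion, perceived_support])
beta, *_ = np.linalg.lstsq(X, relapse_freq, rcond=None)

residuals = relapse_freq - X @ beta
r_squared = 1 - residuals.var() / relapse_freq.var()
print(f"intercept and betas: {beta.round(2)}, R^2 = {r_squared:.2f}")
```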

  9. Mere social categorization modulates identification of facial expressions of emotion.

    Science.gov (United States)

    Young, Steven G; Hugenberg, Kurt

    2010-12-01

    The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on this literature and extends this work. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion-identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  10. Congruence of happy and sad emotion in music and faces modifies cortical audiovisual activation.

    Science.gov (United States)

    Jeong, Jeong-Won; Diwadkar, Vaibhav A; Chugani, Carla D; Sinsoongsud, Piti; Muzik, Otto; Behen, Michael E; Chugani, Harry T; Chugani, Diane C

    2011-02-14

    The powerful emotion inducing properties of music are well-known, yet music may convey differing emotional responses depending on environmental factors. We hypothesized that neural mechanisms involved in listening to music may differ when presented together with visual stimuli that conveyed the same emotion as the music when compared to visual stimuli with incongruent emotional content. We designed this study to determine the effect of auditory (happy and sad instrumental music) and visual stimuli (happy and sad faces) congruent or incongruent for emotional content on audiovisual processing using fMRI blood oxygenation level-dependent (BOLD) signal contrast. The experiment was conducted in the context of a conventional block-design experiment. A block consisted of three emotional ON periods, music alone (happy or sad music), face alone (happy or sad faces), and music combined with faces where the music excerpt was played while presenting either congruent emotional faces or incongruent emotional faces. We found activity in the superior temporal gyrus (STG) and fusiform gyrus (FG) to be differentially modulated by music and faces depending on the congruence of emotional content. There was a greater BOLD response in STG when the emotion signaled by the music and faces was congruent. Furthermore, the magnitude of these changes differed for happy congruence and sad congruence, i.e., the activation of STG when happy music was presented with happy faces was greater than the activation seen when sad music was presented with sad faces. In contrast, incongruent stimuli diminished the BOLD response in STG and elicited greater signal change in bilateral FG. Behavioral testing supplemented these findings by showing that subject ratings of emotion in faces were influenced by emotion in music. When presented with happy music, happy faces were rated as more happy (p=0.051) and sad faces were rated as less sad (p=0.030). When presented with sad music, happy faces were rated as less
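    Block-design analyses of the kind described above reduce to convolving condition boxcars with a haemodynamic response function (HRF) and testing a contrast between the resulting regressors in a general linear model. The sketch below does this for a single synthetic voxel; the timings, HRF shape and effect sizes are arbitrary illustrations, not the authors' pipeline.

```python
# Block-design GLM with a congruent > incongruent contrast on one synthetic voxel.
import numpy as np
from scipy.stats import gamma

tr, n_scans = 2.0, 150
t = np.arange(n_scans) * tr
hrf = gamma.pdf(np.arange(0.0, 30.0, tr), a=6)   # crude canonical-like HRF

def regressor(onsets, duration):
    """Boxcar for the given block onsets (s) convolved with the HRF."""
    box = np.zeros(n_scans)
    for onset in onsets:
        box[(t >= onset) & (t < onset + duration)] = 1.0
    return np.convolve(box, hrf)[:n_scans]

congruent = regressor(onsets=[20, 120, 220], duration=20)
incongruent = regressor(onsets=[70, 170, 270], duration=20)

rng = np.random.default_rng(5)
voxel = 1.5 * congruent + 0.5 * incongruent + rng.normal(0, 0.3, n_scans)

X = np.column_stack([np.ones(n_scans), congruent, incongruent])
beta, *_ = np.linalg.lstsq(X, voxel, rcond=None)
contrast = np.array([0.0, 1.0, -1.0])
print(f"congruent > incongruent contrast estimate: {contrast @ beta:.2f}")
```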

  11. Behavioural and neurophysiological evidence for face identity and face emotion processing in animals

    Science.gov (United States)

    Tate, Andrew J; Fischer, Hanno; Leigh, Andrea E; Kendrick, Keith M

    2006-01-01

    Visual cues from faces provide important social information relating to individual identity, sexual attraction and emotional state. Behavioural and neurophysiological studies on both monkeys and sheep have shown that specialized skills and neural systems for processing these complex cues to guide behaviour have evolved in a number of mammals and are not present exclusively in humans. Indeed, there are remarkable similarities in the ways that faces are processed by the brain in humans and other mammalian species. While human studies with brain imaging and gross neurophysiological recording approaches have revealed global aspects of the face-processing network, they cannot investigate how information is encoded by specific neural networks. Single neuron electrophysiological recording approaches in both monkeys and sheep have, however, provided some insights into the neural encoding principles involved and, particularly, the presence of a remarkable degree of high-level encoding even at the level of a specific face. Recent developments that allow simultaneous recordings to be made from many hundreds of individual neurons are also beginning to reveal evidence for global aspects of a population-based code. This review will summarize what we have learned so far from these animal-based studies about the way the mammalian brain processes the faces and the emotions they can communicate, as well as associated capacities such as how identity and emotion cues are dissociated and how face imagery might be generated. It will also try to highlight what questions and advances in knowledge still challenge us in order to provide a complete understanding of just how brain networks perform this complex and important social recognition task. PMID:17118930

  13. Corrigendum: Cultural Relativity in Perceiving Emotion From Vocalizations.

    Science.gov (United States)

    2014-12-01

    Gendron, M., Roberson, D., van der Vyver, J. M., & Barrett, L. F. (2014). Cultural relativity in perceiving emotion from vocalizations. Psychological Science, 25, 911-920. (Original DOI: 10.1177/0956797613517239 ).

  14. Influence of Emotional Facial Expressions on 3-5-Year-Olds' Face Recognition

    Science.gov (United States)

    Freitag, Claudia; Schwarzer, Gudrun

    2011-01-01

    Three experiments examined 3- and 5-year-olds' recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression…

  15. Faces in context: A review and systematization of contextual influences on affective face processing

    Directory of Open Access Journals (Sweden)

    Matthias J Wieser

    2012-11-01

    Full Text Available Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant basic emotion approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, decontextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in

  16. Children's understanding of facial expression of emotion: II. Drawing of emotion-faces.

    Science.gov (United States)

    Missaghi-Lakshman, M; Whissell, C

    1991-06-01

    67 children from Grades 2, 4, and 7 drew faces representing the emotional expressions of fear, anger, surprise, disgust, happiness, and sadness. The children themselves and 29 adults later decoded the drawings in an emotion-recognition task. Children were the more accurate decoders, and their accuracy and the accuracy of adults increased significantly for judgments of 7th-grade drawings. The emotions happy and sad were most accurately decoded. There were no significant differences associated with sex. In their drawings, children utilized a symbol system that seems to be based on a highlighting or exaggeration of features of the innately governed facial expression of emotion.

  17. Visual attention to emotional face in schizophrenia: an eye tracking study.

    Directory of Open Access Journals (Sweden)

    Mania Asgharpour

    2015-03-01

    Full Text Available Deficits in the processing of facial emotions have been reported extensively in patients with schizophrenia. To explore whether restricted attention is the cause of impaired emotion processing in these patients, we examined visual attention by tracking eye movements in response to emotional and neutral face stimuli in a group of patients with schizophrenia and healthy individuals. We also examined the correlation between visual attention allocation and symptom severity in our patient group. Thirty adult patients with schizophrenia and 30 matched healthy controls participated in this study. Visual attention data were recorded while participants passively viewed emotional-neutral face pairs for 500 ms. The relationship between visual attention and symptom severity was assessed with the Positive and Negative Syndrome Scale (PANSS) in the schizophrenia group. Repeated-measures ANOVAs were used to compare the groups. Comparing the number of fixations made during face-pair presentation, we found that patients with schizophrenia made fewer fixations on faces, regardless of the expression of the face. Analysis of the number of fixations on negative-neutral pairs also revealed that the patients made fewer fixations on both neutral and negative faces. Analysis of the number of fixations on positive-neutral pairs only showed more fixations on positive relative to neutral expressions in both groups. We found no correlations between the pattern of visual attention to faces and symptom severity in patients with schizophrenia. The results of this study suggest that the facial recognition deficit in schizophrenia is related to decreased attention to face stimuli. The finding of no difference in visual attention for positive-neutral face pairs between the groups is in line with studies that have shown an enhanced ability to perceive positive emotion in these patients.

  18. Do Characteristics of Faces That Convey Trustworthiness and Dominance Underlie Perceptions of Criminality?

    Science.gov (United States)

    Flowe, Heather D.

    2012-01-01

    Background This study tested whether the 2D face evaluation model proposed by Oosterhof and Todorov can parsimoniously account for why some faces are perceived as more criminal-looking than others. The 2D model proposes that trust and dominance are spontaneously evaluated from features of faces. These evaluations have adaptive significance from an evolutionary standpoint because they indicate whether someone should be approached or avoided. Method Participants rated the emotional state, personality traits, and criminal appearance of faces shown in photographs. The photographs were of males and females taken under naturalistic conditions (i.e., police mugshots) and highly controlled conditions. In the controlled photographs, the emotion display of the actor was systematically varied (happy expression, emotionally neutral expression, or angry expression). Results Both male and female faces rated high in criminal appearance were perceived as less trustworthy and more dominant in police mugshots as well as in photographs taken under highly controlled conditions. Additionally, emotionally neutral faces were deemed as less trustworthy if they were perceived as angry, and more dominant if they were morphologically mature. Systematically varying emotion displays also affected criminality ratings, with angry faces perceived as the most criminal, followed by neutral faces and then happy faces. Conclusion The 2D model parsimoniously accounts for criminality perceptions. This study extends past research by demonstrating that morphological features that signal high dominance and low trustworthiness can also signal high criminality. Spontaneous evaluations regarding criminal propensity may have adaptive value in that they may help us to avoid someone who is physically threatening. On the other hand, such evaluations could inappropriately influence decision making in criminal identification lineups. Hence, additional research is needed to discover whether and how people can avoid

  20. Do characteristics of faces that convey trustworthiness and dominance underlie perceptions of criminality?

    Directory of Open Access Journals (Sweden)

    Heather D Flowe

    Full Text Available BACKGROUND: This study tested whether the 2D face evaluation model proposed by Oosterhof and Todorov can parsimoniously account for why some faces are perceived as more criminal-looking than others. The 2D model proposes that trust and dominance are spontaneously evaluated from features of faces. These evaluations have adaptive significance from an evolutionary standpoint because they indicate whether someone should be approached or avoided. METHOD: Participants rated the emotional state, personality traits, and criminal appearance of faces shown in photographs. The photographs were of males and females taken under naturalistic conditions (i.e., police mugshots) and highly controlled conditions. In the controlled photographs, the emotion display of the actor was systematically varied (happy expression, emotionally neutral expression, or angry expression). RESULTS: Both male and female faces rated high in criminal appearance were perceived as less trustworthy and more dominant in police mugshots as well as in photographs taken under highly controlled conditions. Additionally, emotionally neutral faces were deemed as less trustworthy if they were perceived as angry, and more dominant if they were morphologically mature. Systematically varying emotion displays also affected criminality ratings, with angry faces perceived as the most criminal, followed by neutral faces and then happy faces. CONCLUSION: The 2D model parsimoniously accounts for criminality perceptions. This study extends past research by demonstrating that morphological features that signal high dominance and low trustworthiness can also signal high criminality. Spontaneous evaluations regarding criminal propensity may have adaptive value in that they may help us to avoid someone who is physically threatening. On the other hand, such evaluations could inappropriately influence decision making in criminal identification lineups. Hence, additional research is needed to discover whether

  1. How Context Influences Our Perception of Emotional Faces

    DEFF Research Database (Denmark)

    Calbi, Marta; Heimann, Katrin; Barratt, Daniel

    2017-01-01

    Facial expressions are of major importance in understanding the mental and emotional states of others. So far, most studies on the perception and comprehension of emotions have used isolated facial expressions as stimuli; for example, photographs of actors displaying facial expressions...... corresponding to one of the so called ‘basic emotions.’ However, our real experience during social interactions is different: facial expressions of emotion are mostly perceived in a wider context, constituted by body language, the surrounding environment, and our beliefs and expectations. Already in the early...... twentieth century, the Russian filmmaker Lev Kuleshov argued that such context, established by intermediate shots of strong emotional content, could significantly change our interpretation of facial expressions in film. Prior experiments have shown behavioral effects pointing in this direction, but have...

  2. Perceived emotional intelligence, achievement motivation and school performance in conservatory students

    OpenAIRE

    Laura López Bernad

    2013-01-01

    The present research examines the relationship between Perceived Emotional Intelligence (PEI) and achievement motivation in a sample of students (N=57) belonging to the string department at the Conservatorio Profesional de Música and specializing in violin, viola, violoncello and bass violoncello. The evaluation methods were: the Trait Meta-Mood Scale (TMMS-24), to test Perceived Emotional Intelligence, and the Escala Atribucional de Motivación de Logro (EAML, Attributive Motivation Scale) to...

  3. Human sex differences in emotional processing of own-race and other-race faces.

    Science.gov (United States)

    Ran, Guangming; Chen, Xu; Pan, Yangu

    2014-06-18

    There is evidence that women and men show differences in the perception of affective facial expressions. However, none of the previous studies directly investigated sex differences in emotional processing of own-race and other-race faces. The current study addressed this issue using high time resolution event-related potential techniques. In total, data from 25 participants (13 women and 12 men) were analyzed. It was found that women showed increased N170 amplitudes to negative White faces compared with negative Chinese faces over the right hemisphere electrodes. This result suggests that women show enhanced sensitivity to other-race faces showing negative emotions (fear or disgust), which may contribute toward evolution. However, the current data showed that men had increased N170 amplitudes to happy Chinese versus happy White faces over the left hemisphere electrodes, indicating that men show enhanced sensitivity to own-race faces showing positive emotions (happiness). In this respect, men might use past pleasant emotional experiences to boost recognition of own-race faces.

  4. Power as an emotional liability: Implications for perceived authenticity and trust after a transgression.

    Science.gov (United States)

    Kim, Peter H; Mislin, Alexandra; Tuncel, Ece; Fehr, Ryan; Cheshin, Arik; van Kleef, Gerben A

    2017-10-01

    People may express a variety of emotions after committing a transgression. Through 6 empirical studies and a meta-analysis, we investigate how the perceived authenticity of such emotional displays and resulting levels of trust are shaped by the transgressor's power. Past findings suggest that individuals with power tend to be more authentic because they have more freedom to act on the basis of their own personal inclinations. Yet, our findings reveal that (a) a transgressor's display of emotion is perceived to be less authentic when that party's power is high rather than low; (b) this perception of emotional authenticity, in turn, directly influences (and mediates) the level of trust in that party; and (c) perceivers ultimately exert less effort when asked to make a case for leniency toward high rather than low-power transgressors. This tendency to discount the emotional authenticity of the powerful was found to arise from power increasing the transgressor's perceived level of emotional control and strategic motivation, rather than a host of alternative mechanisms. These results were also found across different types of emotions (sadness, anger, fear, happiness, and neutral), expressive modalities, operationalizations of the transgression, and participant populations. Altogether, our findings demonstrate that besides the wealth of benefits power can afford, it also comes with a notable downside. The findings, furthermore, extend past research on perceived emotional authenticity, which has focused on how and when specific emotions are expressed, by revealing how this perception can depend on considerations that have nothing to do with the expression itself. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces.

    Science.gov (United States)

    Guan, Lili; Zhao, Yufang; Wang, Yige; Chen, Yujie; Yang, Juan

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another's face; self-face also elicits an enhanced P3 amplitude compared to another's face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  6. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    Directory of Open Access Journals (Sweden)

    Lili Guan

    2017-08-01

    Full Text Available The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  7. Emotional competence relating to perceived stress and burnout in Spanish teachers: a mediator model

    Science.gov (United States)

    Extremera, Natalio

    2016-01-01

    This study examined direct associations between emotional competence, perceived stress and burnout in 489 Spanish teachers. In addition, a model in which perceived stress mediated pathways linking emotional competence to teacher burnout symptoms was also examined. Results showed that emotional competence and stress were significantly correlated with teacher burnout symptoms in the expected direction. Moreover, mediational analysis indicated that perceived stress partly mediated the relationship between emotional competence and the three dimensions of burnout even when controlling for salient background characteristics. These findings suggest an underlying process by which high emotional competence may increase the capacity to cope with symptoms of burnout, by reducing the experience of stress. Implications of these findings for future research and for working with teachers to prevent burnout are discussed. PMID:27280077

  8. Emotional competence relating to perceived stress and burnout in Spanish teachers: a mediator model.

    Science.gov (United States)

    Rey, Lourdes; Extremera, Natalio; Pena, Mario

    2016-01-01

    This study examined direct associations between emotional competence, perceived stress and burnout in 489 Spanish teachers. In addition, a model in which perceived stress mediated pathways linking emotional competence to teacher burnout symptoms was also examined. Results showed that emotional competence and stress were significantly correlated with teacher burnout symptoms in the expected direction. Moreover, mediational analysis indicated that perceived stress partly mediated the relationship between emotional competence and the three dimensions of burnout even when controlling for salient background characteristics. These findings suggest an underlying process by which high emotional competence may increase the capacity to cope with symptoms of burnout, by reducing the experience of stress. Implications of these findings for future research and for working with teachers to prevent burnout are discussed.

  9. Threat advantage: perception of angry and happy dynamic faces across cultures.

    Science.gov (United States)

    Marinetti, Claudia; Mesquita, Batja; Yik, Michelle; Cragwall, Caroline; Gallagher, Ashleigh H

    2012-01-01

    The current study tested whether the perception of angry faces is cross-culturally privileged over that of happy faces, by comparing perception of the offset of emotion in a dynamic flow of expressions. Thirty Chinese and 30 European-American participants saw movies that morphed an anger expression into a happy expression of the same stimulus person, or vice versa. Participants were asked to stop the movie at the point where they ceased seeing the initial emotion. As expected, participants cross-culturally continued to perceive anger longer than happiness. Moreover, anger was perceived longer in in-group than in out-group faces. The effects were driven by female rather than male targets. Results are discussed with reference to the important role of context in emotion perception.

  10. Detection of Emotional Faces: Salient Physical Features Guide Effective Visual Search

    Science.gov (United States)

    Calvo, Manuel G.; Nummenmaa, Lauri

    2008-01-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent,…

  11. Recognition memory for low- and high-frequency-filtered emotional faces: Low spatial frequencies drive emotional memory enhancement, whereas high spatial frequencies drive the emotion-induced recognition bias.

    Science.gov (United States)

    Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk

    2017-07-01

    This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement-that is, better long-term memory for emotional than for neutral stimuli-and the emotion-induced recognition bias-that is, a more liberal response criterion for emotional than for neutral stimuli. Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role for the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus features account. The double dissociation in the results favors the latter account-that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.
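
    The high- and low-spatial-frequency stimuli described in this record are typically produced by low-pass filtering each face image and taking the residual as the high-pass version. Below is a minimal Python sketch of that idea; the Gaussian filter and the sigma cutoff are illustrative assumptions rather than the study's actual filtering parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_spatial_frequencies(face_img, sigma=6.0):
    """Split a grayscale face image (2-D array) into low- (LSF) and
    high- (HSF) spatial-frequency versions. `sigma` (in pixels) is an
    illustrative cutoff, not the filter setting used in the study."""
    img = np.asarray(face_img, dtype=float)
    lsf = gaussian_filter(img, sigma=sigma)   # keep only coarse structure
    hsf = img - lsf                           # residual fine detail
    hsf_display = hsf + img.mean()            # re-centre HSF for display
    return lsf, hsf_display

# Toy example with random pixels standing in for a face photograph.
rng = np.random.default_rng(0)
lsf, hsf = split_spatial_frequencies(rng.uniform(0, 255, size=(256, 256)))
print(lsf.shape, hsf.shape)
```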

  12. Is perceived emotional support beneficial? Well-being and health in independent and interdependent cultures.

    Science.gov (United States)

    Uchida, Yukiko; Kitayama, Shinobu; Mesquita, Batja; Reyes, Jose Alberto S; Morling, Beth

    2008-06-01

    Previous studies show there is little or no association between perceived emotional support and well-being in European American culture. The authors hypothesized that this paradoxical absence of any benefit of perceived support is unique to cultural contexts that privilege independence rather than interdependence of the self. Study 1 tested college students and found, as predicted, that among Euro-Americans a positive effect of perceived emotional support on subjective well-being (positive affect) was weak and, moreover, it disappeared entirely once self-esteem was statistically controlled. In contrast, among Asians in Asia (Japanese and Filipinos) perceived emotional support positively predicted subjective well-being even after self-esteem was controlled. Study 2 extended Study 1 by testing both Japanese and American adults in midlife with respect to multiple indicators of well-being and physical health. Overall, the evidence underscores the central significance of culture as a moderator of the effectiveness of perceived emotional support.

  13. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    Science.gov (United States)

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability of identifying, processing and regulating emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and the SMC activity during social behavior is region- and emotion-specific.

  14. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study.

    Science.gov (United States)

    Zhishuai, Jin; Hong, Liu; Daxing, Wu; Pin, Zhang; Xuejing, Lu

    2017-01-01

    Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  15. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study

    Directory of Open Access Journals (Sweden)

    Jin Zhishuai

    2017-01-01

    Full Text Available Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  16. Chinese Adolescents' Emotional Intelligence, Perceived Social Support, and Resilience

    Science.gov (United States)

    Chen, Shitao

    2017-01-01

    The constructs of emotional intelligence, perceived social support and resilience have been primarily developed in a Western, individual-oriented societal context. The application of these constructs in Eastern cultures requires further investigation. The aim of the study was to examine the relationships among trait emotional intelligence,…

  17. The electrophysiological effects of the serotonin 1A receptor agonist buspirone in emotional face processing.

    Science.gov (United States)

    Bernasconi, Fosco; Kometer, Michael; Pokorny, Thomas; Seifritz, Erich; Vollenweider, Franz X

    2015-04-01

    Emotional face processing is critically modulated by the serotonergic system, and serotonin (5-HT) receptor agonists impair emotional face processing. However, the specific contribution of the 5-HT1A receptor remains poorly understood. Here we investigated the spatiotemporal brain mechanisms underpinning the modulation of emotional face processing induced by buspirone, a partial 5-HT1A receptor agonist. In a psychophysical emotional face discrimination task, we observed that discrimination of fearful versus neutral faces was reduced, but not of happy versus neutral faces. Electrical neuroimaging analyses were applied to visual evoked potentials elicited by emotional face images, after placebo and buspirone administration. Buspirone modulated response strength (i.e., global field power) in the interval 230-248 ms after stimulus onset. Distributed source estimation over this time interval revealed that buspirone decreased the neural activity in the right dorsolateral prefrontal cortex that was evoked by fearful faces. These results indicate temporal and valence-specific effects of buspirone on the neuronal correlates of emotional face processing. Furthermore, the reduced neural activity in the dorsolateral prefrontal cortex in response to fearful faces suggests a reduced attention to fearful faces. Collectively, these findings provide new insights into the role of 5-HT1A receptors in emotional face processing and have implications for affective disorders that are characterized by an increased attention to negative stimuli. Copyright © 2015 Elsevier B.V. and ECNP. All rights reserved.
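
    The "response strength" measure referred to above is global field power (GFP), the standard deviation of the evoked potential across electrodes at each time point. A minimal sketch follows; the 64-channel montage, 1000 Hz sampling rate and 100 ms baseline are hypothetical choices for illustration, not details taken from the study.

```python
import numpy as np

def global_field_power(evoked):
    """Global field power: the standard deviation across electrodes at each
    time point. `evoked` has shape (n_electrodes, n_samples)."""
    return np.std(evoked, axis=0)

# Hypothetical evoked response: 64 channels, 700 samples at 1000 Hz,
# with a 100 ms pre-stimulus baseline (samples 0-99).
rng = np.random.default_rng(1)
evoked = rng.normal(size=(64, 700))
gfp = global_field_power(evoked)

# Peak GFP in the 230-248 ms post-stimulus window reported in the abstract.
window = slice(100 + 230, 100 + 249)
print("peak GFP (a.u.):", gfp[window].max())
```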

  18. Emotional energy, work self-efficacy, and perceived similarity during the Mars 520 study.

    Science.gov (United States)

    Solcová, Iva; Gushin, Vadim; Vinokhodova, Alla; Lukavský, Jirí

    2013-11-01

    The objective of the present research was to study the dynamics of changes in emotional energy, work self-efficacy and perceived similarity in the crew of the Mars 520 experimental study. The study comprised six volunteers, all men, between 27-38 yr of age (M = 32.16; SD = 4.99). The Mars 520 experimental study simulated all the elements of the proposed Mars mission that could be ground simulated, i.e., traveling to Mars, orbiting it, landing, and returning to Earth. During the simulation, measures of emotional energy, work self-efficacy, and perceived similarity were repeated every month. The data were analyzed using linear mixed effect models. Emotional energy, work self-efficacy, and perceived similarity gradually increased in the course of the simulation. There was no evidence for a so-called third quarter phenomenon (the most strenuous period of group isolation, psychologically, emotionally, and socially) in our data. On the contrary, work self-efficacy, emotional energy, and group cohesion (indexed here by the subject's perceived similarity to others) increased significantly in the course of the simulation, with the latter two variables showing positive growth in the group functioning.
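
    The monthly repeated measures in this design are naturally analyzed with a linear mixed effects model that includes a random intercept per crew member. The sketch below shows one such formulation with statsmodels; the variable names, the 17 monthly time points and the simulated values are assumptions for illustration, not the authors' data or exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per crew member per monthly measurement.
rng = np.random.default_rng(2)
n_subjects, n_months = 6, 17
df = pd.DataFrame({
    "subject": np.repeat([f"s{i}" for i in range(n_subjects)], n_months),
    "month": np.tile(np.arange(n_months), n_subjects),
})
# Simulated outcome with a positive linear trend over mission time.
df["emotional_energy"] = 50 + 0.8 * df["month"] + rng.normal(0, 5, len(df))

# Fixed effect of mission time, random intercept for each crew member.
model = smf.mixedlm("emotional_energy ~ month", df, groups=df["subject"])
print(model.fit().summary())
```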

  19. Emotional competence relating to perceived stress and burnout in Spanish teachers: a mediator model

    Directory of Open Access Journals (Sweden)

    Lourdes Rey

    2016-05-01

    Full Text Available This study examined direct associations between emotional competence, perceived stress and burnout in 489 Spanish teachers. In addition, a model in which perceived stress mediated pathways linking emotional competence to teacher burnout symptoms was also examined. Results showed that emotional competence and stress were significantly correlated with teacher burnout symptoms in the expected direction. Moreover, mediational analysis indicated that perceived stress partly mediated the relationship between emotional competence and the three dimensions of burnout even when controlling for salient background characteristics. These findings suggest an underlying process by which high emotional competence may increase the capacity to cope with symptoms of burnout, by reducing the experience of stress. Implications of these findings for future research and for working with teachers to prevent burnout are discussed.

  20. Dimensions of Emotional Intelligence and Online Gaming Addiction in Adolescence: The Indirect Effects of Two Facets of Perceived Stress.

    Science.gov (United States)

    Che, Dexin; Hu, Jianping; Zhen, Shuangju; Yu, Chengfu; Li, Bin; Chang, Xi; Zhang, Wei

    2017-01-01

    This study tested a parallel two-mediator model in which the relationship between dimensions of emotional intelligence and online gaming addiction are mediated by perceived helplessness and perceived self-efficacy, respectively. The sample included 931 male adolescents (mean age = 16.18 years, SD = 0.95) from southern China. Data on emotional intelligence (four dimensions, including self-management of emotion, social skills, empathy and utilization of emotions), perceived stress (two facets, including perceived self-efficacy and perceived helplessness) and online gaming addiction were collected, and bootstrap methods were used to test this parallel two-mediator model. Our findings revealed that perceived self-efficacy mediated the relationship between three dimensions of emotional intelligence (i.e., self-management, social skills, and empathy) and online gaming addiction, and perceived helplessness mediated the relationship between two dimensions of emotional intelligence (i.e., self-management and emotion utilization) and online gaming addiction. These findings underscore the importance of separating the four dimensions of emotional intelligence and two facets of perceived stress to understand the complex relationship between these factors and online gaming addiction.
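
    A parallel two-mediator model of this kind can be estimated with ordinary least squares regressions and percentile-bootstrap confidence intervals for the two indirect effects. The following sketch illustrates that logic in plain NumPy; it is not the macro the authors used, and the variable roles are hypothetical.

```python
import numpy as np

def parallel_two_mediator_bootstrap(x, m1, m2, y, n_boot=5000, seed=0):
    """Point estimates and 95% percentile bootstrap CIs for the two indirect
    effects in a parallel two-mediator model (x -> m1 -> y and x -> m2 -> y).
    A bare-bones sketch, not the SPSS PROCESS implementation."""
    rng = np.random.default_rng(seed)
    data = np.column_stack([x, m1, m2, y])
    n = data.shape[0]

    def indirect_effects(d):
        X, M1, M2, Y = d[:, 0], d[:, 1], d[:, 2], d[:, 3]
        ones = np.ones(len(X))
        # a-paths: each mediator regressed on the predictor.
        a1 = np.linalg.lstsq(np.column_stack([ones, X]), M1, rcond=None)[0][1]
        a2 = np.linalg.lstsq(np.column_stack([ones, X]), M2, rcond=None)[0][1]
        # b-paths: outcome regressed on the predictor and both mediators.
        b = np.linalg.lstsq(np.column_stack([ones, X, M1, M2]), Y, rcond=None)[0]
        return a1 * b[2], a2 * b[3]

    estimate = indirect_effects(data)
    boots = np.array([indirect_effects(data[rng.integers(0, n, n)])
                      for _ in range(n_boot)])
    ci = np.percentile(boots, [2.5, 97.5], axis=0)   # columns = the two effects
    return estimate, ci

# Simulated example in which only the x -> m1 -> y path is real.
rng = np.random.default_rng(3)
x = rng.normal(size=300)
m1 = 0.5 * x + rng.normal(size=300)
m2 = rng.normal(size=300)
y = 0.4 * m1 + 0.1 * m2 + rng.normal(size=300)
estimate, ci = parallel_two_mediator_bootstrap(x, m1, m2, y, n_boot=2000)
print("indirect effects:", estimate)
print("95% CIs (lower/upper rows):", ci)
```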

  1. Dimensions of Emotional Intelligence and Online Gaming Addiction in Adolescence: The Indirect Effects of Two Facets of Perceived Stress

    Directory of Open Access Journals (Sweden)

    Dexin Che

    2017-07-01

    Full Text Available This study tested a parallel two-mediator model in which the relationship between dimensions of emotional intelligence and online gaming addiction are mediated by perceived helplessness and perceived self-efficacy, respectively. The sample included 931 male adolescents (mean age = 16.18 years, SD = 0.95) from southern China. Data on emotional intelligence (four dimensions, including self-management of emotion, social skills, empathy and utilization of emotions), perceived stress (two facets, including perceived self-efficacy and perceived helplessness) and online gaming addiction were collected, and bootstrap methods were used to test this parallel two-mediator model. Our findings revealed that perceived self-efficacy mediated the relationship between three dimensions of emotional intelligence (i.e., self-management, social skills, and empathy) and online gaming addiction, and perceived helplessness mediated the relationship between two dimensions of emotional intelligence (i.e., self-management and emotion utilization) and online gaming addiction. These findings underscore the importance of separating the four dimensions of emotional intelligence and two facets of perceived stress to understand the complex relationship between these factors and online gaming addiction.

  2. Dimensions of Emotional Intelligence and Online Gaming Addiction in Adolescence: The Indirect Effects of Two Facets of Perceived Stress

    Science.gov (United States)

    Che, Dexin; Hu, Jianping; Zhen, Shuangju; Yu, Chengfu; Li, Bin; Chang, Xi; Zhang, Wei

    2017-01-01

    This study tested a parallel two-mediator model in which the relationship between dimensions of emotional intelligence and online gaming addiction are mediated by perceived helplessness and perceived self-efficacy, respectively. The sample included 931 male adolescents (mean age = 16.18 years, SD = 0.95) from southern China. Data on emotional intelligence (four dimensions, including self-management of emotion, social skills, empathy and utilization of emotions), perceived stress (two facets, including perceived self-efficacy and perceived helplessness) and online gaming addiction were collected, and bootstrap methods were used to test this parallel two-mediator model. Our findings revealed that perceived self-efficacy mediated the relationship between three dimensions of emotional intelligence (i.e., self-management, social skills, and empathy) and online gaming addiction, and perceived helplessness mediated the relationship between two dimensions of emotional intelligence (i.e., self-management and emotion utilization) and online gaming addiction. These findings underscore the importance of separating the four dimensions of emotional intelligence and two facets of perceived stress to understand the complex relationship between these factors and online gaming addiction. PMID:28751876

  3. Buying Impulsive Trait: An effective moderator for shopping emotions and perceived risk

    OpenAIRE

    Sinha, Piyush Kumar; Mishra, Hari Govind; Kaul, Surabhi; Singh, Sarabjot

    2014-01-01

    The study provides evidence of the relationship between buying traits, perceived risk, and buying emotions. It also indicates that the three emotional states of arousal, pleasure, and dominance have a significant relationship with impulsive buying behavior. Arousal, which was associated with buying intentions and impulsive buying, was not significant in the moderating regression results. Buying impulsive trait was found to be a significant moderator of pleasure, dominance, perceived risk an...

  4. Perceived maternal autonomy-support and early adolescent emotion regulation: a longitudinal study

    OpenAIRE

    Brenning, Katrijn; Soenens, Bart; Van Petegem, Stijn; Vansteenkiste, Maarten

    2015-01-01

    This study investigated longitudinal associations between perceived maternal autonomy-supportive parenting and early adolescents' use of three emotion regulation (ER) styles: emotional integration, suppressive regulation, and dysregulation. We tested whether perceived maternal autonomy support predicted changes in ER and whether these ER styles, in turn, related to changes in adjustment (i.e., depressive symptoms, self-esteem). Participants (N= 311, mean age at Time 1 = 12.04) reported on per...

  5. Asymmetric Engagement of Amygdala and Its Gamma Connectivity in Early Emotional Face Processing

    Science.gov (United States)

    Liu, Tai-Ying; Chen, Yong-Sheng; Hsieh, Jen-Chuen; Chen, Li-Fen

    2015-01-01

    The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (with duration around 20–30 ms), whereas activity in the left amygdala (with duration around 50–60 ms) was sustained longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions and the left amygdala might be involved in decoding or evaluating expressive faces in the early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, reflecting its evolutionary significance (survival). These findings demonstrate the asymmetric engagement of bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry. PMID:25629899

  6. Perceived hunger mediates the relationship between attachment anxiety and emotional eating.

    Science.gov (United States)

    Alexander, Katherine E; Siegel, Harold I

    2013-08-01

    Eating is an inherently emotional activity and the attachment system is an emotion regulation system. Individuals with attachment insecurity have less interoceptive awareness and difficulty regulating emotion. Insecurely attached individuals may eat emotionally because they misinterpret internal hunger cues, (i.e. think they are hungry when they are experiencing some other internal, attachment-related state). The current study found a positive association between attachment anxiety and emotional eating. This relationship was mediated by perceived hunger. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. The Impact of Top-Down Prediction on Emotional Face Processing in Social Anxiety

    Directory of Open Access Journals (Sweden)

    Guangming Ran

    2017-07-01

    Full Text Available There is evidence that people with social anxiety show abnormal processing of emotional faces. To investigate the impact of top-down prediction on emotional face processing in social anxiety, brain responses of participants with high and low social anxiety (LSA) were recorded, while they performed a variation of the emotional task, using high temporal resolution event-related potential techniques. Behaviorally, we reported an effect of prediction with higher accuracy for predictable than unpredictable faces. Furthermore, we found that participants with high social anxiety (HSA), but not with LSA, recognized angry faces more accurately than happy faces. For the P100 and P200 components, HSA participants showed enhanced brain activity for angry faces compared to happy faces, suggesting a hypervigilance to angry faces. Importantly, HSA participants exhibited larger N170 amplitudes in the right hemisphere electrodes than LSA participants when they observed unpredictable angry faces, but not when the angry faces were predictable. This probably reflects the top-down prediction improving the deficiency at building a holistic face representation in HSA participants.

  8. Neural activation to emotional faces in adolescents with autism spectrum disorders.

    Science.gov (United States)

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S

    2011-03-01

    Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and striatum, three structures involved in socio-emotional processing in adolescents with ASD. Twenty-two adolescents with ASD and 20 healthy adolescents viewed facial expressions (happy, fearful, sad and neutral) that were briefly presented (250 ms) during functional MRI acquisition. To monitor attention, subjects pressed a button to identify the gender of each face. The ASD group showed greater activation to the faces relative to the control group in the amygdala, vPFC and striatum. Follow-up analyses indicated that the ASD group, relative to the control group, showed greater activation in the amygdala, vPFC and striatum during the gender identification task. When group differences in attention to facial expressions were limited, adolescents with ASD showed greater activation in structures involved in socio-emotional processing. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  9. Are Max-Specified Infant Facial Expressions during Face-to-Face Interaction Consistent with Differential Emotions Theory?

    Science.gov (United States)

    Matias, Reinaldo; Cohn, Jeffrey F.

    1993-01-01

    Examined infant facial expressions at two, four, and six months of age during face-to-face play and a still-face interaction with their mothers. Contrary to differential emotions theory, at no age did proportions or durations of discrete and blended negative expressions differ; they also showed different patterns of developmental change. (MM)

  10. Psilocybin modulates functional connectivity of the amygdala during emotional face discrimination.

    Science.gov (United States)

    Grimm, O; Kraehenmann, R; Preller, K H; Seifritz, E; Vollenweider, F X

    2018-04-24

    Recent studies suggest that the antidepressant effects of the psychedelic 5-HT2A receptor agonist psilocybin are mediated through its modulatory properties on prefrontal and limbic brain regions including the amygdala. To further investigate the effects of psilocybin on emotion processing networks, we studied for the first time psilocybin's acute effects on amygdala seed-to-voxel connectivity in an event-related face discrimination task in 18 healthy volunteers who received psilocybin and placebo in a double-blind balanced cross-over design. The amygdala has been implicated as a salience detector especially involved in the immediate response to emotional face content. We used beta-series amygdala seed-to-voxel connectivity during an emotional face discrimination task to elucidate the connectivity pattern of the amygdala over the entire brain. When we compared psilocybin to placebo, an increase in reaction time for all three categories of affective stimuli was found. Psilocybin decreased the connectivity between amygdala and the striatum during angry face discrimination. During happy face discrimination, the connectivity between the amygdala and the frontal pole was decreased. No effect was seen during discrimination of fearful faces. Thus, we show psilocybin's effect as a modulator of major connectivity hubs of the amygdala. Psilocybin decreases the connectivity between important nodes linked to emotion processing like the frontal pole or the striatum. Future studies are needed to clarify whether connectivity changes predict therapeutic effects in psychiatric patients. Copyright © 2018 Elsevier B.V. and ECNP. All rights reserved.

  11. Mixed emotions: Sensitivity to facial variance in a crowd of faces.

    Science.gov (United States)

    Haberman, Jason; Lee, Pegan; Whitney, David

    2015-01-01

    The visual system automatically represents summary information from crowds of faces, such as the average expression. This is a useful heuristic insofar as it provides critical information about the state of the world, not simply information about the state of one individual. However, the average alone is not sufficient for making decisions about how to respond to a crowd. The variance or heterogeneity of the crowd--the mixture of emotions--conveys information about the reliability of the average, essential for determining whether the average can be trusted. Despite its importance, the representation of variance within a crowd of faces has yet to be examined. This is addressed here in three experiments. In the first experiment, observers viewed a sample set of faces that varied in emotion, and then adjusted a subsequent set to match the variance of the sample set. To isolate variance as the summary statistic of interest, the average emotion of both sets was random. Results suggested that observers had information regarding crowd variance. The second experiment verified that this was indeed a uniquely high-level phenomenon, as observers were unable to derive the variance of an inverted set of faces as precisely as an upright set of faces. The third experiment replicated and extended the first two experiments using method-of-constant-stimuli. Together, these results show that the visual system is sensitive to emergent information about the emotional heterogeneity, or ambivalence, in crowds of faces.

  12. Laughter exaggerates happy and sad faces depending on visual context.

    Science.gov (United States)

    Sherman, Aleksandra; Sweeny, Timothy D; Grabowecky, Marcia; Suzuki, Satoru

    2012-04-01

    Laughter is an auditory stimulus that powerfully conveys positive emotion. We investigated how laughter influenced the visual perception of facial expressions. We presented a sound clip of laughter simultaneously with a happy, a neutral, or a sad schematic face. The emotional face was briefly presented either alone or among a crowd of neutral faces. We used a matching method to determine how laughter influenced the perceived intensity of the happy, neutral, and sad expressions. For a single face, laughter increased the perceived intensity of a happy expression. Surprisingly, for a crowd of faces, laughter produced an opposite effect, increasing the perceived intensity of a sad expression in a crowd. A follow-up experiment revealed that this contrast effect may have occurred because laughter made the neutral distractor faces appear slightly happy, thereby making the deviant sad expression stand out in contrast. A control experiment ruled out semantic mediation of the laughter effects. Our demonstration of the strong context dependence of laughter effects on facial expression perception encourages a reexamination of the previously demonstrated effects of prosody, speech content, and mood on face perception, as they may be similarly context dependent.

  13. Testing the effects of expression, intensity and age on emotional face processing in ASD.

    Science.gov (United States)

    Luyster, Rhiannon J; Bick, Johanna; Westerlund, Alissa; Nelson, Charles A

    2017-06-21

    Individuals with autism spectrum disorder (ASD) commonly show global deficits in the processing of facial emotion, including impairments in emotion recognition and slowed processing of emotional faces. Growing evidence has suggested that these challenges may increase with age, perhaps due to minimal improvement with age in individuals with ASD. In the present study, we explored the role of age, emotion type and emotion intensity in face processing for individuals with and without ASD. Twelve- and 18-22-year-old individuals with and without ASD participated. No significant diagnostic group differences were observed on behavioral measures of emotion processing for younger versus older individuals with and without ASD. However, there were significant group differences in neural responses to emotional faces. Relative to TD, at 12 years of age and during adulthood, individuals with ASD showed slower N170 to emotional faces. While the TD groups' P1 latency was significantly shorter in adults when compared to 12 year olds, there was no significant age-related difference in P1 latency among individuals with ASD. Findings point to potential differences in the maturation of cortical networks that support visual processing (whether of faces or stimuli more broadly), among individuals with and without ASD between late childhood and adulthood. Finally, associations between ERP amplitudes and behavioral responses on emotion processing tasks suggest possible neural markers for emotional and behavioral deficits among individuals with ASD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Effects of acute psychosocial stress on neural activity to emotional and neutral faces in a face recognition memory paradigm.

    Science.gov (United States)

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2014-12-01

    Previous studies have shown that acute psychosocial stress impairs recognition of declarative memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala which modulates memory processes in hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoker male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants but did not impact recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus which was due to a stress-induced increase of neural activity to fearful and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus, when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with a stress-induced privileged processing of emotional stimuli.

  15. Visual Afterimages of Emotional Faces in High Functioning Autism

    Science.gov (United States)

    Rutherford, M. D.; Troubridge, Erin K.; Walsh, Jennifer

    2012-01-01

    Fixating an emotional facial expression can create afterimages, such that subsequent faces are seen as having the opposite expression of that fixated. Visual afterimages have been used to map the relationships among emotion categories, and this method was used here to compare ASD and matched control participants. Participants adapted to a facial…

  16. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    Directory of Open Access Journals (Sweden)

    Tiziana Quarto

    Full Text Available The human ability of identifying, processing and regulating emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and the SMC activity during social behavior is region- and emotion-specific.

  17. Virtual faces expressing emotions: an initial concomitant and construct validity study.

    Science.gov (United States)

    Joyal, Christian C; Jacob, Laurence; Cigna, Marie-Hélène; Guay, Jean-Pierre; Renaud, Patrice

    2014-01-01

    Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities, both for fundamental and clinical research. For instance, virtual faces allow real-time Human-Computer retroactions between physiological measures and the virtual agent. The goal of this study was to initially assess concomitants and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomatic major and corrugator supercilii muscles), and regional gaze fixation latencies (eyes and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Emotions expressed by each set of stimuli were similarly recognized, both by men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times in eye regions for male and female participants. Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-Computer Interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

  18. Cognitive Biases for Emotional Faces in High- and Low-Trait Depressive Participants

    Directory of Open Access Journals (Sweden)

    Yi-Hsing Hsieh

    2004-10-01

    Full Text Available This study examined the association between trait depression and information-processing biases. Thirty participants were divided into high- and low-trait depressive groups based on the median of their depressive subscale scores according to the Basic Personality Inventory. Information-processing biases were measured using a deployment-of-attention task (DOAT) and a recognition memory task (RMT). For the DOAT, participants saw one emotional face paired with a neutral face of the same person, and then were forced to choose on which face the color patch had first occurred. The percentage of participants' choices favoring the happy, angry, or sad faces represented the selective attentional bias score for each emotion, respectively. For the RMT, participants rated different types of emotional faces and subsequently discriminated old faces from new faces. The memory strength for each type of face was calculated from hit and false-positive rates, based on the signal detection theory. Compared with the low-trait depressive group, the high-trait depressive group showed a negative cognitive style. This was an enhanced recognition memory for sad faces and a weakened inhibition of attending to sad faces, suggesting that those with high depressive trait may be vulnerable to interpersonal withdrawal.
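
    The "memory strength" index described above is the signal-detection sensitivity measure d', computed as the z-transformed hit rate minus the z-transformed false-alarm rate. A small sketch follows; the log-linear correction for extreme rates is an assumption, as the study may have handled hit or false-alarm rates of 0 or 1 differently.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Recognition memory strength (d') from raw response counts.
    A log-linear correction (+0.5 per cell) avoids infinite z-scores when a
    hit or false-alarm rate is exactly 0 or 1; this correction is an
    assumption, not necessarily the one used in the study."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Example: memory strength for sad faces (20 old and 20 new items).
print(d_prime(hits=15, misses=5, false_alarms=4, correct_rejections=16))
```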

  19. Memory for faces and voices varies as a function of sex and expressed emotion.

    Science.gov (United States)

    S Cortes, Diana; Laukka, Petri; Lindahl, Christina; Fischer, Håkan

    2017-01-01

    We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.

  20. Memory for faces and voices varies as a function of sex and expressed emotion.

    Directory of Open Access Journals (Sweden)

    Diana S Cortes

    Full Text Available We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.

  1. Time for a Change: College Students' Preference for Technology-Mediated Versus Face-to-Face Help for Emotional Distress.

    Science.gov (United States)

    Lungu, Anita; Sun, Michael

    2016-12-01

    Even with recent advances in psychological treatments and mobile technology, online computerized therapy is not yet popular. College students, with ubiquitous access to technology, experiencing high distress, and often nontreatment seekers, could be an important area for online treatment dissemination. Finding ways to reach out to college students by offering psychological interventions through technology, devices, and applications they often use, might increase their engagement in treatment. This study evaluates college students' reported willingness to seek help for emotional distress through novel delivery mediums, to play computer games for learning emotional coping skills, and to disclose personal information online. We also evaluated the role of ethnicity and level of emotional distress in help-seeking patterns. A survey exploring our domains of interest and the Mental Health Inventory ([MHI] as mental health index) were completed by 572 students (mean age 18.7 years, predominantly Asian American, female, and freshmen in college). More participants expressed preference for online versus face-to-face professional help. We found no relationship between MHI and help-seeking preference. A third of participants were likely to disclose at least as much information online as face-to-face. Ownership of mobile technology was pervasive. Asian Americans were more likely to be nontreatment seekers than Caucasians. Most participants were interested in serious games for emotional distress. Our results suggest that college students are very open to creative ways of receiving emotional help such as playing games and seeking emotional help online, suggesting a need for online evidence-based treatments.

  2. Parental autonomy granting and child perceived control: Effects on the everyday emotional experience of anxious youth

    OpenAIRE

    Allen, Kristy Benoit; Silk, Jennifer S.; Meller, Suzanne; Tan, Patricia Z.; Ladouceur, Cecile D.; Sheeber, Lisa B.; Forbes, Erika E.; Dahl, Ronald E.; Siegle, Greg J.; McMakin, Dana L.; Ryan, Neal D.

    2015-01-01

    © 2015 Association for Child and Adolescent Mental Health. Background: Childhood anxiety is associated with low levels of parental autonomy granting and child perceived control, elevated child emotional reactivity and deficits in child emotion regulation. In early childhood, low levels of parental autonomy granting are thought to decrease child perceived control, which in turn leads to increases in child negative emotion. Later in development, perceived control may become a more stable, trait...

  3. Facial Features Can Induce Emotion: Evidence from Affective Priming Tasks

    Directory of Open Access Journals (Sweden)

    Chia-Chen Wu

    2011-05-01

    Full Text Available Our previous study found that schematic faces with direct gazes, with mouths, with horizontal oval eyes, or without noses tend to be perceived as expressing negative emotion. In this study we further explored these factors with an affective priming task. Faces served as primes, and positive or negative words as probes. The task was to judge the valence of the probe. If the faces could induce emotions, a target word with the same emotional valence should be judged faster than one with the opposite valence (the congruency effect). Experiment 1 used the most positively and negatively rated faces from the previous study as primes. The positive faces had vertical oval eyes and no mouth, while the negative faces had horizontal eyes and a mouth. Results from 34 participants showed that those faces indeed elicited congruency effects. Experiment 2 manipulated gaze direction (N = 16). After the task the participants were asked to rate the prime faces. According to their ratings, faces with direct gaze were perceived as positive and elicited a congruency effect with positive words in the affective priming task. Our data thus support the conjecture that the shape of the eyes, the presence of a mouth, and gaze direction can induce emotion.

  4. Influence of perceived city brand image on emotional attachment to the city

    OpenAIRE

    Manyiwa, Simon; Priporas, Constantinos-Vasilios; Wang, Xuan Lorna

    2018-01-01

    Purpose - This study examines the influence of perceived city brand image on emotional attachment to the city. The study also compares the effects of perceived brand image of the city on the emotional attachment to the city across two groups: residents and visitors. Design/methodology - A total of 207 usable questionnaires were collected from 107 residents of the city of Bratislava, Slovakia, and 100 visitors to the city. Partial least square structural equation modelling (PLS-SEM) me...

  6. Exploring individual differences in online and face-to-face help-seeking intentions in case of impending mental health problems: The role of adult attachment, perceived social support, psychological distress and self-stigma

    Directory of Open Access Journals (Sweden)

    Jennifer Apolinário-Hagen

    2016-11-01

    Full Text Available Background: Even though common mental health problems such as depression are a global burden calling for efficient prevention strategies, still many distressed individuals face hurdles to access public mental healthcare. Thus, computerized Internet-based psychological services have been suggested as viable approach to overcome barriers, such as self-stigma, and to inform the access to professional support on a large scale. However, little research has targeted predictors of online and face-to-face help-seeking intentions. Objective: This study aimed at determining whether associations between attachment insecurity and the willingness to seek online versus face-to-face counselling in case of impending emotional problems are mediated by both perceived social support and psychological distress and moderated by self-stigma. Methods: Data was collected from 301 adults from the German-speaking general population (age: M = 34.42, SD = 11.23; range: 18 - 65 years; 72.1% female) through an anonymous online survey. Determinants of seeking help were assessed with the self-report measures Experiences in Close Relationship-Scale, Perceived Stress Questionnaire, ENRICHD-Social Support Inventory and an adapted version of the General Help Seeking Questionnaire (i.e. case vignette). Mediation analyses were performed with the SPSS-macro PROCESS by Hayes. Results: About half of the sample indicated being not aware of online counselling. As expected, insecure attachment was associated with less perceived social support and increased psychological distress. Mediational analyses revealed negative relationships between both attachment avoidance and self-stigma with face-to-face help-seeking intentions. Moreover, the relationship between attachment anxiety and the willingness to seek face-to-face counselling was mediated by social support. In contrast, none of the predictors of online counselling was statistically significant. Conclusions: Overall, this study identified

  7. A new method for face detection in colour images for emotional bio-robots

    Institute of Scientific and Technical Information of China (English)

    HAPESHI; Kevin

    2010-01-01

    Emotional bio-robots have become a hot research topic over the last two decades. Although there has been some progress in the research, design and development of various emotional bio-robots, few of them can be used in practical applications. The study of emotional bio-robots demands multi-disciplinary co-operation, involving computer science, artificial intelligence, 3D computation, engineering system modelling, analysis and simulation, bionics engineering, automatic control, image processing and pattern recognition. Among these, face detection belongs to image processing and pattern recognition. An emotional robot must be able to recognize various objects; in particular, it is very important for a bio-robot to be able to recognize human faces in an image. In this paper, a face detection method is proposed for identifying human faces in colour images using a human skin model and an eye detection method. First, the method detects skin regions in the input colour image after normalizing its luminance. Then, all face candidates are identified using an eye detection method. Compared with existing algorithms, this method relies only on the colour and geometrical data of the human face rather than on training datasets. Experimental results show that the method is effective and fast and that it can be applied to the development of an emotional bio-robot, with further improvements to its speed and accuracy.
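
    The pipeline described in this record (luminance normalisation, skin-colour segmentation, then eye detection inside candidate regions) can be sketched roughly as follows. The Cr/Cb thresholds are common defaults rather than the paper's own skin model, and OpenCV's pre-trained Haar eye cascade is used only as a stand-in for the paper's training-free, geometry-based eye detector.

```python
import cv2

# Pre-trained eye detector, used only as a stand-in for the paper's method.
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_faces_by_skin_and_eyes(bgr_img, min_area=400):
    """Return bounding boxes (x, y, w, h) of skin regions containing eyes."""
    # 1. Normalise luminance by equalising the Y channel in YCrCb space.
    y, cr, cb = cv2.split(cv2.cvtColor(bgr_img, cv2.COLOR_BGR2YCrCb))
    ycrcb = cv2.merge([cv2.equalizeHist(y), cr, cb])

    # 2. Segment skin-coloured pixels with a simple Cr/Cb box threshold.
    skin_mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    contours, _ = cv2.findContours(skin_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)

    # 3. Keep skin regions that are large enough and contain at least one eye.
    gray = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2GRAY)
    faces = []
    for contour in contours:
        x, y0, w, h = cv2.boundingRect(contour)
        if w * h < min_area:
            continue
        eyes = eye_cascade.detectMultiScale(gray[y0:y0 + h, x:x + w],
                                            scaleFactor=1.1, minNeighbors=5)
        if len(eyes) > 0:
            faces.append((x, y0, w, h))
    return faces

# Example use (hypothetical file name):
# boxes = detect_faces_by_skin_and_eyes(cv2.imread("group_photo.jpg"))
```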

  8. Emotion Recognition in Face and Body Motion in Bulimia Nervosa.

    Science.gov (United States)

    Dapelo, Marcela Marin; Surguladze, Simon; Morris, Robin; Tchanturia, Kate

    2017-11-01

    Social cognition has been studied extensively in anorexia nervosa (AN), but there are few studies in bulimia nervosa (BN). This study investigated the ability of people with BN to recognise emotions in ambiguous facial expressions and in body movement. Participants were 26 women with BN, who were compared with 35 with AN, and 42 healthy controls. Participants completed an emotion recognition task by using faces portraying blended emotions, along with a body emotion recognition task by using videos of point-light walkers. The results indicated that BN participants exhibited difficulties recognising disgust in less-ambiguous facial expressions, and a tendency to interpret non-angry faces as anger, compared with healthy controls. These difficulties were similar to those found in AN. There were no significant differences amongst the groups in body motion emotion recognition. The findings suggest that difficulties with disgust and anger recognition in facial expressions may be shared transdiagnostically in people with eating disorders. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  9. Emotion perception, but not affect perception, is impaired with semantic memory loss.

    Science.gov (United States)

    Lindquist, Kristen A; Gendron, Maria; Barrett, Lisa Feldman; Dickerson, Bradford C

    2014-04-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others' faces is inborn, prelinguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this article, we report findings from 3 patients with semantic dementia that cannot be explained by this "basic emotion" view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear, and sadness. These findings have important consequences for understanding the processes supporting emotion perception.

  10. Functional Brain Activation to Emotional and non-Emotional Faces in Healthy Children: Evidence for Developmentally Undifferentiated Amygdala Function During the School Age Period

    Science.gov (United States)

    Pagliaccio, David; Luby, Joan L.; Gaffrey, Michael S.; Belden, Andrew C.; Botteron, Kelly N.; Harms, Michael P.; Barch, Deanna M.

    2013-01-01

    The amygdala is a key region in emotion processing. In particular, fMRI studies have demonstrated that the amygdala is active during the viewing of emotional faces. Previous research has consistently found greater amygdala responses to fearful faces as compared to neutral faces in adults, convergent with a focus in the animal literature on the amygdala's role in fear processing. Studies have found that the amygdala also responds differentially to other facial emotion types in adults. Yet, the literature regarding when this differential amygdala responsivity develops is limited and mixed. Thus, the goal of the current study was to examine amygdala responses to emotional and neutral faces in a relatively large sample of healthy school-age children (N = 52). While the amygdala was active in response to emotional and neutral faces, the results do not support the hypothesis that the amygdala responds differentially to emotional faces in 7- to 12-year-old children. Nonetheless, amygdala activity was correlated with the severity of subclinical depression symptoms and emotional regulation skills. Additionally, sex differences were observed in frontal, temporal, and visual regions, as well as effects of pubertal development in visual regions. These findings suggest important differences in amygdala reactivity in childhood. PMID:23636982

  11. The sound of the crowd: auditory information modulates the perceived emotion of a crowd based on bodily expressions.

    Science.gov (United States)

    McHugh, Joanna E; Kearney, Gavin; Rice, Henry; Newell, Fiona N

    2012-02-01

    Although both auditory and visual information can influence the perceived emotion of an individual, how these modalities contribute to the perceived emotion of a crowd of characters was hitherto unknown. Here, we manipulated the ambiguity of the emotion of either a visual or auditory crowd of characters by varying the proportions of characters expressing one of two emotional states. Using an intersensory bias paradigm, unambiguous emotional information from an unattended modality was presented while participants determined the emotion of a crowd in an attended, but different, modality. We found that emotional information in an unattended modality can disambiguate the perceived emotion of a crowd. Moreover, the size of the crowd had little effect on these crossmodal influences. The role of audiovisual information appears to be similar in perceiving emotion from individuals or crowds. Our findings provide novel insights into the role of multisensory influences on the perception of social information from crowds of individuals. PsycINFO Database Record (c) 2012 APA, all rights reserved

  12. Emotional Faces in Context: Age Differences in Recognition Accuracy and Scanning Patterns

    Science.gov (United States)

    Noh, Soo Rim; Isaacowitz, Derek M.

    2014-01-01

    While age-related declines in facial expression recognition are well documented, previous research relied mostly on isolated faces devoid of context. We investigated the effects of context on age differences in recognition of facial emotions and in visual scanning patterns of emotional faces. While their eye movements were monitored, younger and older participants viewed facial expressions (i.e., anger, disgust) in contexts that were emotionally congruent, incongruent, or neutral with respect to the facial expression to be identified. Both age groups had the highest recognition rates for facial expressions in the congruent context, followed by the neutral context, with recognition rates in the incongruent context being the lowest. These context effects were more pronounced for older adults. Compared to younger adults, older adults exhibited a greater benefit from congruent contextual information, regardless of facial expression. Context also influenced the pattern of visual scanning of emotional faces in a similar manner across age groups. In addition, older adults initially attended more to context overall. Our data highlight the importance of considering the role of context in understanding emotion recognition in adulthood. PMID:23163713

  13. Effects of facial emotion recognition remediation on visual scanning of novel face stimuli.

    Science.gov (United States)

    Marsh, Pamela J; Luckett, Gemma; Russell, Tamara; Coltheart, Max; Green, Melissa J

    2012-11-01

    Previous research shows that emotion recognition in schizophrenia can be improved with targeted remediation that draws attention to important facial features (eyes, nose, mouth). Moreover, the effects of training have been shown to last for up to one month after training. The aim of this study was to investigate whether improved emotion recognition of novel faces is associated with concomitant changes in visual scanning of these same novel facial expressions. Thirty-nine participants with schizophrenia received emotion recognition training using Ekman's Micro-Expression Training Tool (METT), with emotion recognition and visual scanpath (VSP) recordings to face stimuli collected simultaneously. Baseline ratings of interpersonal and cognitive functioning were also collected from all participants. Post-METT training, participants showed changes in foveal attention to the features of facial expressions of emotion not used in METT training, which were generally consistent with the information about important features from the METT. In particular, there were changes in how participants looked at the features of facial expressions of surprise, disgust, fear, happiness, and neutrality, demonstrating that improved emotion recognition is paralleled by changes in the way participants with schizophrenia viewed novel facial expressions of emotion. However, there were overall decreases in foveal attention to sad and neutral faces, indicating that more intensive instruction might be needed for these faces during training. Most importantly, the evidence shows that participant gender may affect training outcomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Are patients with schizophrenia impaired in processing non-emotional features of human faces?

    Directory of Open Access Journals (Sweden)

    Hayley eDarke

    2013-08-01

    Full Text Available It is known that individuals with schizophrenia exhibit signs of impaired face processing; however, the exact perceptual and cognitive mechanisms underlying these deficits are yet to be elucidated. One possible source of confusion in the current literature is the methodological and conceptual inconsistencies that can arise from the varied treatment of different aspects of face processing relating to emotional and non-emotional aspects of face perception. This review aims to disentangle the literature by focusing on the performance of patients with schizophrenia in a range of tasks that required processing of non-emotional features of face stimuli (e.g. identity or gender). We also consider the performance of patients on non-face stimuli that share common elements such as familiarity (e.g. cars) and social relevance (e.g. gait). We conclude by exploring whether observed deficits are best considered as face-specific and note that further investigation is required to properly assess the potential contribution of more generalised attentional or perceptual impairments.

  15. Poignancy: Mixed Emotional Experience in the Face of Meaningful Endings

    Science.gov (United States)

    Ersner-Hershfield, Hal; Mikels, Joseph A.; Sullivan, Sarah J.; Carstensen, Laura L.

    2009-01-01

    The experience of mixed emotions increases with age. Socioemotional selectivity theory suggests that mixed emotions are associated with shifting time horizons. Theoretically, perceived constraints on future time increase appreciation for life, which, in turn, elicits positive emotions such as happiness. Yet, the very same temporal constraints heighten awareness that these positive experiences come to an end, thus yielding mixed emotional states. In 2 studies, the authors examined the link between the awareness of anticipated endings and mixed emotional experience. In Study 1, participants repeatedly imagined being in a meaningful location. Participants in the experimental condition imagined being in the meaningful location for the final time. Only participants who imagined “last times” at meaningful locations experienced more mixed emotions. In Study 2, college seniors reported their emotions on graduation day. Mixed emotions were higher when participants were reminded of the ending that they were experiencing. Findings suggest that poignancy is an emotional experience associated with meaningful endings. PMID:18179325

  16. THE INFLUENCE OF PERCEIVED QUALITY, BRAND IMAGE, AND EMOTIONAL VALUE TOWARDS PURCHASE INTENTION OF CONSINA BACKPACK

    Directory of Open Access Journals (Sweden)

    Basrah Saidani

    2017-05-01

    Full Text Available This study was conducted to determine, descriptively and empirically, the impact of perceived quality, brand image and emotional value on purchase intention. The respondents were backpack users in East Jakarta. Data were collected with a survey and analyzed with SPSS. The descriptive results showed that, according to most respondents, the perceived quality, brand image and emotional value of Consina backpacks are good, so respondents have quite high purchase intention. The hypothesis tests show that perceived quality, brand image and emotional value each have a positive and significant effect on purchase intention, and that the three together have a simultaneous effect on purchase intention.

  17. Emotional expectations influence neural sensitivity to fearful faces in humans:An event-related potential study

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The present study tested whether neural sensitivity to salient emotional facial expressions was influenced by emotional expectations induced by a cue that validly predicted the expression of a subsequently presented target face. Event-related potentials (ERPs) elicited by fearful and neutral faces were recorded while participants performed a gender discrimination task under cued (‘expected’) and uncued (‘unexpected’) conditions. The behavioral results revealed that accuracy was lower for fearful compared with neutral faces in the unexpected condition, while accuracy was similar for fearful and neutral faces in the expected condition. ERP data revealed increased amplitudes in the P2 component and 200–250 ms interval for unexpected fearful versus neutral faces. By contrast, ERP responses were similar for fearful and neutral faces in the expected condition. These findings indicate that human neural sensitivity to fearful faces is modulated by emotional expectations. Although the neural system is sensitive to unpredictable emotionally salient stimuli, sensitivity to salient stimuli is reduced when these stimuli are predictable.

  18. The face of fear and anger: Facial width-to-height ratio biases recognition of angry and fearful expressions.

    Science.gov (United States)

    Deska, Jason C; Lloyd, E Paige; Hugenberg, Kurt

    2018-04-01

    The ability to rapidly and accurately decode facial expressions is adaptive for human sociality. Although judgments of emotion are primarily determined by musculature, static face structure can also impact emotion judgments. The current work investigates how facial width-to-height ratio (fWHR), a stable feature of all faces, influences perceivers' judgments of expressive displays of anger and fear (Studies 1a, 1b, & 2), and anger and happiness (Study 3). Across 4 studies, we provide evidence consistent with the hypothesis that perceivers more readily see anger on faces with high fWHR compared with those with low fWHR, which instead facilitates the recognition of fear and happiness. This bias emerges when participants are led to believe that targets displaying otherwise neutral faces are attempting to mask an emotion (Studies 1a & 1b), and is evident when faces display an emotion (Studies 2 & 3). Together, these studies suggest that target facial width-to-height ratio biases ascriptions of emotion with consequences for emotion recognition speed and accuracy. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  19. Faces and bodies: perception and mimicry of emotionally congruent and incongruent facial and bodily expressions

    Directory of Open Access Journals (Sweden)

    Mariska eKret

    2013-02-01

    Full Text Available Traditional emotion theories stress the importance of the face in the expression of emotions but bodily expressions are becoming increasingly important. Here we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emotionally congruent and incongruent face-body compounds. Participants’ fixations were measured and their pupil size recorded with eye-tracking equipment, and their facial reactions measured with electromyography (EMG). The behavioral results support our prediction that the recognition of a facial expression is improved in the context of a matching posture and, importantly, also vice versa. From their facial expressions, it appeared that observers reacted with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus activity) to happy facial expressions. As predicted, we found that angry and fearful cues from the face or the body attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion to facial expressions and that congruency between the emotional signals from the face and body improves recognition of the emotion.

  20. Putting the face in context: Body expressions impact facial emotion processing in human infants

    Directory of Open Access Journals (Sweden)

    Purva Rajhans

    2016-06-01

    Full Text Available Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.

  1. Veiled emotions: the effect of covered faces on emotion perception and attitudes

    NARCIS (Netherlands)

    Fischer, A.H.; Gillebaart, M.; Rotteveel, M.; Becker, D.; Vliek, M.

    2012-01-01

    The present study explores the relative absence of expressive cues and the effect of contextual cues on the perception of emotions and its effect on attitudes. The visibility of expressive cues was manipulated by showing films displaying female targets whose faces were either fully visible, covered

  2. Predictive validity of Perceived Emotional Intelligence on nursing students' self-concept.

    Science.gov (United States)

    Augusto Landa, José María; López-Zafra, Esther; Aguilar-Luzón, Maria del Carmen; de Ugarte, Maria Fe Salguero

    2009-10-01

    This study examines the role of Perceived Emotional Intelligence in nursing students' self-concept, controlling for personality dimensions. Self-image is a cognitive component of the self that contains images of who we are, what we want to be and what we express and wish to express to others. Likewise, there is also an emotional and evaluative component known as self-esteem. For a profession that requires not only technical expertise but also psychologically oriented care, knowledge about the self in nursing is crucial to the further development and growth of the profession. However, the role of emotions in the formation of nursing professionals has scarcely been studied. One hundred and thirty-five nursing undergraduates voluntarily participated in our study. They completed a questionnaire comprising several scales. Our results show positive correlations between the Clarity and Emotional Repair components of Perceived Emotional Intelligence and all scales of the self-concept measure. Furthermore, we found positive relationships of the Extraversion and Accountability components of personality with almost all the self-concept scales, and negative relationships between the Neuroticism component of personality and the self-concept scales.

  3. Music-Elicited Emotion Identification Using Optical Flow Analysis of Human Face

    Science.gov (United States)

    Kniaz, V. V.; Smirnova, Z. N.

    2015-05-01

    Human emotion identification from image sequences is in high demand nowadays. Possible applications range from the automatic smile-shutter function of consumer-grade digital cameras to Biofied Building technologies, which enable communication between building space and residents. The highly perceptual nature of human emotions makes their classification and identification complex. The main question arises from the subjective quality of the emotional classification of events that elicit human emotions. A variety of methods for the formal classification of emotions have been developed in musical psychology. This work focuses on the identification of human emotions evoked by musical pieces using human face tracking and optical flow analysis. A facial feature tracking algorithm used for estimating facial feature speed and position is presented. Facial features were extracted from each image sequence using human face tracking with local binary pattern (LBP) features. Accurate relative speeds of facial features were estimated using optical flow analysis. The obtained relative positions and speeds were used as the output facial emotion vector. The algorithm was tested using original software and recorded image sequences. The proposed technique provides robust identification of human emotions elicited by musical pieces. The estimated models could be used for human emotion identification from image sequences in such fields as emotion-based musical backgrounds or mood-dependent radio.
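
    The record does not spell out the tracker or flow method, so the following is only a minimal sketch of the general step of estimating per-feature speeds from optical flow, assuming OpenCV's pyramidal Lucas-Kanade implementation and a precomputed face bounding box; it does not reproduce the paper's LBP-based face tracking.

```python
# Minimal sketch of estimating facial-feature speeds with pyramidal
# Lucas-Kanade optical flow between two consecutive grayscale frames.
# The paper's LBP-based face tracker is not reproduced; points are simply
# corner features inside an assumed face bounding box (face_roi).
import cv2
import numpy as np

def feature_velocities(prev_gray, next_gray, face_roi):
    x, y, w, h = face_roi
    mask = np.zeros_like(prev_gray)
    mask[y:y + h, x:x + w] = 255                       # restrict points to the face

    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=60,
                                  qualityLevel=0.01, minDistance=5, mask=mask)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None)

    ok = status.ravel() == 1
    flow = (new_pts[ok] - pts[ok]).reshape(-1, 2)      # displacement in pixels/frame
    speeds = np.linalg.norm(flow, axis=1)
    return flow, speeds                                # candidate emotion features
```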

  4. Perceived Expressed Emotion, Emotional and Behavioral Problems and Self Esteem in Obese Adolescents: A Case-Control Study.

    Science.gov (United States)

    Çolpan, Merve; Eray, Şafak; Eren, Erdal; Vural, Ayşe Pınar

    2018-05-23

    Obesity is a chronic disease which causes medical and psychiatric complications. Family climate is also a critical factor in the presence and treatment of obesity and comorbid psychiatric disorders. In our study, perceived expressed emotion (EE), psychopathology, self-esteem and emotional and behavioural problems among obese adolescents were investigated by comparison with their non-obese peers. This study was carried out with 49 obese adolescents and 47 non-obese adolescents as a control group. All participants were requested to fill out the Socio-demographic Data Form, Shortened Level of Expressed Emotion Scale, Rosenberg Self-Esteem Scale, and Strength and Difficulties Questionnaire-Adolescent Form. In our study, obese adolescents showed significant differences in perceived EE and self-esteem compared with their non-obese peers. The higher rates of perceived EE and psychopathology and the lower self-esteem among obese adolescents show that obesity prevention and treatment are also crucial for mental health in adolescents. With our study results, we aim to emphasize the role of the family in obese adolescents' mental health and treatment, and to identify risk factors in childhood that promote obesity in order to help develop targeted intervention and prevention programs.

  5. Cultural relativity in perceiving emotion from vocalizations.

    Science.gov (United States)

    Gendron, Maria; Roberson, Debi; van der Vyver, Jacoba Marieta; Barrett, Lisa Feldman

    2014-04-01

    A central question in the study of human behavior is whether certain emotions, such as anger, fear, and sadness, are recognized in nonverbal cues across cultures. We predicted and found that in a concept-free experimental task, participants from an isolated cultural context (the Himba ethnic group from northwestern Namibia) did not freely label Western vocalizations with expected emotion terms. Responses indicate that Himba participants perceived more basic affective properties of valence (positivity or negativity) and to some extent arousal (high or low activation). In a second, concept-embedded task, we manipulated whether the target and foil on a given trial matched in both valence and arousal, neither valence nor arousal, valence only, or arousal only. Himba participants achieved above-chance accuracy only when foils differed from targets in valence only. Our results indicate that the voice can reliably convey affective meaning across cultures, but that perceptions of emotion from the voice are culturally variable.

  6. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion.

    Science.gov (United States)

    Guo, Kun; Soornack, Yoshi; Settle, Rebecca

    2018-03-05

    Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution up to 48 × 64 pixels or increasing image blur up to 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity rating, increased reaction time and fixation duration, and stronger central fixation bias which was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent with less deterioration impact on happy and surprise expressions, suggesting this distortion-invariant facial expression perception might be achieved through the categorical model involving a non-linear configural combination of local facial features. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. The right place at the right time: priming facial expressions with emotional face components in developmental visual agnosia.

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-04-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Increased amygdala responses to emotional faces after psilocybin for treatment-resistant depression.

    Science.gov (United States)

    Roseman, Leor; Demetriou, Lysia; Wall, Matthew B; Nutt, David J; Carhart-Harris, Robin L

    2017-12-27

    Recent evidence indicates that psilocybin with psychological support may be effective for treating depression. Some studies have found that patients with depression show heightened amygdala responses to fearful faces and there is reliable evidence that treatment with SSRIs attenuates amygdala responses (Ma, 2015). We hypothesised that amygdala responses to emotional faces would be altered post-treatment with psilocybin. In this open-label study, 20 individuals diagnosed with moderate to severe, treatment-resistant depression, underwent two separate dosing sessions with psilocybin. Psychological support was provided before, during and after these sessions and 19 completed fMRI scans one week prior to the first session and one day after the second and last. Neutral, fearful and happy faces were presented in the scanner and analyses focused on the amygdala. Group results revealed rapid and enduring improvements in depressive symptoms post psilocybin. Increased responses to fearful and happy faces were observed in the right amygdala post-treatment, and right amygdala increases to fearful versus neutral faces were predictive of clinical improvements at 1-week. Psilocybin with psychological support was associated with increased amygdala responses to emotional stimuli, an opposite effect to previous findings with SSRIs. This suggests fundamental differences in these treatments' therapeutic actions, with SSRIs mitigating negative emotions and psilocybin allowing patients to confront and work through them. Based on the present results, we propose that psilocybin with psychological support is a treatment approach that potentially revives emotional responsiveness in depression, enabling patients to reconnect with their emotions. ISRCTN, number ISRCTN14426797. Copyright © 2018 The Authors. Published by Elsevier Ltd.. All rights reserved.

  9. Perceived Emotional Intelligence as a predictor of Depressive Symptoms after a one year follow-up during Adolescence

    Directory of Open Access Journals (Sweden)

    Diego Gomez-Baya

    2016-04-01

    Full Text Available Research to date has identified various risk factors in the emergence of depressive disorders in adolescence. There are very few studies, however, which have analyzed the role of perceived emotional intelligence in depressive symptoms longitudinally during adolescence. This work aimed to analyze longitudinal relationships between perceived emotional intelligence and depressive symptoms in adolescence, developing an explanatory model of depression following a one-year follow-up. A longitudinal study was carried out with two waves separated by one year, with a sample of 714 Spanish adolescents. The instruments consisted of self-report measures of depressive symptoms and perceived emotional intelligence. Results underlined gender differences in depressive symptoms and emotional intelligence, and indicated that greater emotional intelligence was associated with a lower presence of depressive symptoms after a one-year follow-up. A multiple partial mediation model was developed to explain depressive symptoms longitudinally, based on perceived emotional intelligence skills and baseline depressive symptoms. These contributions underscore the need to design programs to prevent depression in adolescence through the promotion of emotional intelligence.

  10. Age-related differences in event-related potentials for early visual processing of emotional faces.

    Science.gov (United States)

    Hilimire, Matthew R; Mienaltowski, Andrew; Blanchard-Fields, Fredda; Corballis, Paul M

    2014-07-01

    With advancing age, processing resources are shifted away from negative emotional stimuli and toward positive ones. Here, we explored this 'positivity effect' using event-related potentials (ERPs). Participants identified the presence or absence of a visual probe that appeared over photographs of emotional faces. The ERPs elicited by the onsets of angry, sad, happy and neutral faces were recorded. We examined the frontocentral emotional positivity (FcEP), which is defined as a positive deflection in the waveforms elicited by emotional expressions relative to neutral faces early on in the time course of the ERP. The FcEP is thought to reflect enhanced early processing of emotional expressions. The results show that within the first 130 ms young adults show an FcEP to negative emotional expressions, whereas older adults show an FcEP to positive emotional expressions. These findings provide additional evidence that the age-related positivity effect in emotion processing can be traced to automatic processes that are evident very early in the processing of emotional facial expressions. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  11. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    Directory of Open Access Journals (Sweden)

    Kris Evers

    2014-01-01

    Full Text Available Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  12. No differences in emotion recognition strategies in children with autism spectrum disorder: evidence from hybrid faces.

    Science.gov (United States)

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  13. [Abnormal processing characteristics to basic emotional faces in the early phase in children with autism spectrum disorder].

    Science.gov (United States)

    Lin, Qiong-Xi; Wu, Gui-Hua; Zhang, Ling; Wang, Zeng-Jian; Pan, Ning; Xu, Cai-Juan; Jing, Jin; Jin, Yu

    2018-02-01

    To explore the recognition ability for, and the abnormal early-phase processing characteristics of, basic emotional faces in children with autism spectrum disorders (ASD). Photos of Chinese static faces with four basic emotions (fearful, happy, angry and sad) were used as stimuli. Twenty-five ASD children and twenty-two age- and gender-matched typically developing children (normal controls) were asked to match the emotional faces with words. Event-related potential (ERP) data were recorded concurrently. In normal controls, N170 latencies for all emotions combined and for fearful faces were shorter in the left temporal region than in the right (P<0.05), but this pattern was not observed in ASD children. Further, N170 latencies in the left temporal region of ASD children were slower than in normal controls for all emotions combined and for fearful and happy faces (P<0.05), and their N170 latencies in the right temporal region tended to be slower than in normal controls for angry and fearful faces. The holistic perception of emotional faces in the early cognitive processing phase is slower in ASD children than in normal controls. The lateralized response in the early phase of recognizing emotional faces may be aberrant in children with ASD.

  14. The Right Place at the Right Time: Priming Facial Expressions with Emotional Face Components in Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-01-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG’s impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face’s emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG’s performance was strongly influenced by the diagnosticity of the components: His emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. PMID:22349446

  15. Transient emotional events and individual affective traits affect emotion recognition in a perceptual decision-making task.

    Science.gov (United States)

    Qiao-Tasserit, Emilie; Garcia Quesada, Maria; Antico, Lia; Bavelier, Daphne; Vuilleumier, Patrik; Pichon, Swann

    2017-01-01

    Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generate sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie-clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants' propensity to classify ambiguous faces as fearful during several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions.
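
    The record describes a psychometric approach to the forced-choice classification of fear-happiness morphs. Purely as an illustration of that kind of analysis (with simulated response proportions, not the authors' data or pipeline), a cumulative Gaussian can be fitted to the proportion of "happy" responses across the morph continuum:

```python
# Illustration of a psychometric fit: proportion of "happy" responses along a
# fear-to-happiness morph continuum, fitted with a cumulative Gaussian. The
# data points are simulated; this is not the authors' analysis pipeline.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

morph_level = np.linspace(0, 1, 7)     # 0 = full fear, 1 = full happiness
p_happy = np.array([0.05, 0.10, 0.30, 0.55, 0.80, 0.92, 0.97])   # simulated

def cum_gauss(x, mu, sigma):
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(cum_gauss, morph_level, p_happy, p0=[0.5, 0.2])
print(f"point of subjective equality = {mu:.2f}, slope parameter = {sigma:.2f}")
# A shift of mu toward the happy end of the continuum indicates a bias to
# classify ambiguous faces as fearful (more "happiness" is needed before the
# observer reports happy), which is how an induced negative state would show up.
```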

  16. Transient emotional events and individual affective traits affect emotion recognition in a perceptual decision-making task.

    Directory of Open Access Journals (Sweden)

    Emilie Qiao-Tasserit

    Full Text Available Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generate sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie-clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants' propensity to classify ambiguous faces as fearful during several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions.

  17. Transient emotional events and individual affective traits affect emotion recognition in a perceptual decision-making task

    Science.gov (United States)

    Garcia Quesada, Maria; Antico, Lia; Bavelier, Daphne; Vuilleumier, Patrik; Pichon, Swann

    2017-01-01

    Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generate sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie-clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants’ propensity to classify ambiguous faces as fearful during several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions. PMID:28151976

  18. Image-based Analysis of Emotional Facial Expressions in Full Face Transplants.

    Science.gov (United States)

    Bedeloglu, Merve; Topcu, Çagdas; Akgul, Arzu; Döger, Ela Naz; Sever, Refik; Ozkan, Ozlenen; Ozkan, Omer; Uysal, Hilmi; Polat, Ovunc; Çolak, Omer Halil

    2018-01-20

    This study aimed to determine, from photographs, the degree of development in the emotional expression of full face transplant patients, so that a rehabilitation process can be planned accordingly in later work. As envisaged, in full face transplant cases the determination of expressions can be confused or not achieved to the same degree as in the healthy control group. For the image-based analysis, a control group consisting of 9 healthy males and 2 full-face transplant patients participated in the study. Appearance-based Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP) methods were adopted for recognizing a neutral and 6 emotional expressions: angry, scared, happy, hate, confused and sad. Feature extraction was carried out using both methods individually and their serial combination. For the performed expressions, features extracted from the most distinctive zones of the facial area, the eye and mouth regions, were used to classify the emotions. The combination of these region features was also used to improve classifier performance. Control subjects' and transplant patients' ability to perform emotional expressions was determined with a K-nearest neighbor (KNN) classifier with region-specific and method-specific decision stages, and the patients' results were compared with the healthy group. It was observed that transplant patients do not reflect some emotional expressions, and that there were confusions among expressions.
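
    A minimal sketch of the general pipeline described above, under stated assumptions: LBP histograms and crude Gabor magnitudes are extracted from assumed eye and mouth crops, combined serially, and classified with k-nearest neighbours. The parameters, crop coordinates and (commented) training data are illustrative, not the study's exact settings.

```python
# Generic sketch of the feature-extraction-plus-KNN idea: LBP histograms and
# Gabor magnitudes from assumed eye/mouth crops, combined serially and fed to
# a k-nearest-neighbour classifier. Parameters, crop coordinates and the
# (commented) training data X, y are illustrative, not the study's settings.
import numpy as np
from skimage.feature import local_binary_pattern
from skimage.filters import gabor
from sklearn.neighbors import KNeighborsClassifier

def gabor_magnitude(gray, frequency):
    real, imag = gabor(gray, frequency=frequency)
    return np.sqrt(real ** 2 + imag ** 2).mean()

def region_features(gray_region):
    lbp = local_binary_pattern(gray_region, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)    # LBP histogram
    gabors = [gabor_magnitude(gray_region, f) for f in (0.1, 0.2, 0.3)]  # crude GWT features
    return np.concatenate([hist, gabors])

def face_features(gray_face, eye_box, mouth_box):
    # eye_box / mouth_box = (row0, row1, col0, col1); region features joined serially
    crops = [gray_face[r0:r1, c0:c1] for (r0, r1, c0, c1) in (eye_box, mouth_box)]
    return np.concatenate([region_features(c) for c in crops])

# With labelled training faces (X = stacked feature vectors, y = emotion labels):
# clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
# predicted = clf.predict(face_features(test_face, eye_box, mouth_box)[None, :])
```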

  19. Gender differences in emotion regulation and relationships with perceived health in patients with rheumatoid arthritis.

    Science.gov (United States)

    van Middendorp, Henriët; Geenen, Rinie; Sorbi, Marjolijn J; Hox, Joop J; Vingerhoets, Ad J J M; van Doornen, Lorenz J P; Bijlsma, Johannes W J

    2005-01-01

    Emotion regulation has been associated with perceived health in rheumatoid arthritis, which is diagnosed three times more often in women than men. Our aim was to examine gender differences in styles of emotion regulation (ambiguity, control, orientation, and expression) and gender-specificity of the associations between emotion regulation and perceived health (psychological well-being, social functioning, physical functioning, and disease activity) in 244 female and 91 male patients with rheumatoid arthritis. Women reported more emotional orientation than men, but did not differ from men with regard to ambiguity, control, and expression. Structural equation modelling showed that relationships between emotion regulation and perceived health were more frequent and stronger for women than men. This held especially for the affective dimension of health, while associations were similar for both women and men with regard to social and physical functioning. Only for women, the association between ambiguity and disease activity was significant, which appeared to be mediated by affective functioning. The observations that women are more emotionally oriented than men and that emotion regulation is more interwoven with psychological health in women than men, support the usefulness of a gender-sensitive approach in research and health care of patients with rheumatoid arthritis.

  20. Generational Differences of Emotional Expression

    Institute of Scientific and Technical Information of China (English)

    李学勇

    2014-01-01

    As a kind of subjective psychological activity, emotion can only be known and perceived through some expressive form. Because expressers differ, differences in emotional expression are reflected not only among individuals but also between generations. The old conceal their emotions inside, the young express their emotions boldly, and the middle-aged are rational and deep in their expressions. Facing and understanding such differences is the premise and foundation of the construction of a harmonious relationship between generations.

  1. Social and emotional relevance in face processing: Happy faces of future interaction partners enhance the LPP

    Directory of Open Access Journals (Sweden)

    Florian eBublatzky

    2014-07-01

    Full Text Available Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERPs) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. Social relevance was manipulated by presenting pictures of two specific face actors as future interaction partners (meet condition), whereas two other face actors remained non-relevant. As a further control condition all stimuli were presented without specific task instructions (passive viewing condition). A within-subject design (Facial Expression x Relevance x Task) was implemented, where randomly ordered face stimuli of four actors (2 women, from the KDEF) were presented for 1s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of instructed social relevance. Whereas the meet condition was accompanied with unspecific effects regardless of relevance (P1, EPN), viewing potential interaction partners was associated with increased LPP amplitudes. The LPP was specifically enhanced for happy facial expressions of the future interaction partners. This underscores that social relevance can impact face processing already at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories.

  2. Putting the face in context: Body expressions impact facial emotion processing in human infants.

    Science.gov (United States)

    Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias

    2016-06-01

    Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  3. Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions.

    Science.gov (United States)

    Oberman, Lindsay M; Winkielman, Piotr; Ramachandran, Vilayanur S

    2007-01-01

    People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations: (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated the assessed muscles, whereas the chew manipulation activated muscles only intermittently. Further, expressing happiness generated the most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.

  4. Facial EMG responses to emotional expressions are related to emotion perception ability.

    Science.gov (United States)

    Künecke, Janina; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Wilhelm, Oliver

    2014-01-01

    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and their relationship to facial muscle responses, recorded with electromyography (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii responses to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

  5. Assessment of incongruent emotions in face and voice

    NARCIS (Netherlands)

    Takagi, S.; Tabei, K.-I.; Huis in 't Veld, E.M.J.; de Gelder, B.

    2013-01-01

    Information derived from facial and vocal nonverbal expressions plays an important role in social communication in the real and virtual worlds. In the present study, we investigated cultural differences between Japanese and Dutch participants in the multisensory perception of emotion. We used a face

  6. Gender differences in the recognition of emotional faces: are men less efficient?

    Directory of Open Access Journals (Sweden)

    Ana Ruiz-Ibáñez

    2017-06-01

    Full Text Available As research on the recollection of stimuli with emotional valence indicates, emotions influence memory. Many studies of face and emotional facial expression recognition have focused on differences associated with age (young and old people) and gender (men and women). Nevertheless, these studies have produced contradictory results, so the involvement of gender needs to be studied in greater depth. The main objective of our research was to analyze differences in the recognition of images of faces with emotional facial expressions between two groups of university students aged 18-30, the first composed of men and the second of women. The results showed statistically significant differences in corrected face recognition (hit rate - false alarm rate): the women demonstrated better recognition than the men. However, other variables analyzed, such as time or efficiency, did not provide conclusive results. Furthermore, a significant negative correlation between the time used and efficiency on the task was found in the male group. This information reinforces not only the hypothesis of a gender difference in face recognition, in favor of women, but also those hypotheses that suggest different cognitive processing of facial stimuli in the two sexes. Finally, we argue for the need for further research on variables such as age or sociocultural level.
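
    The corrected recognition score mentioned above is simply the hit rate minus the false-alarm rate. The sketch below illustrates that computation and a between-group comparison; all counts, group sizes, and effect magnitudes are invented for illustration and are not the study's data.

```python
# Illustrative sketch (not the study's analysis code): corrected recognition
# is hit rate minus false-alarm rate, compared here across two groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def corrected_recognition(hits, false_alarms, n_old, n_new):
    """Hit rate minus false-alarm rate for each participant."""
    return hits / n_old - false_alarms / n_new

# Hypothetical counts for 30 women and 30 men (40 old faces, 40 new faces).
hits_women = rng.binomial(40, 0.80, size=30)
fas_women = rng.binomial(40, 0.15, size=30)
hits_men = rng.binomial(40, 0.72, size=30)
fas_men = rng.binomial(40, 0.18, size=30)

cr_women = corrected_recognition(hits_women, fas_women, 40, 40)
cr_men = corrected_recognition(hits_men, fas_men, 40, 40)

t, p = stats.ttest_ind(cr_women, cr_men)
print(f"women M={cr_women.mean():.2f}, men M={cr_men.mean():.2f}, t={t:.2f}, p={p:.3f}")
```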

  7. A face to remember: emotional expression modulates prefrontal activity during memory formation.

    Science.gov (United States)

    Sergerie, Karine; Lepage, Martin; Armony, Jorge L

    2005-01-15

    Emotion can exert a modulatory role on episodic memory. Several studies have shown that negative stimuli (e.g., words, pictures) are better remembered than neutral ones. Although facial expressions are powerful emotional stimuli and have been shown to influence perception and attention processes, little is known about their effect on memory. We used functional magnetic resonance imaging (fMRI) in humans to investigate the effects of expression (happy, neutral, and fearful) on prefrontal cortex (PFC) activity during the encoding of faces, using a subsequent memory effect paradigm. Our results show that activity in right PFC predicted memory for faces, regardless of expression, while a homotopic region in the left hemisphere was associated with successful encoding only for faces with an emotional expression. These findings are consistent with the proposed role of right dorsolateral PFC in successful encoding of nonverbal material, but also suggest that left DLPFC may be a site where integration of memory and emotional processes occurs. This study sheds new light on the current controversy regarding the hemispheric lateralization of PFC in memory encoding.

  8. Sad benefit in face working memory: an emotional bias of melancholic depression.

    Science.gov (United States)

    Linden, Stefanie C; Jackson, Margaret C; Subramanian, Leena; Healy, David; Linden, David E J

    2011-12-01

    Emotion biases feature prominently in cognitive theories of depression and are a focus of psychological interventions. However, there is presently no stable neurocognitive marker of altered emotion-cognition interactions in depression. One reason may be the heterogeneity of major depressive disorder. Our aim in the present study was to find an emotional bias that differentiates patients with melancholic depression from controls, and patients with melancholic from those with non-melancholic depression. We used a working memory paradigm for emotional faces, where two faces with angry, happy, neutral, sad or fearful expression had to be retained over one second. Twenty patients with melancholic depression, 20 age-, education- and gender-matched control participants and 20 patients with non-melancholic depression participated in the study. We analysed performance on the working memory task using signal detection measures. We found an interaction between group and emotion on working memory performance that was driven by the higher performance for sad faces compared to other categories in the melancholic group. We computed a measure of "sad benefit", which distinguished melancholic and non-melancholic patients with good sensitivity and specificity. However, replication studies and formal discriminant analysis will be needed in order to assess whether emotion bias in working memory may become a useful diagnostic tool to distinguish these two syndromes. Copyright © 2011 Elsevier B.V. All rights reserved.
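
    The abstract reports signal detection measures and a derived "sad benefit" score. The sketch below shows one plausible operationalization - d' per emotion category, with a sad benefit defined as d'(sad) minus the mean d' of the other categories - on invented numbers; the paper's exact formula and correction method are not given here, so treat this purely as an illustration.

```python
# Hedged sketch of a signal-detection analysis with an assumed "sad benefit"
# score; data and the exact definition of the benefit are illustrative only.
import numpy as np
from scipy.stats import norm

def d_prime(hit_rate, fa_rate, n_trials):
    # Log-linear correction keeps rates away from 0 and 1 before the z-transform.
    h = (hit_rate * n_trials + 0.5) / (n_trials + 1)
    f = (fa_rate * n_trials + 0.5) / (n_trials + 1)
    return norm.ppf(h) - norm.ppf(f)

# Hypothetical hit/false-alarm rates for one participant, 32 trials per emotion.
rates = {"angry": (0.75, 0.20), "happy": (0.78, 0.18),
         "neutral": (0.70, 0.22), "sad": (0.88, 0.12), "fearful": (0.74, 0.21)}
dps = {emo: d_prime(h, f, 32) for emo, (h, f) in rates.items()}
others = [v for k, v in dps.items() if k != "sad"]
sad_benefit = dps["sad"] - np.mean(others)
print({k: round(v, 2) for k, v in dps.items()}, "sad benefit:", round(sad_benefit, 2))
```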

  9. Emotional Clarity as a Buffer in the Association Between Perceived Mental Illness Stigma and Suicide Risk.

    Science.gov (United States)

    Wang, Katie; Weiss, Nicole H; Pachankis, John E; Link, Bruce G

    2016-11-01

    Among people living with psychiatric disorders, mental illness stigma has been identified as a major barrier to recovery by contributing to low self-esteem and interfering with treatment-seeking. The present research examined the association between perceived mental illness stigma and suicide risk severity and considered the role of emotional clarity (i.e., the ability to identify and understand one's emotional experiences), a critical component of emotion regulation, as a moderator of this association. A sample of individuals who had experienced recent psychiatric hospitalizations (N = 184) completed self-report measures of perceived stigma associated with their psychiatric diagnoses, deficits in emotional clarity, and behaviors that have been found to confer risk for suicide. A moderation analysis revealed that perceived mental illness stigma was positively associated with suicide risk severity, but only for individuals who have greater deficits in emotional clarity. These findings highlight the role of emotional clarity as a resource for individuals coping with mental illness stigma and underscore the potential utility of targeting deficits in emotional clarity in prevention and intervention efforts for reducing suicide risk.
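
    A moderation analysis of this kind is typically implemented as a regression with a product term between the predictor and the moderator. The sketch below shows that structure on simulated data; the variable names and numbers are hypothetical, and only the modeling structure mirrors the analysis described.

```python
# Hedged sketch of a moderation model: suicide risk regressed on perceived
# stigma, deficits in emotional clarity, and their interaction (invented data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 184
stigma = rng.normal(0, 1, n)
clarity_deficit = rng.normal(0, 1, n)
risk = (0.2 * stigma + 0.3 * clarity_deficit
        + 0.25 * stigma * clarity_deficit + rng.normal(0, 1, n))

df = pd.DataFrame({"stigma": stigma, "clarity_deficit": clarity_deficit, "risk": risk})
# Centering predictors makes the lower-order coefficients interpretable.
df["stigma_c"] = df.stigma - df.stigma.mean()
df["clarity_c"] = df.clarity_deficit - df.clarity_deficit.mean()

model = smf.ols("risk ~ stigma_c * clarity_c", data=df).fit()
# A significant stigma_c:clarity_c coefficient indicates moderation.
print(model.summary().tables[1])
```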

  10. Passive and motivated perception of emotional faces: qualitative and quantitative changes in the face processing network.

    Directory of Open Access Journals (Sweden)

    Laurie R Skelly

    Full Text Available Emotionally expressive faces are processed by a distributed network of interacting sub-cortical and cortical brain regions. The components of this network have been identified and described in large part by the stimulus properties to which they are sensitive, but as face processing research matures interest has broadened to also probe dynamic interactions between these regions and top-down influences such as task demand and context. While some research has tested the robustness of affective face processing by restricting available attentional resources, it is not known whether face network processing can be augmented by increased motivation to attend to affective face stimuli. Short videos of people expressing emotions were presented to healthy participants during functional magnetic resonance imaging. Motivation to attend to the videos was manipulated by providing an incentive for improved recall performance. During the motivated condition, there was greater coherence among nodes of the face processing network, more widespread correlation between signal intensity and performance, and selective signal increases in a task-relevant subset of face processing regions, including the posterior superior temporal sulcus and right amygdala. In addition, an unexpected task-related laterality effect was seen in the amygdala. These findings provide strong evidence that motivation augments co-activity among nodes of the face processing network and the impact of neural activity on performance. These within-subject effects highlight the necessity to consider motivation when interpreting neural function in special populations, and to further explore the effect of task demands on face processing in healthy brains.

  11. The Role of Family Expressed Emotion and Perceived Social Support in Predicting Addiction Relapse

    OpenAIRE

    Atadokht, Akbar; Hajloo, Nader; Karimi, Masoud; Narimani, Mohammad

    2015-01-01

    Background: Emotional conditions governing the family and patients' perceived social support play important roles in the treatment or relapse process of the chronic disease. Objectives: The current study aimed to investigate the role of family expressed emotion and perceived social support in prediction of addiction relapse. Patients and Methods: The descriptive-correlation method was used in the current study. The study population consisted of the individuals referred to the addiction treatm...

  12. Collective Efficacy in Sports and Physical Activities: Perceived Emotional Synchrony and Shared Flow

    Science.gov (United States)

    Zumeta, Larraitz N.; Oriol, Xavier; Telletxea, Saioa; Amutio, Alberto; Basabe, Nekane

    2016-01-01

    This cross-sectional study analyzes the relationship between collective efficacy and two psychosocial processes involved in collective sport-physical activities. It argues that in-group identification and fusion with the group will affect collective efficacy (CE). A sample of 276 university students answered different scales regarding their participation in collective physical and sport activities. Multiple-mediation analyses showed that shared flow and perceived emotional synchrony mediate the relationship between in-group identification and CE, whereas the relationship between identity fusion and CE was only mediated by perceived emotional synchrony. Results suggest that both psychosocial processes explain the positive effects of in-group identification and identity fusion with the group in collective efficacy. Specifically, the role of perceived emotional synchrony in explaining the positive effects of participation in collective sport-physical activities is underlined. In sum, this study highlights the utility of collective actions and social identities to explain the psychosocial processes related to collective efficacy in physical and sports activities. Finally, practical implications are discussed. PMID:26779077
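
    Multiple-mediation analyses like the one described are commonly tested by estimating the indirect effect through each mediator and bootstrapping its confidence interval. The sketch below, on simulated data with hypothetical variable names (ident, flow, synch, ce), illustrates that logic; it is not the authors' code.

```python
# Illustrative multiple-mediation sketch: in-group identification -> shared
# flow / perceived emotional synchrony -> collective efficacy (invented data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 276
ident = rng.normal(0, 1, n)
flow = 0.5 * ident + rng.normal(0, 1, n)
synch = 0.6 * ident + rng.normal(0, 1, n)
ce = 0.3 * flow + 0.4 * synch + 0.1 * ident + rng.normal(0, 1, n)
df = pd.DataFrame({"ident": ident, "flow": flow, "synch": synch, "ce": ce})

def indirect_effects(d):
    # a-paths: identification -> mediators; b-paths: mediators -> CE (controlling for ident).
    a1 = smf.ols("flow ~ ident", d).fit().params["ident"]
    a2 = smf.ols("synch ~ ident", d).fit().params["ident"]
    b = smf.ols("ce ~ flow + synch + ident", d).fit().params
    return a1 * b["flow"], a2 * b["synch"]

boot = np.array([indirect_effects(df.sample(n, replace=True)) for _ in range(1000)])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print("indirect via flow CI:", (lo[0], hi[0]), "via synchrony CI:", (lo[1], hi[1]))
```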

  13. Perceived emotional intelligence, general intelligence and early professional success: predictive and incremental validity

    Directory of Open Access Journals (Sweden)

    José-Manuel de Haro

    2014-05-01

    Full Text Available Although the study of factors affecting career success has shown connections between biographical and other aspects related to ability, knowledge and personality, few studies have examined the relationship between emotional intelligence and professional success at the initial career stage. When these studies were carried out, the results showed significant relationships between the dimensions of emotional intelligence (emotional self-awareness, self-regulation, social awareness or social skills) and the level of professional competence. In this paper, we analyze the relationship between perceived emotional intelligence, measured by the Trait Meta-Mood Scale (TMMS-24) questionnaire, general intelligence assessed by the Cattell factor "g" test, scale 3, and extrinsic indicators of career success, in a sample of 130 graduates at the beginning of their careers. Results from hierarchical regression analysis indicate that emotional intelligence makes a specific contribution to the prediction of salary, after controlling for the effect of general intelligence. The perceived emotional intelligence dimensions of TMMS repair, TMMS attention and sex show a higher correlation and make a greater contribution to professional success than general intelligence. The implications of these results for the development of socio-emotional skills among University graduates are discussed.
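
    Incremental validity of this kind is usually established with a hierarchical regression: general intelligence is entered first, the emotional intelligence dimensions are added in a second step, and the change in R-squared is tested. The sketch below shows that procedure on invented data; the predictor names and effect sizes are hypothetical.

```python
# Sketch of a hierarchical (incremental-validity) regression with invented data:
# step 1 enters general intelligence (g); step 2 adds TMMS-style dimensions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 130
g = rng.normal(0, 1, n)
attention, clarity, repair = rng.normal(0, 1, (3, n))
salary = 0.25 * g + 0.20 * repair + 0.15 * attention + rng.normal(0, 1, n)
df = pd.DataFrame({"g": g, "attention": attention, "clarity": clarity,
                   "repair": repair, "salary": salary})

step1 = smf.ols("salary ~ g", df).fit()
step2 = smf.ols("salary ~ g + attention + clarity + repair", df).fit()
delta_r2 = step2.rsquared - step1.rsquared
f_change, p_change, _ = step2.compare_f_test(step1)  # tests the R-squared change
print(f"R2 step1={step1.rsquared:.3f}, step2={step2.rsquared:.3f}, "
      f"delta R2={delta_r2:.3f}, F-change p={p_change:.3f}")
```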

  14. The influence of perceived parenting styles on socio-emotional development from pre-puberty into puberty.

    Science.gov (United States)

    Ong, Min Yee; Eilander, Janna; Saw, Seang Mei; Xie, Yuhuan; Meaney, Michael J; Broekman, Birit F P

    2018-01-01

    The relative impact of parenting on socio-emotional development of children has rarely been examined in a longitudinal context. This study examined the association between perceived parenting styles and socio-emotional functioning from childhood to adolescence. We hypothesized that optimal parenting was associated with improvement in socio-emotional functioning from childhood into early adulthood, especially for those with more behavioral problems in childhood. Children between ages 7 and 9 years were recruited for the Singapore Cohort Study of Risk Factors for Myopia (SCORM). Nine years later, 700 out of 1052 subjects were followed up (67%). During childhood, parents completed the Child Behavior Checklist (CBCL), while young adults completed the Youth Self-Report (YSR) and Parental Bonding Instrument (PBI). Perceived optimal parental care resulted in fewer internalizing and externalizing problems in early adulthood in comparison to non-optimal parental care styles. Perceived optimal paternal parenting, but not maternal parenting, in interaction with childhood externalizing problems predicted externalizing symptoms in early adulthood. No significant interactions were found between perceived parenting styles and internalizing problems. In conclusion, perceived parental care is associated with the quality of socio-emotional development, and optimal parenting by the father is especially important for children with more externalizing problems in childhood.

  15. An emotional Stroop task with faces and words. A comparison of young and older adults.

    Science.gov (United States)

    Agustí, Ana I; Satorres, Encarnación; Pitarque, Alfonso; Meléndez, Juan C

    2017-08-01

    Given the contradictions of previous studies on the changes in attentional responses produced by aging, an emotional Stroop task was used to compare young and older adults' responses to words and faces with emotional valence. The words happy or sad were superimposed on faces expressing happiness or sadness. The emotion expressed by the word and the face could agree or not (cued and uncued trials, respectively). 85 young and 66 healthy older adults had to identify both faces and words separately, and the interference between the two types of stimuli was examined. An interference effect was observed for both types of stimuli in both groups. There was more interference on positive faces and words than on negative stimuli. Older adults had more difficulty than younger adults in focusing on positive uncued trials, whereas there was no difference across samples on negative uncued trials. Copyright © 2017 Elsevier Inc. All rights reserved.
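
    In designs like this, the interference effect is usually quantified as the difference in response time between uncued (incongruent) and cued (congruent) trials, then compared across age groups. The toy sketch below shows that computation on simulated reaction times; all numbers are invented.

```python
# Toy sketch of an emotional-Stroop interference score (uncued minus cued RT),
# compared across age groups; simulated data, illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def simulate(n, base_rt_ms, interference_ms):
    cued = rng.normal(base_rt_ms, 80, n)
    uncued = rng.normal(base_rt_ms + interference_ms, 80, n)
    return uncued - cued  # per-participant interference (ms)

young_pos = simulate(85, 620, 40)   # young adults, positive stimuli (hypothetical)
older_pos = simulate(66, 760, 70)   # older adults, positive stimuli (hypothetical)
t, p = stats.ttest_ind(older_pos, young_pos)
print(f"interference young={young_pos.mean():.0f} ms, older={older_pos.mean():.0f} ms, p={p:.3f}")
```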

  16. Emotional face recognition in adolescent suicide attempters and adolescents engaging in non-suicidal self-injury.

    Science.gov (United States)

    Seymour, Karen E; Jones, Richard N; Cushman, Grace K; Galvan, Thania; Puzia, Megan E; Kim, Kerri L; Spirito, Anthony; Dickstein, Daniel P

    2016-03-01

    Little is known about the bio-behavioral mechanisms underlying and differentiating suicide attempts from non-suicidal self-injury (NSSI) in adolescents. Adolescents who attempt suicide or engage in NSSI often report significant interpersonal and social difficulties. Emotional face recognition ability is a fundamental skill required for successful social interactions, and deficits in this ability may provide insight into the unique brain-behavior interactions underlying suicide attempts versus NSSI in adolescents. Therefore, we examined emotional face recognition ability among three mutually exclusive groups: (1) inpatient adolescents who attempted suicide (SA, n = 30); (2) inpatient adolescents engaged in NSSI (NSSI, n = 30); and (3) typically developing controls (TDC, n = 30) without psychiatric illness. Participants included adolescents aged 13-17 years, matched on age, gender and full-scale IQ. Emotional face recognition was evaluated using the diagnostic assessment of nonverbal accuracy (DANVA-2). Compared to TDC youth, adolescents with NSSI made more errors on child fearful and adult sad face recognition while controlling for psychopathology and medication status (ps face recognition between NSSI and SA groups. Secondary analyses showed that compared to inpatients without major depression, those with major depression made fewer errors on adult sad face recognition even when controlling for group status (p recognition errors on adult happy faces even when controlling for group status (p face recognition than TDC, but not inpatient adolescents who attempted suicide. Further results suggest the importance of psychopathology in emotional face recognition. Replication of these preliminary results and examination of the role of context-dependent emotional processing are needed moving forward.

  17. Perceived stress in first year medical students - associations with personal resources and emotional distress.

    Science.gov (United States)

    Heinen, Ines; Bullinger, Monika; Kocalevent, Rüya-Daniela

    2017-01-06

    Medical students have been found to report high levels of perceived stress, yet there is a lack of theoretical frameworks examining possible reasons. This cross-sectional study examines correlates of perceived stress in medical students on the basis of a conceptual stress model originally developed for and applied to the general population. The aim was to identify via structural equation modeling the associations between perceived stress and emotional distress (anxiety and depression), taking into account the activation of personal resources (optimism, self-efficacy and resilient coping). Within this cross-sectional study, 321 first year medical students (age 22 ± 4 years, 39.3% men) completed the Perceived Stress Questionnaire (PSQ-20), the Self-Efficacy Optimism Scale (SWOP) and the Brief Resilient Coping Scale (BRCS) as well as the Patient Health Questionnaire (PHQ-4). The statistical analyses used t-tests, ANOVA, Spearman Rho correlation and multiple regression analysis as well as structural equation modeling. Medical students reported higher levels of perceived stress and higher levels of anxiety and depression than reference samples. No statistically significant differences in stress levels were found within the sample according to gender, migration background or employment status. Students reported more self-efficacy, optimism, and resilient coping and higher emotional distress compared to validation samples and results in other studies. Structural equation analysis revealed a satisfactory fit between empirical data and the proposed stress model indicating that personal resources modulated perceived stress, which in turn had an impact on emotional distress. Medical students' perceived stress and emotional distress levels are generally high, with personal resources acting as a buffer, thus supporting the population-based general stress model. Results suggest providing individual interventions for those students, who need support in dealing with the

  18. Different underlying mechanisms for face emotion and gender processing during feature-selective attention: Evidence from event-related potential studies.

    Science.gov (United States)

    Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei

    2017-05-01

    Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed in different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically, as reflected in the SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Detection of emotional faces: salient physical features guide effective visual search.

    Science.gov (United States)

    Calvo, Manuel G; Nummenmaa, Lauri

    2008-08-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features--especially the smiling mouth--is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2008 APA, all rights reserved).

  20. Categorical Perception of Emotional Facial Expressions in Preschoolers

    Science.gov (United States)

    Cheal, Jenna L.; Rutherford, M. D.

    2011-01-01

    Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and tested preschoolers' discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum "felt the…

  1. Vicarious Social Touch Biases Gazing at Faces and Facial Emotions.

    Science.gov (United States)

    Schirmer, Annett; Ng, Tabitha; Ebstein, Richard P

    2018-02-01

    Research has suggested that interpersonal touch promotes social processing and other-concern, and that women may respond to it more sensitively than men. In this study, we asked whether this phenomenon would extend to third-party observers who experience touch vicariously. In an eye-tracking experiment, participants (N = 64, 32 men and 32 women) viewed prime and target images with the intention of remembering them. Primes comprised line drawings of dyadic interactions with and without touch. Targets comprised two faces shown side-by-side, with one being neutral and the other being happy or sad. Analysis of prime fixations revealed that faces in touch interactions attracted longer gazing than faces in no-touch interactions. In addition, touch enhanced gazing at the area of touch in women but not men. Analysis of target fixations revealed that touch priming increased looking at both faces immediately after target onset, and subsequently, at the emotional face in the pair. Sex differences in target processing were nonsignificant. Together, the present results imply that vicarious touch biases visual attention to faces and promotes emotion sensitivity. In addition, they suggest that, compared with men, women are more aware of tactile exchanges in their environment. As such, vicarious touch appears to share important qualities with actual physical touch. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. Facial EMG responses to emotional expressions are related to emotion perception ability.

    Directory of Open Access Journals (Sweden)

    Janina Künecke

    Full Text Available Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories understanding emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses - recorded with electromyogram (EMG) - in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

  3. Perceived age change after aesthetic facial surgical procedures quantifying outcomes of aging face surgery.

    Science.gov (United States)

    Chauhan, Nitin; Warner, Jeremy P; Adamson, Peter A

    2012-01-01

    To quantify the degree of perceived age change after aesthetic facial surgical procedures to provide an objective measure of surgical success. Sixty patients undergoing various aging face surgical procedures were randomly chosen for analysis. Preoperative and postoperative photographs were evaluated. Raters were presented with photographs in a random assortment and were asked to estimate the age of the patient. Perceived age difference was defined as the difference between the chronological age and the estimated age, and the change in this value after surgery was the chief outcome of interest. Statistical models were designed to account for any effects of interrater differences, preoperative chronological age, rater group, photograph order, or surgical procedure performed. Our patient population was divided into the following 3 groups based on the surgical procedure performed: group 1 (face- and neck-lift [22 patients]), group 2 (face- and neck-lift and upper and lower blepharoplasty [17 patients]), and group 3 (face- and neck-lift, upper and lower blepharoplasty, and forehead-lift [21 patients]). Adjusted means demonstrated that patient ages were estimated to be 1.7 years younger than their chronological age before surgery and 8.9 years younger than their chronological age after surgery. The effect was less substantial for group 1 patients and was most dramatic for group 3 patients, who had undergone all 3 aging face surgical procedures. Our study is novel in that it quantifies the degree of perceived age change after aging face surgical procedures and demonstrates a significant and consistent reduction in perceived age after aesthetic facial surgery. This effect is more substantial when the number of surgical procedures is increased, an effect unrelated to the preoperative age of a patient and unaffected by other variables that we investigated. The ability to perceive age correctly is accurate and consistent.
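
    The outcome measure described here is straightforward arithmetic: perceived age difference is chronological age minus the raters' estimated age, and the change in that difference from pre- to post-operative photographs quantifies the effect of surgery. A minimal sketch with invented numbers:

```python
# Minimal sketch of the perceived-age-difference outcome; all ages are invented.
import numpy as np

chronological = np.array([62, 58, 66])      # hypothetical patients
estimated_pre = np.array([60, 57, 64])      # raters' pre-operative estimates
estimated_post = np.array([52, 49, 56])     # raters' post-operative estimates

pad_pre = chronological - estimated_pre     # perceived age difference, pre-op
pad_post = chronological - estimated_post   # perceived age difference, post-op
print("mean pre:", pad_pre.mean(), "mean post:", pad_post.mean(),
      "change:", (pad_post - pad_pre).mean())
```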

  4. Emotion and Interhemispheric Interactions in Binocular Rivalry

    Directory of Open Access Journals (Sweden)

    K L Ritchie

    2013-10-01

    Full Text Available Previous research has shown that fear-related stimuli presented in peripheral vision are preferentially processed over stimuli depicting other emotions. Furthermore, emotional content can influence dominance duration in binocular rivalry, with the period of dominance for an emotional image (e.g. a fearful face) being significantly longer than that for a neutral image (e.g. a neutral face or a house). Experiment 1 of the current study combined these two ideas to investigate the role of emotion in binocular rivalry with face/house pairs viewed in the periphery. The results showed that faces were perceived as more dominant than houses, and fearful faces more so than neutral faces, even when viewed in the periphery. Experiment 2 extended this paradigm to present a rival pair in the periphery in each hemifield, with each eye either viewing the same stimulus in each location (traditional condition), or a different stimulus in each location (Diaz-Caneja condition). The results showed that the two pairs tended to rival in synchrony only in the traditional condition. Taken together, the results show that face dominance and emotion dominance in binocular rivalry persist in the periphery, and that interhemispheric interactions in binocular rivalry depend on an eye- as opposed to an object-based mechanism.

  5. Perceiving emotion in non-social targets: The effect of trait empathy on emotional contagion through art.

    Science.gov (United States)

    Stavrova, Olga; Meckel, Andrea

    2017-01-01

    This research examines the role of trait empathy in emotional contagion through non-social targets-art objects. Studies 1a and 1b showed that high- (compared to low-) empathy individuals are more likely to infer an artist's emotions based on the emotional valence of the artwork and, as a result, are more likely to experience the respective emotions themselves. Studies 2a and 2b experimentally manipulated artists' emotions via revealing details about their personal life. Study 3 experimentally induced positive vs. negative emotions in individuals who then wrote literary texts. These texts were shown to another sample of participants. High- (compared to low-) empathy participants were more likely to accurately identify and take on the emotions ostensibly (Studies 2a and 2b) or actually (Study 3) experienced by the "artists". High-empathy individuals' enhanced sensitivity to others' emotions is not restricted to social targets, such as faces, but extends to products of the human mind, such as objects of art.

  6. Collective efficacy in sports and physical activities: perceived emotional synchrony and shared flow

    Directory of Open Access Journals (Sweden)

    Larraitz Nerea Zumeta

    2016-01-01

    Full Text Available This cross-sectional study analyzes the relationship between collective efficacy and two psychosocial processes involved in collective sport-physical activities. It argues that in-group identification and fusion with the group will affect collective efficacy (CE). A sample of 276 university students answered different scales regarding their participation in collective physical and sport activities. Multiple-mediation analyses showed that shared flow and perceived emotional synchrony mediate the relationship between in-group identification and CE, whereas the relationship between identity fusion and CE was only mediated by perceived emotional synchrony. Results suggest that both psychosocial processes explain the positive effects of in-group identification and identity fusion with the group on collective efficacy. In particular, the role of perceived emotional synchrony in explaining the positive effects of participation in collective sport-physical activities is underlined. In sum, this study highlights the utility of collective actions and social identities to explain the psychosocial processes related to collective efficacy in physical and sports activities. Finally, practical implications are discussed.

  7. Designing Emotionally Expressive Robots

    DEFF Research Database (Denmark)

    Tsiourti, Christiana; Weiss, Astrid; Wac, Katarzyna

    2017-01-01

    Socially assistive agents, be it virtual avatars or robots, need to engage in social interactions with humans and express their internal emotional states, goals, and desires. In this work, we conducted a comparative study to investigate how humans perceive emotional cues expressed by humanoid...... robots through five communication modalities (face, head, body, voice, locomotion) and examined whether the degree of a robot's human-like embodiment affects this perception. In an online survey, we asked people to identify emotions communicated by Pepper -a highly human-like robot and Hobbit – a robot...... for robots....

  8. Acute pharmacologically induced shifts in serotonin availability abolish emotion-selective responses to negative face emotions in distinct brain networks

    DEFF Research Database (Denmark)

    Grady, Cheryl Lynn; Siebner, Hartwig R; Hornboll, Bettina

    2013-01-01

    Pharmacological manipulation of serotonin availability can alter the processing of facial expressions of emotion. Using a within-subject design, we measured the effect of serotonin on the brain's response to aversive face emotions with functional MRI while 20 participants judged the gender...... of neutral, fearful and angry faces. In three separate and counterbalanced sessions, participants received citalopram (CIT) to raise serotonin levels, underwent acute tryptophan depletion (ATD) to lower serotonin, or were studied without pharmacological challenge (Control). An analysis designed to identify...

  9. Perceived emotional intelligence in nursing: psychometric properties of the Trait Meta-Mood Scale.

    Science.gov (United States)

    Aradilla-Herrero, Amor; Tomás-Sábado, Joaquín; Gómez-Benito, Juana

    2014-04-01

    To examine the psychometric properties of the Trait Meta-Mood Scale in the nursing context and to determine the relationships between emotional intelligence, self-esteem, alexithymia and death anxiety. The Trait Meta-Mood Scale is one of the most widely used self-report measures for assessing perceived emotional intelligence. However, in the nursing context, no extensive analysis has been conducted to examine its psychometric properties. Cross-sectional and observational study. A total of 1417 subjects participated in the study (1208 nursing students and 209 hospital nurses). The Trait Meta-Mood Scale, the Toronto Alexithymia Scale, the Rosenberg Self-Esteem Scale and the Death Anxiety Inventory were all applied to half of the sample (n = 707). A confirmatory factor analysis was carried out, and statistical analyses examined the internal consistency and test-retest reliability of the Trait Meta-Mood Scale, as well as its relationship with relevant variables. Confirmatory factor analysis confirmed the three dimensions of the original scale (Attention, Clarity and Repair). The instrument showed adequate internal consistency and temporal stability. Correlational results indicated that nurses with high scores on emotional Attention experience more death anxiety, report greater difficulties identifying feelings and have less self-esteem. By contrast, nurses with high levels of emotional Clarity and Repair showed less death anxiety and higher levels of self-esteem. The Trait Meta-Mood Scale is an effective, valid and reliable tool for measuring perceived emotional intelligence in the nursing context. Training programmes should seek to promote emotional abilities among nurses. Use of the Trait Meta-Mood Scale in the nursing context would provide information about nurses' perceived abilities to interpret and manage emotions when interacting with patients. © 2013 John Wiley & Sons Ltd.
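
    Internal consistency of the kind reported for the TMMS subscales is usually summarized with Cronbach's alpha. The sketch below computes alpha from a simulated participants-by-items matrix; the data, item count, and response scale are invented and merely stand in for a TMMS-style subscale.

```python
# Hedged sketch of an internal-consistency check (Cronbach's alpha) on
# simulated Likert responses; not the study's data or code.
import numpy as np

def cronbach_alpha(items):
    """items: participants x items matrix of Likert responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(5)
latent = rng.normal(0, 1, 707)  # one latent trait for 707 simulated respondents
items = np.clip(np.round(3 + latent[:, None] + rng.normal(0, 0.8, (707, 8))), 1, 5)
print(f"alpha = {cronbach_alpha(items):.2f}")
```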

  10. Interdependent mechanisms for processing gender and emotion:The special status of angry male faces

    Directory of Open Access Journals (Sweden)

    Daniel A Harris

    2016-07-01

    Full Text Available While some models of how various attributes of a face are processed have posited that face features, invariant physical cues such as gender or ethnicity as well as variant social cues such as emotion, may be processed independently (e.g., Bruce & Young, 1986), other models suggest a more distributed representation and interdependent processing (e.g., Haxby, Hoffman, & Gobbini, 2000). Here we use a contingent adaptation paradigm to investigate if mechanisms for processing the gender and emotion of a face are interdependent and symmetric across the happy-angry emotional continuum and regardless of the gender of the face. We simultaneously adapted participants to angry female faces and happy male faces (Experiment 1) or to happy female faces and angry male faces (Experiment 2). In Experiment 1 we found evidence for contingent adaptation, with simultaneous aftereffects in opposite directions: male faces were biased towards angry while female faces were biased towards happy. Interestingly, in the complementary Experiment 2 we did not find evidence for contingent adaptation, with both male and female faces biased towards angry. Our results highlight that evidence for contingent adaptation and the underlying interdependent face processing mechanisms that would allow for contingent adaptation may only be evident for certain combinations of face features. Such limits may be especially important in the case of social cues given how maladaptive it may be to stop responding to threatening information, with male angry faces considered to be the most threatening. The underlying neuronal mechanisms that could account for such asymmetric effects in contingent adaptation remain to be elucidated.

  11. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    Science.gov (United States)

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  12. The Perceived Helpfulness of Rendering Emotional First Aid via Email

    Science.gov (United States)

    Gilat, Itzhak; Reshef, Eyal

    2015-01-01

    The present research examined the perceived helpfulness of an increasingly widespread mode of psychological assistance, namely, emotional first aid via email. The sample comprised 62 naturally occurring email interactions between distressful clients and trained volunteers operating within the framework of the Israeli Association for Emotional…

  13. Word wins over Face: Emotional Stroop effect activates the frontal cortical network

    Directory of Open Access Journals (Sweden)

    Shima Ovaysikia

    2011-01-01

    Full Text Available The prefrontal cortex (PFC) has been implicated in higher order cognitive control of behaviour. Sometimes such control is executed through suppression of an unwanted response in order to avoid conflict. Conflict occurs when two simultaneously competing processes lead to different behavioral outcomes, as seen in tasks such as the anti-saccade, go/no-go and the Stroop task. We set out to examine whether different types of stimuli in a modified emotional Stroop task would cause similar interference effects as the original Stroop-colour/word, and whether the required suppression mechanism(s) would recruit similar regions of the medial PFC (mPFC). By using emotional words and emotional faces in this Stroop experiment, we examined the two well-learned automatic behaviours of word reading and recognition of face expressions. In our emotional Stroop paradigm, words were processed faster than face expressions, with incongruent trials yielding longer reaction times (RTs) and a larger number of errors compared to the congruent trials. This novel Stroop effect activated the anterior and inferior regions of the mPFC, namely the anterior cingulate cortex (ACC) and inferior frontal gyrus (IFG), as well as the superior frontal gyrus. Our results suggest that prepotent behaviours such as reading and recognition of face expressions are stimulus-dependent and perhaps hierarchical, hence recruiting distinct regions of the mPFC. Moreover, the faster processing of word reading compared to reporting face expressions is indicative of the formation of stronger stimulus-response (SR) associations of an over-learned behaviour compared to an instinctive one, which could alternatively be explained through the distinction between awareness and selective attention.

  14. A facial expression of pax: Assessing children's "recognition" of emotion from faces.

    Science.gov (United States)

    Nelson, Nicole L; Russell, James A

    2016-01-01

    In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks--choice from array and free labeling--to support the conclusion that children recognize basic emotions from facial expressions. Here five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition; a novel nonsense facial expression was included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination; they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. The perception of emotion in body expressions.

    Science.gov (United States)

    de Gelder, B; de Borst, A W; Watson, R

    2015-01-01

    During communication, we perceive and express emotional information through many different channels, including facial expressions, prosody, body motion, and posture. Although historically the human body has been perceived primarily as a tool for actions, there is now increased understanding that the body is also an important medium for emotional expression. Indeed, research on emotional body language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-body signals are processed and understood, at the behavioral and neural levels, with specific reference to their role in emotional communication. The first part of this review outlines brain regions and spectrotemporal dynamics underlying perception of isolated neutral and affective bodies, the second part details the contextual effects on body emotion recognition, and the final part discusses body processing on a subconscious level. More specifically, research has shown that body expressions as compared with neutral bodies draw upon a larger network of regions responsible for action observation and preparation, emotion processing, body processing, and integrative processes. Results from neurotypical populations and masking paradigms suggest that subconscious processing of affective bodies relies on a specific subset of these regions. Moreover, recent evidence has shown that emotional information from the face, voice, and body all interact, with body motion and posture often highlighting and intensifying the emotion expressed in the face and voice. © 2014 John Wiley & Sons, Ltd.

  16. Perceived emotional intelligence as a moderator variable between cybervictimization and its emotional impact.

    Science.gov (United States)

    Elipe, Paz; Mora-Merchán, Joaquín A; Ortega-Ruiz, Rosario; Casas, José A

    2015-01-01

    The negative effects of traditional bullying and, recently, cyberbullying on victims are well-documented, and abundant empirical evidence for it exists. Cybervictimization affects areas such as academic performance, social integration and self-esteem, and causes emotions ranging from anger and sadness to more complex problems such as depression. However, not all victims are equally affected, and the differences seem to be due to certain situational and personal characteristics. The objective of this study is to analyze the relationship between perceived emotional intelligence (PEI) and the emotional impact of cybervictimization. We hypothesize that EI, which has previously been found to play a role in traditional bullying and cyberbullying, may also affect the emotional impact of cyberbullying. The participants in our study were 636 university students from two universities in the south of Spain. Three self-report questionnaires were used: the "European Cyberbullying Intervention Project Questionnaire," the "Cyberbullying Emotional Impact Scale"; and "Trait Meta-Mood Scale-24." Structural Equation Models were used to test the relationships between the analyzed variables. The results support the idea that PEI, by way of a moderator effect, affects the relationship between cybervictimization and emotional impact. Taken together, cybervictimization and PEI explain much of the variance observed in the emotional impact in general and in the negative dimensions of that impact in particular. Attention and Repair were found to be inversely related to Annoyance and Dejection, and positively related to Invigoration. Clarity has the opposite pattern; a positive relationship with Annoyance and Dejection and an inverse relationship with Invigoration. Various hypothetical explanations of these patterns are discussed.

  17. Psilocybin with psychological support improves emotional face recognition in treatment-resistant depression.

    Science.gov (United States)

    Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L

    2018-02-01

    Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study is to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions than controls; after psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.
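
    The within-group effect sizes quoted (d = .876 and d = .302) describe the baseline-to-follow-up change. One common way to compute such a paired effect size is the mean of the paired differences divided by the standard deviation of those differences; the sketch below illustrates that on invented reaction times, and the paper's exact formula may differ.

```python
# Sketch (invented data) of a paired effect size for a baseline-to-follow-up
# change in recognition speed: mean paired difference / SD of the differences.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
baseline = rng.normal(1250, 150, 17)            # hypothetical RTs (ms), patients
follow_up = baseline - rng.normal(110, 90, 17)  # faster on average after treatment

diff = baseline - follow_up
d = diff.mean() / diff.std(ddof=1)              # paired (within-subject) Cohen's d
t, p = stats.ttest_rel(baseline, follow_up)
print(f"d = {d:.2f}, t = {t:.2f}, p = {p:.3f}")
```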

  18. Pretreatment Differences in BOLD Response to Emotional Faces Correlate with Antidepressant Response to Scopolamine.

    Science.gov (United States)

    Furey, Maura L; Drevets, Wayne C; Szczepanik, Joanna; Khanna, Ashish; Nugent, Allison; Zarate, Carlos A

    2015-03-28

    Faster acting antidepressants and biomarkers that predict treatment response are needed to facilitate the development of more effective treatments for patients with major depressive disorders. Here, we evaluate implicitly and explicitly processed emotional faces using neuroimaging to identify potential biomarkers of treatment response to the antimuscarinic, scopolamine. Healthy participants (n=15) and unmedicated-depressed major depressive disorder patients (n=16) participated in a double-blind, placebo-controlled crossover infusion study using scopolamine (4 μg/kg). Before and following scopolamine, blood oxygen-level dependent signal was measured using functional MRI during a selective attention task. Two stimuli comprised of superimposed pictures of faces and houses were presented. Participants attended to one stimulus component and performed a matching task. Face emotion was modulated (happy/sad) creating implicit (attend-houses) and explicit (attend-faces) emotion processing conditions. The pretreatment difference in blood oxygen-level dependent response to happy and sad faces under implicit and explicit conditions (emotion processing biases) within a-priori regions of interest was correlated with subsequent treatment response in major depressive disorder. Correlations were observed exclusively during implicit emotion processing in the regions of interest, which included the subgenual anterior cingulate (Pemotional faces prior to treatment reflect the potential to respond to scopolamine. These findings replicate earlier results, highlighting the potential for pretreatment neural activity in the middle occipital cortices and subgenual anterior cingulate to inform us about the potential to respond clinically to scopolamine. Published by Oxford University Press on behalf of CINP 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  19. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    Science.gov (United States)

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus because the attentional strategies associated with promotion focus enhance performance on well-learned or innate tasks - such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced and better facial emotion recognition was observed in a promotion focus compared to a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy in a promotion focus (i.e., striving to make hits). A prevention focus had an impact neither on perceptual processing nor on facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  20. Emotional face recognition deficit in amnestic patients with mild cognitive impairment: behavioral and electrophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yang L

    2015-08-01

    Full Text Available Linlin Yang, Xiaochuan Zhao, Lan Wang, Lulu Yu, Mei Song, Xueyi Wang Department of Mental Health, The First Hospital of Hebei Medical University, Hebei Medical University Institute of Mental Health, Shijiazhuang, People’s Republic of China Abstract: Amnestic mild cognitive impairment (MCI) has been conceptualized as a transitional stage between healthy aging and Alzheimer’s disease. Thus, understanding the emotional face recognition deficit in patients with amnestic MCI could be useful in determining progression of amnestic MCI. The purpose of this study was to investigate the features of emotional face processing in amnestic MCI by using event-related potentials (ERPs). Patients with amnestic MCI and healthy controls performed a face recognition task, giving old/new responses to previously studied and novel faces with different emotional messages as the stimulus material. Using the learning-recognition paradigm, the experiments were divided into two steps, i.e., a learning phase and a test phase. ERPs were analyzed on electroencephalographic recordings. The behavior data indicated high emotion classification accuracy for patients with amnestic MCI and for healthy controls. The mean percentage of correct classifications was 81.19% for patients with amnestic MCI and 96.46% for controls. Our ERP data suggest that patients with amnestic MCI were still able to undertake personalizing processing for negative faces, but not for neutral or positive faces, in the early frontal processing stage. In the early time window, no differences in frontal old/new effect were found between patients with amnestic MCI and normal controls. However, in the late time window, the three types of stimuli did not elicit any old/new parietal effects in patients with amnestic MCI, suggesting their recollection was impaired. This impairment may be closely associated with amnestic MCI disease. We conclude from our data that face recognition processing and emotional memory is

  1. When Do Personality and Emotion Predict Destructive Behavior During Relationship Conflict? The Role of Perceived Commitment Asymmetry.

    Science.gov (United States)

    Lemay, Edward P; Dobush, Sarah

    2015-10-01

    The current research examined whether perceived asymmetries in relationship commitment moderate the associations of personality traits and emotional states with enactment of hostile behavior during relationship conflicts. Participants included both members of 53 heterosexual romantic couples (mean age = 25.5 years). Participants completed questionnaire measures assessing personality traits, emotional states, relationship commitment, and perceptions of their partner's commitment. Participants then had an observed conflict discussion with their partner, which was rated by a panel of objective observers for hostile behavior. When participants perceived that they were less committed than their partners, their enactment of hostile behavior was predicted by traits and states that are associated with antisocial and pro-social orientations (i.e., agreeableness, trait anger, chronic jealousy, and state negative emotion). In contrast, participants who perceived that they were more committed than their partners tended to refrain from hostile behavior, despite traits or states that may suggest hostile inclinations. These results suggest that perceiving that one is less committed than one's partner promotes behavioral expression of interpersonal dispositions and emotions, whereas perceiving that one is more committed than one's partner motivates inhibition of hostile behavior. © 2014 Wiley Periodicals, Inc.

  2. Decoding facial blends of emotion: visual field, attentional and hemispheric biases.

    Science.gov (United States)

    Ross, Elliott D; Shayya, Luay; Champlain, Amanda; Monnot, Marilee; Prodan, Calin I

    2013-12-01

    Most clinical research assumes that modulation of facial expressions is lateralized predominantly across the right-left hemiface. However, social psychological research suggests that facial expressions are organized predominantly across the upper-lower face. Because humans learn to cognitively control facial expression for social purposes, the lower face may display a false emotion, typically a smile, to enable approach behavior. In contrast, the upper face may leak a person's true feeling state by producing a brief facial blend of emotion, i.e. a different emotion on the upper versus lower face. Previous studies from our laboratory have shown that upper facial emotions are processed preferentially by the right hemisphere under conditions of directed attention if facial blends of emotion are presented tachistoscopically to the mid left and right visual fields. This paper explores how facial blends are processed within the four visual quadrants. The results, combined with our previous research, demonstrate that lower more so than upper facial emotions are perceived best when presented to the viewer's left and right visual fields just above the horizontal axis. Upper facial emotions are perceived best when presented to the viewer's left visual field just above the horizontal axis under conditions of directed attention. Thus, by gazing at a person's left ear, which also avoids the social stigma of eye-to-eye contact, one's ability to decode facial expressions should be enhanced. Published by Elsevier Inc.

  3. Consciousness and arousal effects on emotional face processing as revealed by brain oscillations. A gamma band analysis.

    Science.gov (United States)

    Balconi, Michela; Lucchiari, Claudio

    2008-01-01

    It remains an open question whether it is possible to assign a single brain operation or psychological function for facial emotion decoding to a certain type of oscillatory activity. Gamma band activity (GBA) offers an adequate tool for studying cortical activation patterns during emotional face information processing. In the present study brain oscillations were analyzed in response to facial expressions of emotion. Specifically, GBA modulation was measured when twenty subjects looked at emotional (angry, fearful, happy, and sad) or neutral faces in two different conditions: supraliminal (150 ms) vs subliminal (10 ms) stimulation (100 target-mask pairs for each condition). The results showed that both consciousness and significance of the stimulus in terms of arousal can modulate the power synchronization (ERD decrease) in the 150-350 ms time range: an early oscillatory event showed its peak at about 200 ms post-stimulus. GBA was enhanced by supraliminal more than subliminal elaboration, as well as more by high arousal (anger and fear) than low arousal (happiness and sadness) emotions. Finally, a left-posterior dominance for conscious elaboration was found, whereas the right hemisphere was discriminant in the processing of emotional compared with neutral faces.
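
    Gamma band power of the kind analyzed here is often estimated by band-pass filtering each epoch in the gamma range and squaring the Hilbert envelope. The SciPy sketch below is a generic illustration under assumed recording parameters (1000 Hz sampling, epochs from -200 to 500 ms); it is not the authors' pipeline.

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def gamma_band_power(epoch, sfreq, low=30.0, high=80.0):
          """Band-pass the signal in the gamma range and return instantaneous power."""
          nyq = sfreq / 2.0
          b, a = butter(4, [low / nyq, high / nyq], btype="band")
          filtered = filtfilt(b, a, epoch, axis=-1)
          return np.abs(hilbert(filtered, axis=-1)) ** 2

      # Hypothetical single trial: 64 channels x 700 samples (-200 to 500 ms at 1000 Hz).
      rng = np.random.default_rng(0)
      epoch = rng.standard_normal((64, 700))
      power = gamma_band_power(epoch, sfreq=1000.0)

      # Mean gamma power in the 150-350 ms post-stimulus window discussed above
      # (stimulus onset falls at sample 200).
      print(power[:, 350:550].mean())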

  4. Effect of positive emotion on consolidation of memory for faces: the modulation of facial valence and facial gender.

    Science.gov (United States)

    Wang, Bo

    2013-01-01

    Studies have shown that emotion elicited after learning enhances memory consolidation. However, no prior studies have used facial photos as stimuli. This study examined the effect of post-learning positive emotion on consolidation of memory for faces. During the learning phase, participants viewed neutral, positive, or negative faces. They were then assigned to a condition in which they watched either a 9-minute positive video clip or a 9-minute neutral video. Thirty minutes after learning, participants took a surprise memory test, in which they made "remember", "know", and "new" judgements. The findings are: (1) Positive emotion enhanced consolidation of recognition for negative male faces, but impaired consolidation of recognition for negative female faces; (2) For male faces, recognition for negative faces was equivalent to that for positive faces; for female faces, recognition for negative faces was better than that for positive faces. Our study provides important evidence that the effect of post-learning emotion on memory consolidation can extend to facial stimuli and that such an effect can be modulated by facial valence and facial gender. The findings may shed light on establishing models concerning the influence of emotion on memory consolidation.

  5. Is emotion recognition the only problem in ADHD? effects of pharmacotherapy on face and emotion recognition in children with ADHD.

    Science.gov (United States)

    Demirci, Esra; Erdogan, Ayten

    2016-12-01

    The objectives of this study were to evaluate both face and emotion recognition, to detect differences among attention deficit and hyperactivity disorder (ADHD) subgroups, to identify effects of gender, and to assess the effects of methylphenidate and atomoxetine treatment on both face and emotion recognition in patients with ADHD. The study sample consisted of 41 male and 29 female patients, 8-15 years of age, who were diagnosed as having combined type ADHD (N = 26), hyperactive/impulsive type ADHD (N = 21) or inattentive type ADHD (N = 23) but had not previously used any medication for ADHD, and 35 male and 25 female healthy individuals. Long-acting methylphenidate (OROS-MPH) was prescribed to 38 patients, whereas atomoxetine was prescribed to 32 patients. The Reading the Mind in the Eyes Test (RMET) and the Benton Face Recognition Test (BFRT) were administered to all participants before and after treatment. The patients with ADHD had a significantly lower number of correct answers on the child and adolescent RMET and on the BFRT than the healthy controls. Among the ADHD subtypes, the hyperactive/impulsive subtype had a lower number of correct answers on the RMET than the inattentive subtype, and the hyperactive/impulsive subtype had a lower number of correct answers on the short and long forms of the BFRT than the combined and inattentive subtypes. Male and female patients with ADHD did not differ significantly with respect to the number of correct answers on the RMET and BFRT. The patients showed significant improvement in RMET and BFRT after treatment with OROS-MPH or atomoxetine. Patients with ADHD have difficulties in face recognition as well as emotion recognition. Both OROS-MPH and atomoxetine affect emotion recognition. However, further studies on face and emotion recognition in ADHD are needed.

  6. Towards the neurobiology of emotional body language.

    Science.gov (United States)

    de Gelder, Beatrice

    2006-03-01

    People's faces show fear in many different circumstances. However, when people are terrified, as well as showing emotion, they run for cover. When we see a bodily expression of emotion, we immediately know what specific action is associated with a particular emotion, leaving little need for interpretation of the signal, as is the case for facial expressions. Research on emotional body language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-body signals are automatically perceived and understood, and their role in emotional communication and decision-making.

  7. Facing emotions in narcolepsy with cataplexy: haemodynamic and behavioural responses during emotional stimulation.

    Science.gov (United States)

    de Zambotti, Massimiliano; Pizza, Fabio; Covassin, Naima; Vandi, Stefano; Cellini, Nicola; Stegagno, Luciano; Plazzi, Giuseppe

    2014-08-01

    Narcolepsy with cataplexy is a complex sleep disorder that affects the modulation of emotions: cataplexy, the key symptom of narcolepsy, is indeed strongly linked with emotions that usually trigger the episodes. Our study aimed to investigate haemodynamic and behavioural responses during emotional stimulation in narco-cataplexy. Twelve adult drug-naive narcoleptic patients (five males; age: 33.3 ± 9.4 years) and 12 healthy controls (five males; age: 30.9 ± 9.5 years) were exposed to emotional stimuli (pleasant, unpleasant and neutral pictures). Heart rate, arterial blood pressure and mean cerebral blood flow velocity of the middle cerebral arteries were continuously recorded using photoplethysmography and Doppler ultrasound. Ratings of valence and arousal and coping strategies were scored by the Self-Assessment Manikin and by questionnaires, respectively. Narcoleptic patients' haemodynamic responses to pictures overlapped with the data obtained from controls: decrease of heart rate and increase of mean cerebral blood flow velocity regardless of pictures' content, increase of systolic blood pressure during the pleasant condition, and relative reduction of heart rate during pleasant and unpleasant conditions. However, when compared with controls, narcoleptic patients reported lower arousal scores during the pleasant and neutral stimulation, and lower valence scores during the pleasant condition, respectively, and also a lower score at the 'focus on and venting of emotions' dimensions of coping. Our results suggested that adult narcoleptic patients, compared with healthy controls, inhibited their emotion-expressive behaviour to emotional stimulation, and that may be related to the development of adaptive cognitive strategies to face emotions avoiding cataplexy. © 2014 European Sleep Research Society.

  8. Emotional face processing and flat affect in schizophrenia: functional and structural neural correlates.

    Science.gov (United States)

    Lepage, M; Sergerie, K; Benoit, A; Czechowska, Y; Dickie, E; Armony, J L

    2011-09-01

    There is a general consensus in the literature that schizophrenia causes difficulties with facial emotion perception and discrimination. Functional brain imaging studies have observed reduced limbic activity during facial emotion perception but few studies have examined the relation to flat affect severity. A total of 26 people with schizophrenia and 26 healthy controls took part in this event-related functional magnetic resonance imaging study. Sad, happy and neutral faces were presented in a pseudo-random order and participants indicated the gender of the face presented. Manual segmentation of the amygdala was performed on a structural T1 image. Both the schizophrenia group and the healthy control group rated the emotional valence of facial expressions similarly. Both groups exhibited increased brain activity during the perception of emotional faces relative to neutral ones in multiple brain regions, including multiple prefrontal regions bilaterally, the right amygdala, right cingulate cortex and cuneus. Group comparisons, however, revealed increased activity in the healthy group in the anterior cingulate, right parahippocampal gyrus and multiple visual areas. In schizophrenia, the severity of flat affect correlated significantly with neural activity in several brain areas including the amygdala and parahippocampal region bilaterally. These results suggest that many of the brain regions involved in emotional face perception, including the amygdala, are equally recruited in both schizophrenia and controls, but flat affect can also moderate activity in some other brain regions, notably in the left amygdala and parahippocampal gyrus bilaterally. There were no significant group differences in the volume of the amygdala.

  9. The adaptive value associated with expressing and perceiving angry-male and happy-female faces

    Directory of Open Access Journals (Sweden)

    Peter Kay Chai Tay

    2015-06-01

    Full Text Available Facial expressions are valuable for conveying and understanding the inner thoughts and feelings of the expressor. However, the adaptive value associated with a specific expression on a male face is different from a female face. The present review uses a functional-evolutionary analysis to elucidate the evolutionary advantage in the expression and perception of angry-male and happy-female faces over angry-female and happy-male faces. For the expressors, it is more advantageous for men to show angry facial expression as it signals dominance, averts aggression and deters mate poaching; it is more advantageous for women to display happy facial expression as it signals their willingness for childcare, tending and befriending. For the perceivers, those sensitive to angry men avoid being physically harmed while those sensitive to happy women gain social support. Extant evidence suggests that facial structure and cognitive mechanisms evolved to express and perceive angry-male and happy-female faces more efficiently compared to angry-female and happy-male faces.

  10. A common neural code for perceived and inferred emotion.

    Science.gov (United States)

    Skerry, Amy E; Saxe, Rebecca

    2014-11-26

    Although the emotions of other people can often be perceived from overt reactions (e.g., facial or vocal expressions), they can also be inferred from situational information in the absence of observable expressions. How does the human brain make use of these diverse forms of evidence to generate a common representation of a target's emotional state? In the present research, we identify neural patterns that correspond to emotions inferred from contextual information and find that these patterns generalize across different cues from which an emotion can be attributed. Specifically, we use functional neuroimaging to measure neural responses to dynamic facial expressions with positive and negative valence and to short animations in which the valence of a character's emotion could be identified only from the situation. Using multivoxel pattern analysis, we test for regions that contain information about the target's emotional state, identifying representations specific to a single stimulus type and representations that generalize across stimulus types. In regions of medial prefrontal cortex (MPFC), a classifier trained to discriminate emotional valence for one stimulus (e.g., animated situations) could successfully discriminate valence for the remaining stimulus (e.g., facial expressions), indicating a representation of valence that abstracts away from perceptual features and generalizes across different forms of evidence. Moreover, in a subregion of MPFC, this neural representation generalized to trials involving subjectively experienced emotional events, suggesting partial overlap in neural responses to attributed and experienced emotions. These data provide a step toward understanding how the brain transforms stimulus-bound inputs into abstract representations of emotion. Copyright © 2014 the authors 0270-6474/14/3315997-12$15.00/0.
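
    The cross-stimulus decoding logic described here (train a classifier on patterns from one cue type, test it on the other) can be sketched with scikit-learn. The arrays below are random placeholders standing in for voxel patterns from an MPFC region of interest; nothing here reproduces the study's actual data or analysis code.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      n_trials, n_voxels = 40, 200                              # hypothetical ROI size
      X_situations = rng.standard_normal((n_trials, n_voxels))  # animated-situation trials
      y_situations = rng.integers(0, 2, n_trials)               # 0 = negative, 1 = positive
      X_faces = rng.standard_normal((n_trials, n_voxels))       # facial-expression trials
      y_faces = rng.integers(0, 2, n_trials)

      # Train on one form of evidence, test on the other: accuracy reliably above
      # chance would indicate a valence code that generalizes across cue types.
      clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
      clf.fit(X_situations, y_situations)
      print("cross-decoding accuracy:", clf.score(X_faces, y_faces))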

  11. Infants' Temperament and Mothers', and Fathers' Depression Predict Infants' Attention to Objects Paired with Emotional Faces.

    Science.gov (United States)

    Aktar, Evin; Mandell, Dorothy J; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E J; Bögels, Susan M

    2016-07-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others' emotional reactions to those stimuli, so called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction effects on infants' attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations of unfamiliar objects alone, before-and-after being paired with emotional (happy, sad, fearful vs. neutral) faces gazing towards (vs. away) from objects. Additionally, the associations of infants' temperament, and parents' negative affect/depression/anxiety with infants' pupil responses were explored. Both mothers and fathers of participating infants completed questionnaires about their negative affect, depression and anxiety symptoms and their infants' negative temperament. Infants allocated more attention (larger pupils) to negative vs. neutral faces when the faces were presented alone, while they allocated less attention to objects paired with emotional vs. neutral faces independent of head/gaze direction. Sad (but not fearful) temperament predicted more attention to emotional faces. Infants' sad temperament moderated the associations of mothers' depression (but not anxiety) with infants' attention to objects. Maternal depression predicted more attention to objects paired with emotional expressions in infants low in sad temperament, while it predicted less attention in infants high in sad temperament. Fathers' depression (but not anxiety) predicted more attention to objects paired with emotional expressions independent of infants' temperament. We conclude that infants' own temperamental dispositions for sadness, and their exposure to mothers' and fathers' depressed moods may influence infants' attention to emotion-object associations in social learning contexts.

  12. Selecting fillers on emotional appearance improves lineup identification accuracy.

    Science.gov (United States)

    Flowe, Heather D; Klatt, Thimna; Colloff, Melissa F

    2014-12-01

    Mock witnesses sometimes report using criminal stereotypes to identify a face from a lineup, a tendency known as criminal face bias. Faces are perceived as criminal-looking if they appear angry. We tested whether matching the emotional appearance of the fillers to an angry suspect can reduce criminal face bias. In Study 1, mock witnesses (n = 226) viewed lineups in which the suspect had an angry, happy, or neutral expression, and we varied whether the fillers matched the expression. An additional group of participants (n = 59) rated the faces on criminal and emotional appearance. As predicted, mock witnesses tended to identify suspects who appeared angrier and more criminal-looking than the fillers. This tendency was reduced when the lineup fillers matched the emotional appearance of the suspect. Study 2 extended the results, testing whether the emotional appearance of the suspect and fillers affects recognition memory. Participants (n = 1,983) studied faces and took a lineup test in which the emotional appearance of the target and fillers was varied between subjects. Discrimination accuracy was enhanced when the fillers matched an angry target's emotional appearance. We conclude that lineup member emotional appearance plays a critical role in the psychology of lineup identification. The fillers should match an angry suspect's emotional appearance to improve lineup identification accuracy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
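
    Discrimination accuracy in lineup experiments of this kind is commonly summarized with a signal detection index such as d', computed from the rate of guilty-suspect identifications in target-present lineups and innocent-suspect identifications in target-absent lineups. A minimal sketch with made-up rates (not the study's data):

      from statistics import NormalDist

      def d_prime(hit_rate, false_alarm_rate):
          """Signal detection discrimination index from hit and false alarm rates."""
          z = NormalDist().inv_cdf
          return z(hit_rate) - z(false_alarm_rate)

      # Hypothetical identification rates for lineups whose fillers do or do not
      # match the angry target's emotional appearance.
      print("matched fillers:   ", d_prime(0.62, 0.18))
      print("mismatched fillers:", d_prime(0.55, 0.27))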

  13. Child's recognition of emotions in robot's face and body

    NARCIS (Netherlands)

    Cohen, I.; Looije, R.; Neerincx, M.A.

    2011-01-01

    Social robots can comfort and support children who have to cope with chronic diseases. In previous studies, a "facial robot", the iCat, proved to show well-recognized emotional expressions that are important in social interactions. The question is if a mobile robot without a face, the Nao, can

  14. Impaired Integration of Emotional Faces and Affective Body Context in a Rare Case of Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Bentin, Shlomo

    2011-01-01

    In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG‘s agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally-congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally-incongruent bodies. The results are interpreted in the context of a general integration deficit in developmental visual agnosia, suggesting that impaired integration may extend from the level of the face to the level of the full person. PMID:21482423

  15. The unique contributions of perceiver and target characteristics in person perception.

    Science.gov (United States)

    Hehman, Eric; Sutherland, Clare A M; Flake, Jessica K; Slepian, Michael L

    2017-10-01

    Models of person perception have long asserted that our impressions of others are guided by characteristics of both the target and perceiver. However, research has not yet quantified to what extent perceivers and targets contribute to different impressions. This quantification is theoretically critical, as it addresses how much an impression arises from "our minds" versus "others' faces." Here, we apply cross-classified random effects models to address this fundamental question in social cognition, using approximately 700,000 ratings of faces. With this approach, we demonstrate that (a) different trait impressions have unique causal processes, meaning that some impressions are largely informed by perceiver-level characteristics whereas others are driven more by physical target-level characteristics; (b) modeling of perceiver- and target-variance in impressions informs fundamental models of social perception; (c) Perceiver × Target interactions explain a substantial portion of variance in impressions; (d) greater emotional intensity in stimuli decreases the influence of the perceiver; and (e) more variable, naturalistic stimuli increases variation across perceivers. Important overarching patterns emerged. Broadly, traits and dimensions representing inferences of character (e.g., dominance) are driven more by perceiver characteristics than those representing appearance-based appraisals (e.g., youthful-attractiveness). Moreover, inferences made of more ambiguous traits (e.g., creative) or displays (e.g., faces with less extreme emotions, less-controlled stimuli) are similarly driven more by perceiver than target characteristics. Together, results highlight the large role that perceiver and target variability play in trait impressions, and develop a new topography of trait impressions that considers the source of the impression. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
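
    The cross-classified random effects approach described above partitions rating variance into perceiver, target, and residual components. The statsmodels sketch below simulates a small fully crossed design and fits crossed variance components within a single dummy group; all numbers and names are illustrative, not the paper's data or code.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Simulate a toy crossed design: every perceiver rates every target face.
      rng = np.random.default_rng(0)
      n_perceivers, n_targets = 40, 25
      perceiver = np.repeat(np.arange(n_perceivers), n_targets)
      target = np.tile(np.arange(n_targets), n_perceivers)
      rating = (rng.normal(0, 1.0, n_perceivers)[perceiver]   # perceiver effects
                + rng.normal(0, 0.5, n_targets)[target]       # target (face) effects
                + rng.normal(0, 1.0, perceiver.size))         # residual noise
      df = pd.DataFrame({"perceiver": perceiver, "target": target,
                         "rating": rating, "one": 1})

      # Crossed random effects via variance components, treating the data as one group.
      model = smf.mixedlm("rating ~ 1", data=df, groups="one",
                          vc_formula={"perceiver": "0 + C(perceiver)",
                                      "target": "0 + C(target)"})
      result = model.fit()
      print(result.summary())  # compare perceiver vs. target variance components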

  16. ‘Distracters’ do not always distract: Visual working memory for angry faces is enhanced by incidental emotional words.

    Directory of Open Access Journals (Sweden)

    Margaret Cecilia Jackson

    2012-10-01

    Full Text Available We are often required to filter out distraction in order to focus on a primary task during which working memory (WM) is engaged. Previous research has shown that negative versus neutral distracters presented during a visual WM maintenance period significantly impair memory for neutral information. However, the contents of WM are often also emotional in nature. The question we address here is how incidental information might impact upon visual WM when both this and the memory items contain emotional information. We presented emotional versus neutral words during the maintenance interval of an emotional visual WM faces task. Participants encoded two angry or happy faces into WM, and several seconds into a 9-second maintenance period a negative, positive, or neutral word was flashed on the screen three times. A single neutral test face was presented for retrieval with a face identity that was either present or absent in the preceding study array. WM for angry face identities was significantly better when an emotional (negative or positive) versus neutral (or no) word was presented. In contrast, WM for happy face identities was not significantly affected by word valence. These findings suggest that the presence of emotion within an intervening stimulus boosts the emotional value of threat-related information maintained in visual WM and thus improves performance. In addition, we show that incidental events that are emotional in nature do not always distract from an ongoing WM task.

  17. An exploration of emotional protection and regulation in nurse-patient interactions: The role of the professional face and the emotional mirror.

    Science.gov (United States)

    Cecil, Penelope; Glass, Nel

    2015-01-01

    While interpersonal styles of nurse-patient communication have become more relaxed in recent years, nurses remain challenged in emotional engagement with patients and other health professionals. In order to preserve a professional distance, however slight, in patient care delivery, nurses need to be able to regulate their emotions. This research aimed to investigate nurses' perceptions of emotional protection and regulation in patient care delivery. A qualitative approach was used for the study utilising in-depth semi-structured interviews and researcher reflective journaling. Participants were drawn from rural New South Wales. Following institutional ethics approval, 5 nurses were interviewed and reflective journaling commenced. The interviews and the reflective journal were transcribed verbatim. The results revealed that nurses' emotional regulation, demonstrated by a 'professional face', was an important strategy to enable delivery of quality care even though it resulted in emotional containment. Such regulation was a protective mechanism employed to look after self and was critical in situations of emotional dissonance. The results also found that nurses experience emotional dissonance in situations where they have unresolved personal emotional issues, and that the latter was an individual motivator to manage emotions in the workplace. Emotions play a pivotal role within nurse-patient relationships. The professional face can be recognised as contributing to emotional health and therefore maintaining the emotional health of nurses in practice. This study foregrounds the importance of regulating emotions and nurturing nurses' emotional health in contemporary practice.

  18. Electrophysiological correlates of emotional face processing in typically developing adults and adults with high functioning Autism

    OpenAIRE

    Barrie, Jennifer Nicole

    2012-01-01

    Emotional expressions have been found to affect various event-related potentials (ERPs). Furthermore, socio-emotional functioning is altered in individuals with autism, and a growing body of neuroimaging and electrophysiological evidence substantiates underlying neural differences for face processing in this population. However, relatively few studies have examined the time-course of emotional face processing in autism. This study examined how implicit (not the intended focus of attention) ve...

  19. Interactions among the effects of head orientation, emotional expression, and physical attractiveness on face preferences.

    Science.gov (United States)

    Main, Julie C; DeBruine, Lisa M; Little, Anthony C; Jones, Benedict C

    2010-01-01

    Previous studies have shown that preferences for direct versus averted gaze are modulated by emotional expressions and physical attractiveness. For example, preferences for direct gaze are stronger when judging happy or physically attractive faces than when judging disgusted or physically unattractive faces. Here we show that preferences for front versus three-quarter views of faces, in which gaze direction was always congruent with head orientation, are also modulated by emotional expressions and physical attractiveness; participants demonstrated preferences for front views of faces over three-quarter views of faces when judging the attractiveness of happy, physically attractive individuals, but not when judging the attractiveness of relatively unattractive individuals or those with disgusted expressions. Moreover, further analyses indicated that these interactions did not simply reflect differential perceptions of the intensity of the emotional expressions shown in each condition. Collectively, these findings present novel evidence that the effect of the direction of the attention of others on attractiveness judgments is modulated by cues to the physical attractiveness and emotional state of the depicted individual, potentially reflecting psychological adaptations for efficient allocation of social effort. These data also present the first behavioural evidence that the effect of the direction of the attention of others on attractiveness judgments reflects viewer-referenced, rather than face-referenced, coding and/or processing of gaze direction.

  20. I feel your voice. Cultural differences in the multisensory perception of emotion.

    Science.gov (United States)

    Tanaka, Akihiro; Koizumi, Ai; Imai, Hisato; Hiramatsu, Saori; Hiramoto, Eriko; de Gelder, Beatrice

    2010-09-01

    Cultural differences in emotion perception have been reported mainly for facial expressions and to a lesser extent for vocal expressions. However, the way in which the perceiver combines auditory and visual cues may itself be subject to cultural variability. Our study investigated cultural differences between Japanese and Dutch participants in the multisensory perception of emotion. A face and a voice, expressing either congruent or incongruent emotions, were presented on each trial. Participants were instructed to judge the emotion expressed in one of the two sources. The effect of to-be-ignored voice information on facial judgments was larger in Japanese than in Dutch participants, whereas the effect of to-be-ignored face information on vocal judgments was smaller in Japanese than in Dutch participants. This result indicates that Japanese people are more attuned than Dutch people to vocal processing in the multisensory perception of emotion. Our findings provide the first evidence that multisensory integration of affective information is modulated by perceivers' cultural background.

  1. Locus of emotion: the effect of task order and age on emotion perceived and emotion felt in response to music.

    Science.gov (United States)

    Schubert, Emery

    2007-01-01

    The relationship between emotions perceived to be expressed (external locus, EL) versus emotions felt (internal locus, IL) in response to music was examined using 5 contrasting pieces of Romantic, Western art music. The main hypothesis tested was that ratings of emotional-strength, valence, and arousal would be lower in magnitude for IL than for EL. IL and EL judgments made together after one listening (Experiment 2, n = 18) produced less differentiated responses than when each task was performed after separate listenings (Experiment 1, n = 28). This merging of responses in the locus-task-together condition started to disappear as statistical power was increased. Statistical power was increased by recruiting an additional subject pool of elderly individuals (Experiment 3, n = 19, mean age 75 years). Their valence responses were more positive, and their emotional-strength ratings were generally lower, compared to their younger counterparts. Overall data analysis revealed that IL responses fluctuated slightly more than EL responses, meaning that the latter are more stable. An additional dimension of dominance-submissiveness was also examined, and was useful in differentiating between pieces, but did not return a difference between IL and EL. Some therapy applications of these findings are discussed.

  2. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers.

    Science.gov (United States)

    Thomas, Laura A; Brotman, Melissa A; Bones, Brian L; Chen, Gang; Rosen, Brooke H; Pine, Daniel S; Leibenluft, Ellen

    2014-04-01

    Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N=20), SMD (N=18), and HV (N=22) during "Aware" and "Non-aware" priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms), and shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders. Copyright © 2013 The Authors. Published by Elsevier Ltd.. All rights reserved.

  3. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers

    Directory of Open Access Journals (Sweden)

    Laura A. Thomas

    2014-04-01

    Full Text Available Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N = 20), SMD (N = 18), and HV (N = 22) during “Aware” and “Non-aware” priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms), and shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders.

  4. Scanning patterns of faces do not explain impaired emotion recognition in Huntington Disease: Evidence for a high level mechanism

    Directory of Open Access Journals (Sweden)

    Marieke van Asselen

    2012-02-01

    Full Text Available Previous studies in patients with amygdala lesions suggested that deficits in emotion recognition might be mediated by impaired scanning patterns of faces. Here we investigated whether scanning patterns also contribute to the selective impairment in recognition of disgust in Huntington disease (HD). To achieve this goal, we recorded eye movements during a two-alternative forced choice emotion recognition task. HD patients in presymptomatic (n=16) and symptomatic (n=9) disease stages were tested and their performance was compared to a control group (n=22). In our emotion recognition task, participants had to indicate whether a face reflected one of six basic emotions. In addition, and in order to define whether emotion recognition was altered when the participants were forced to look at a specific component of the face, we used a second task where only limited facial information was provided (eyes/mouth in partially masked faces). Behavioural results showed no differences in the ability to recognize emotions between presymptomatic gene carriers and controls. However, an emotion recognition deficit was found for all 6 basic emotion categories in early stage HD. Analysis of eye movement patterns showed that patient and controls used similar scanning strategies. Patterns of deficits were similar regardless of whether parts of the faces were masked or not, thereby confirming that selective attention to particular face parts is not underlying the deficits. These results suggest that the emotion recognition deficits in symptomatic HD patients cannot be explained by impaired scanning patterns of faces. Furthermore, no selective deficit for recognition of disgust was found in presymptomatic HD patients.

  5. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion

    Directory of Open Access Journals (Sweden)

    Daiming Xiu

    2015-04-01

    Full Text Available This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data was acquired during incidental learning of positive (‘happy’), neutral and negative (‘angry’ or ‘fearful’) faces. Dynamic Causal Modeling (DCM) was applied on the fMRI data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and successful memory formation related areas (hippocampus, superior parietal lobule, amygdala and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feed forward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  6. Transcutaneous vagus nerve stimulation (tVNS) enhances recognition of emotions in faces but not bodies.

    Science.gov (United States)

    Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S

    2018-02-01

    The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. The time course of face processing: startle eyeblink response modulation by face gender and expression.

    Science.gov (United States)

    Duval, Elizabeth R; Lovelace, Christopher T; Aarant, Justin; Filion, Diane L

    2013-12-01

    The purpose of this study was to investigate the effects of both facial expression and face gender on startle eyeblink response patterns at varying lead intervals (300, 800, and 3500ms) indicative of attentional and emotional processes. We aimed to determine whether responses to affective faces map onto the Defense Cascade Model (Lang et al., 1997) to better understand the stages of processing during affective face viewing. At 300ms, there was an interaction between face expression and face gender with female happy and neutral faces and male angry faces producing inhibited startle. At 3500ms, there was a trend for facilitated startle during angry compared to neutral faces. These findings suggest that affective expressions are perceived differently in male and female faces, especially at short lead intervals. Future studies investigating face processing should take both face gender and expression into account. © 2013.

  8. Abnormal early gamma responses to emotional faces differentiate unipolar from bipolar disorder patients.

    Science.gov (United States)

    Liu, T Y; Chen, Y S; Su, T P; Hsieh, J C; Chen, L F

    2014-01-01

    This study investigates the cortical abnormalities of early emotion perception in patients with major depressive disorder (MDD) and bipolar disorder (BD) using gamma oscillations. Twenty-three MDD patients, twenty-five BD patients, and twenty-four normal controls were enrolled and their event-related magnetoencephalographic responses were recorded during implicit emotional tasks. Our results demonstrated abnormal gamma activity within 100 ms in the emotion-related regions (amygdala, orbitofrontal (OFC) cortex, anterior insula (AI), and superior temporal pole) in the MDD patients, suggesting that these patients may have dysfunctions or negativity biases in perceptual binding of emotional features at very early stage. Decreased left superior medial frontal cortex (smFC) responses to happy faces in the MDD patients were correlated with their serious level of depression symptoms, indicating that decreased smFC activity perhaps underlies irregular positive emotion processing in depressed patients. In the BD patients, we showed abnormal activation in visual regions (inferior/middle occipital and middle temporal cortices) which responded to emotional faces within 100 ms, supporting that the BD patients may hyperactively respond to emotional features in perceptual binding. The discriminant function of gamma activation in the left smFC, right medial OFC, right AI/inferior OFC, and the right precentral cortex accurately classified 89.6% of patients as unipolar/bipolar disorders.
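
    The discriminant function reported here corresponds to linear discriminant analysis over a handful of region-wise gamma features. The scikit-learn sketch below uses random placeholder data with the group sizes mentioned in the abstract; it is not the authors' analysis and will not reproduce the 89.6% figure.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      # Hypothetical features: gamma activation in four regions (left smFC, right
      # medial OFC, right AI/inferior OFC, right precentral), one row per patient.
      rng = np.random.default_rng(0)
      X = rng.standard_normal((48, 4))
      y = np.array([0] * 23 + [1] * 25)  # 0 = unipolar (MDD), 1 = bipolar (BD)

      lda = LinearDiscriminantAnalysis()
      print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())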

  9. Facial trustworthiness judgments in children with ASD are modulated by happy and angry emotional cues.

    Directory of Open Access Journals (Sweden)

    Frances Caulfield

    Full Text Available Appearance-based trustworthiness inferences may reflect the misinterpretation of emotional expression cues. Children and adults typically perceive faces that look happy to be relatively trustworthy and those that look angry to be relatively untrustworthy. Given reports of atypical expression perception in children with Autism Spectrum Disorder (ASD), the current study aimed to determine whether the modulation of trustworthiness judgments by emotional expression cues in children with ASD is also atypical. Cognitively-able children with and without ASD, aged 6-12 years, rated the trustworthiness of faces showing happy, angry and neutral expressions. Trust judgments in children with ASD were significantly modulated by overt happy and angry expressions, like those of typically-developing children. Furthermore, subtle emotion cues in neutral faces also influenced trust ratings of the children in both groups. These findings support a powerful influence of emotion cues on perceived trustworthiness, which even extends to children with social cognitive impairments.

  10. The incremental role of trait emotional intelligence on perceived cervical screening barriers.

    Science.gov (United States)

    Costa, Sebastiano; Barberis, Nadia; Larcan, Rosalba; Cuzzocrea, Francesca

    2018-02-13

    Researchers have become increasingly interested in investigating the psychological aspects related to the perception of cervical screening barriers. This study investigates the influence of trait EI on perceived cervical screening barriers. Furthermore, it investigates the incremental validity of trait EI, beyond the Big Five and emotion regulation, in predicting perceived barriers towards the Pap test in a sample of 206 Italian women who were undergoing cervical screening. Results showed that trait EI is negatively related to cervical screening barriers. Furthermore, trait EI can be considered a strong incremental predictor of women's perceived screening barriers over and above the Big Five, emotion regulation, age, sexual intercourse experience and past Pap test. Detailed information on the study findings and future research directions is discussed.
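
    Incremental validity of this kind is typically tested with hierarchical regression: fit a baseline model containing the Big Five, emotion regulation, and background covariates, then add trait EI and inspect the change in R-squared. A generic sketch follows; the file and column names are hypothetical placeholders, not the study's materials.

      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("screening_survey.csv")  # placeholder file name

      # Step 1: Big Five, emotion regulation, and background variables.
      step1 = ("barriers ~ age + extraversion + agreeableness + conscientiousness"
               " + neuroticism + openness + reappraisal + suppression")
      m1 = smf.ols(step1, data=df).fit()

      # Step 2: add trait emotional intelligence and examine the R-squared change.
      m2 = smf.ols(step1 + " + trait_ei", data=df).fit()
      print("Delta R^2 attributable to trait EI:", m2.rsquared - m1.rsquared)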

  11. PERCEIVED AUTONOMY SUPPORT AND BEHAVIORAL ENGAGEMENT IN PHYSICAL EDUCATION: A CONDITIONAL PROCESS MODEL OF POSITIVE EMOTION AND AUTONOMOUS MOTIVATION.

    Science.gov (United States)

    Yoo, Jin

    2015-06-01

    A variety of theoretical perspectives describe the crucial behavioral roles of motivation and emotion, but how these interact with perceptions of social contexts and behaviors is less well understood. This study examined whether autonomous motivation mediated the relationship between perceived autonomy support and behavioral engagement in physical education and whether this mediating process was moderated by positive emotion. A sample of 592 Korean middle-school students (304 boys, 288 girls; M age = 14.0 yr., SD = 0.8) completed questionnaires. Autonomous motivation partially mediated the positive association between perceived autonomy support and behavioral engagement. Positive emotion moderated the relationship between autonomous motivation and behavioral engagement. This indirect link was stronger as positive emotion increased. These findings suggest the importance of integrating emotion into motivational processes to understand how and when perceived autonomy support is associated with behavioral engagement in physical education.
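
    A conditional process (moderated mediation) model of this form can be approximated with two regressions: an a-path from perceived autonomy support to autonomous motivation, and a b-path from motivation to engagement whose slope is allowed to vary with positive emotion. The sketch below is purely illustrative; names are placeholders and the bootstrapping used in typical conditional process analyses is omitted.

      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("pe_students.csv")  # placeholder file name

      # a-path: perceived autonomy support -> autonomous motivation.
      a_path = smf.ols("autonomous_motivation ~ autonomy_support", data=df).fit()

      # b-path with moderation: the motivation -> engagement slope depends on
      # positive emotion, controlling for the direct effect of autonomy support.
      b_path = smf.ols(
          "engagement ~ autonomous_motivation * positive_emotion + autonomy_support",
          data=df,
      ).fit()

      # Conditional indirect effect at a chosen level of positive emotion: the
      # a-path coefficient times the simple slope of motivation at that level.
      a = a_path.params["autonomy_support"]
      b_at_1 = (b_path.params["autonomous_motivation"]
                + b_path.params["autonomous_motivation:positive_emotion"] * 1.0)
      print("indirect effect when positive_emotion = 1:", a * b_at_1)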

  12. The Relationship between Social-Emotional Learning Ability and Perceived Social Support in Gifted Students

    Science.gov (United States)

    Ogurlu, Üzeyir; Sevgi-Yalin, Hatun; Yavuz-Birben, Fazilet

    2018-01-01

    This study aimed to examine the relationship between social-emotional learning skills and perceived social support of gifted students. Based on this relationship, the authors also examined to what extent social and emotional learning skills were predictive of social support. In addition, gender variables were compared in social and emotional…

  13. Predicting the filial behaviors of Chinese-Malaysian adolescents from perceived parental investments, filial emotions, and parental warmth and support.

    Science.gov (United States)

    Cheah, Charissa S L; Bayram Özdemir, Sevgi; Leung, Christy Y Y

    2012-06-01

    The present study examined the mediating role of perceived parental warmth and support in predicting Chinese Malaysian adolescents' filial behaviors from their age, perceived parental investments, and positive filial emotions toward their parents. The effects of these predictors were examined separately for mothers and fathers. Participants included 122 Chinese adolescents (M = 13.14 years; SD = 2.22) in Malaysia. Adolescents' perceived parental investments, filial emotions, and warmth and support from each parent were positively, and age was negatively associated with their filial behaviors. No gender differences were found. Perceived maternal warmth and support significantly mediated the effect of age, perceived investments from, and filial emotions toward mothers on adolescents' filial behaviors, but perceived paternal warmth and support did not have a mediating role. The present study sheds light on the unique maternal versus paternal filial role, and important familial processes in Chinese-Malaysian children and adolescents from a cultural perspective. Published by Elsevier Ltd.

  14. Perceived Social Support and Domain-Specific Adjustment of Children with Emotional and Behavioural Difficulties

    Science.gov (United States)

    Popliger, Mina; Toste, Jessica R.; Heath, Nancy L.

    2009-01-01

    The perceived availability of social support has been documented as a protective mechanism among adults and adolescents. However, little research has explored the role of social support among children with emotional and behavioural difficulties (E/BD). The current study sought to investigate the effects of perceived social support from family,…

  15. Age-Group Differences in Interference from Young and Older Emotional Faces.

    Science.gov (United States)

    Ebner, Natalie C; Johnson, Marcia K

    2010-11-01

    Human attention is selective, focusing on some aspects of events at the expense of others. In particular, angry faces engage attention. Most studies have used pictures of young faces, even when comparing young and older age groups. Two experiments asked (1) whether task-irrelevant faces of young and older individuals with happy, angry, and neutral expressions disrupt performance on a face-unrelated task, (2) whether interference varies for faces of different ages and different facial expressions, and (3) whether young and older adults differ in this regard. Participants gave speeded responses on a number task while irrelevant faces appeared in the background. Both age groups were more distracted by own than other-age faces. In addition, young participants' responses were slower for angry than happy faces, whereas older participants' responses were slower for happy than angry faces. Factors underlying age-group differences in interference from emotional faces of different ages are discussed.

  16. Math anxiety and its relationship to inhibitory abilities and perceived emotional intelligence

    Directory of Open Access Journals (Sweden)

    Maria-José Justicia-Galiano

    2016-01-01

    Full Text Available Math anxiety has been found to be an emotional problem that has a negative effect on students' academic performance across different levels of education. This type of anxiety could be related to certain cognitive and emotional processes. A first objective was to examine the relationship between math anxiety and certain inhibitory abilities responsible for eliminating intrusive thoughts or preventing their access to consciousness. A second aim was to determine the extent to which math anxiety and students' self-perceptions of their own emotional abilities are related. To this end, 187 first-year undergraduate psychology students were administered different measures to assess math anxiety, statistics anxiety, inhibitory abilities, and perceived emotional intelligence. The results showed that students with high math anxiety were more likely to experience intrusive thoughts, were less effective at suppressing these thoughts, and reported lower scores in understanding and regulating their emotions. These cognitive mechanisms and emotional abilities are of relevance for better understanding the nature of this type of anxiety.

  17. The Influence of Music on Facial Emotion Recognition in Children with Autism Spectrum Disorder and Neurotypical Children.

    Science.gov (United States)

    Brown, Laura S

    2017-03-01

    Children with autism spectrum disorder (ASD) often struggle with social skills, including the ability to perceive emotions based on facial expressions. Research evidence suggests that many individuals with ASD can perceive emotion in music. Examining whether music can be used to enhance recognition of facial emotion by children with ASD would inform development of music therapy interventions. The purpose of this study was to investigate the influence of music with a strong emotional valence (happy; sad) on the ability of children with ASD to label emotions depicted in facial photographs, and on their response time. Thirty neurotypical children and 20 children with high-functioning ASD rated happy, neutral, and sad expressions in 30 photographs under two music listening conditions (sad music; happy music). During each music listening condition, participants rated the 30 images using a 7-point scale that ranged from very sad to very happy. Response time data were also collected across both conditions. A significant two-way interaction revealed that participants' ratings of happy and neutral faces were unaffected by music conditions, but sad faces were perceived to be sadder with sad music than with happy music. Across both conditions, neurotypical children rated the happy faces as happier and the sad faces as sadder than did participants with ASD. Response times of the neurotypical children were consistently shorter than response times of the children with ASD; both groups took longer to rate sad faces than happy faces. Response times of neurotypical children were generally unaffected by the valence of the music condition; however, children with ASD took longer to respond when listening to sad music. Music appears to affect perceptions of emotion in children with ASD, and perceptions of sad facial expressions seem to be more affected by emotionally congruent background music than are perceptions of happy or neutral faces. © the American Music Therapy Association 2016

  18. Crossmodal and incremental perception of audiovisual cues to emotional speech.

    Science.gov (United States)

    Barkhuysen, Pashiera; Krahmer, Emiel; Swerts, Marc

    2010-01-01

    In this article we report on two experiments about the perception of audiovisual cues to emotional speech. The article addresses two questions: 1) how do visual cues from a speaker's face to emotion relate to auditory cues, and (2) what is the recognition speed for various facial cues to emotion? Both experiments reported below are based on tests with video clips of emotional utterances collected via a variant of the well-known Velten method. More specifically, we recorded speakers who displayed positive or negative emotions, which were congruent or incongruent with the (emotional) lexical content of the uttered sentence. In order to test this, we conducted two experiments. The first experiment is a perception experiment in which Czech participants, who do not speak Dutch, rate the perceived emotional state of Dutch speakers in a bimodal (audiovisual) or a unimodal (audio- or vision-only) condition. It was found that incongruent emotional speech leads to significantly more extreme perceived emotion scores than congruent emotional speech, where the difference between congruent and incongruent emotional speech is larger for the negative than for the positive conditions. Interestingly, the largest overall differences between congruent and incongruent emotions were found for the audio-only condition, which suggests that posing an incongruent emotion has a particularly strong effect on the spoken realization of emotions. The second experiment uses a gating paradigm to test the recognition speed for various emotional expressions from a speaker's face. In this experiment participants were presented with the same clips as experiment I, but this time presented vision-only. The clips were shown in successive segments (gates) of increasing duration. Results show that participants are surprisingly accurate in their recognition of the various emotions, as they already reach high recognition scores in the first gate (after only 160 ms). Interestingly, the recognition scores

  19. Orthognathic Surgery Has a Significant Effect on Perceived Personality Traits and Emotional Expressions.

    Science.gov (United States)

    Mazzaferro, Daniel M; Wes, Ari M; Naran, Sanjay; Pearl, Rebecca; Bartlett, Scott P; Taylor, Jesse A

    2017-11-01

    The effects of orthognathic surgery go beyond objective cephalometric correction of facial and dental disproportion and malocclusion, respectively. The authors hypothesized that there is tangible improvement following surgery that alters publicly perceived personality traits and emotions. The authors used Amazon.com's Mechanical Turk (MTurk), a crowdsourcing tool, to determine how preoperative and postoperative images of orthognathic surgery patients were perceived on six personality traits and six emotional expressions based on posteroanterior and lateral photographs. Blinded respondents provided demographic information and were randomly assigned to one of two sets of 20 photographs (10 subjects before and after surgery). Data on 20 orthognathic surgery patients were collected from 476 individuals. The majority of participants were female (52.6 percent), 18 to 39 years old (67.9 percent), Caucasian (76.6 percent), had some college or technical training or graduated college (72.7 percent), and had an annual income between $20,000 and $99,999 (74.6 percent). A paired t test analysis found that subjects were perceived significantly more favorably after orthognathic surgery in 12 countenance categories: more dominant, trustworthy, friendly, intelligent, attractive, and happy; and also less threatening, angry, surprised, sad, afraid, and disgusted. Respondents with higher incomes perceived greater postoperative improvement than those earning less. Overall, subjects were perceived more favorably after orthognathic surgery, with both perceived personality traits and emotions deemed more favorable. Additional work is needed to better understand the physiologic underpinnings of such findings. Crowdsourcing technology offers a unique opportunity for surgeons to gather data regarding laypeople's perceptions of surgical outcomes in areas such as orthognathic surgery.

  20. More than words (and faces): evidence for a Stroop effect of prosody in emotion word processing.

    Science.gov (United States)

    Filippi, Piera; Ocklenburg, Sebastian; Bowling, Daniel L; Heege, Larissa; Güntürkün, Onur; Newen, Albert; de Boer, Bart

    2017-08-01

    Humans typically combine linguistic and nonlinguistic information to comprehend emotions. We adopted an emotion identification Stroop task to investigate how different channels interact in emotion communication. In experiment 1, synonyms of "happy" and "sad" were spoken with happy and sad prosody. Participants had more difficulty ignoring prosody than ignoring verbal content. In experiment 2, synonyms of "happy" and "sad" were spoken with happy and sad prosody, while happy or sad faces were displayed. Accuracy was lower when two channels expressed an emotion that was incongruent with the channel participants had to focus on, compared with the cross-channel congruence condition. When participants were required to focus on verbal content, accuracy was significantly lower also when prosody was incongruent with verbal content and face. This suggests that prosody biases emotional verbal content processing, even when conflicting with verbal content and face simultaneously. Implications for multimodal communication and language evolution studies are discussed.

  1. Reading emotions from faces in two indigenous societies.

    Science.gov (United States)

    Crivelli, Carlos; Jarillo, Sergio; Russell, James A; Fernández-Dols, José-Miguel

    2016-07-01

    That all humans recognize certain specific emotions from their facial expression-the Universality Thesis-is a pillar of research, theory, and application in the psychology of emotion. Its most rigorous test occurs in indigenous societies with limited contact with external cultural influences, but such tests are scarce. Here we report 2 such tests. Study 1 was of children and adolescents (N = 68; aged 6-16 years) of the Trobriand Islands (Papua New Guinea, South Pacific) with a Western control group from Spain (N = 113, of similar ages). Study 2 was of children and adolescents (N = 36; same age range) of Matemo Island (Mozambique, Africa). In both studies, participants were shown an array of prototypical facial expressions and asked to point to the person feeling a specific emotion: happiness, fear, anger, disgust, or sadness. The Spanish control group matched faces to emotions as predicted by the Universality Thesis: matching was seen on 83% to 100% of trials. For the indigenous societies, in both studies, the Universality Thesis was moderately supported for happiness: smiles were matched to happiness on 58% and 56% of trials, respectively. For other emotions, however, results were even more modest: 7% to 46% in the Trobriand Islands and 22% to 53% in Matemo Island. These results were robust across age, gender, static versus dynamic display of the facial expressions, and between- versus within-subjects design. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  2. Perfectionism and negative/positive affect associations: the role of cognitive emotion regulation and perceived distress/coping.

    Science.gov (United States)

    Castro, Juliana; Soares, Maria João; Pereira, Ana T; Macedo, António

    2017-01-01

    To explore 1) if perfectionism, perceived distress/coping, and cognitive emotion regulation (CER) are associated with and predictive of negative/positive affect (NA/PA); and 2) if CER and perceived distress/coping are associated with perfectionism and if they mediate the perfectionism-NA/PA associations. There is a distinction between maladaptive and adaptive perfectionism in its association with NA/PA. CER and perceived distress/coping may mediate the maladaptive/adaptive perfectionism and NA/PA associations. A total of 344 students (68.4% girls) completed the Hewitt & Flett and the Frost Multidimensional Perfectionism Scales, the Composite Multidimensional Perfectionism Scale, the Profile of Mood States, the Perceived Stress Scale, and the Cognitive Emotion Regulation Questionnaire. NA predictors were maladaptive/adaptive perfectionism, maladaptive CER and perceived distress (positively), and positive reappraisal and planning and perceived coping (negatively). PA predictors were maladaptive/adaptive perfectionism and perceived distress (negatively), and positive reappraisal and planning, positive refocusing and perceived coping (positively). The association between maladaptive perfectionism and NA was mediated by maladaptive CER/low adaptive CER and by perceived distress/low coping. The association between maladaptive perfectionism and low PA was mediated by perceived distress. High PA was determined by low maladaptive perfectionism, and this association was mediated by adaptive CER and coping. The association between adaptive perfectionism and NA was mediated by maladaptive CER and perceived distress. CER and perceived distress/coping are associated with perfectionism and mediate the perfectionism-NA/PA associations.

  3. The effects of shopping environment on consumption emotions, perceived values and behavioral intentions

    Directory of Open Access Journals (Sweden)

    Kambiz Heidarzadeh Hanzaee

    2013-09-01

    Full Text Available The main objective of this study is to develop and to test a comprehensive model that investigates the effect of shopping environment on consumption emotion, perceived value and behavioral intentions in a tourism setting. The proposed model specifies the effect of environment perceptions on consumption emotions (pleasure and arousal) and on hedonic and utilitarian value; these emotions and values in turn affect tourists' satisfaction and behavioral intentions. Data were collected from tourists who visited a tourist city, using a cluster random sampling method. A total of 410 questionnaires were used for data analysis. Structural equation modeling (SEM) using LISREL was performed to empirically test the relationships between the constructs of this research. Results show that environment has a positive and significant influence on pleasure and arousal. However, the effect of environment perceptions on behavioral intentions was not significant. In addition, results indicate that pleasure and arousal have positive and significant effects on tourists' values. Findings also indicate that hedonic and utilitarian values had a direct effect on customer satisfaction, and the effect of satisfaction on behavioral intention was positive and significant. Finally, the study suggests that service providers should focus on components of the environment that contribute positively to creating positive emotions in customers, which in turn enhance perceived value and positive behavioral intentions.
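    The chain of effects described above (environment to emotions and values, then to satisfaction and behavioral intentions) can be sketched as a simple path analysis. The study itself fitted a full SEM in LISREL; the sketch below only approximates the idea with one OLS regression per path, run on simulated data with assumed variable names.

        # Hedged path-analysis sketch (not the study's LISREL model): one OLS
        # regression per structural path, estimated on simulated placeholder data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 410  # matches the reported number of questionnaires
        df = pd.DataFrame({"environment": rng.normal(size=n)})
        df["pleasure"] = 0.6 * df["environment"] + rng.normal(size=n)
        df["hedonic_value"] = 0.5 * df["pleasure"] + rng.normal(size=n)
        df["satisfaction"] = 0.7 * df["hedonic_value"] + rng.normal(size=n)
        df["intention"] = 0.6 * df["satisfaction"] + rng.normal(size=n)

        paths = [
            "pleasure ~ environment",
            "hedonic_value ~ pleasure",
            "satisfaction ~ hedonic_value",
            "intention ~ satisfaction",
        ]
        for formula in paths:
            # The last parameter of each fit is the predictor's path coefficient.
            coef = smf.ols(formula, data=df).fit().params.iloc[-1]
            print(f"{formula}: path coefficient = {coef:.2f}")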

  4. Perceiving emotions: Cueing social categorization processes and attentional control through facial expressions.

    Science.gov (United States)

    Cañadas, Elena; Lupiáñez, Juan; Kawakami, Kerry; Niedenthal, Paula M; Rodríguez-Bailón, Rosa

    2016-09-01

    Individuals spontaneously categorise other people on the basis of their gender, ethnicity and age. But what about the emotions they express? In two studies we tested the hypothesis that facial expressions are similar to other social categories in that they can function as contextual cues to control attention. In Experiment 1 we associated expressions of anger and happiness with specific proportions of congruent/incongruent flanker trials. We also created consistent and inconsistent category members within each of these two general contexts. The results demonstrated that participants exhibited a larger congruency effect when presented with faces in the emotional group associated with a high proportion of congruent trials. Notably, this effect transferred to inconsistent members of the group. In Experiment 2 we replicated the effects with faces depicting true and false smiles. Together these findings provide consistent evidence that individuals spontaneously utilise emotions to categorise others and that such categories determine the allocation of attentional control.

  5. Negative emotions in veterans relate to suicide risk through feelings of perceived burdensomeness and thwarted belongingness.

    Science.gov (United States)

    Rogers, Megan L; Kelliher-Rabon, Jessica; Hagan, Christopher R; Hirsch, Jameson K; Joiner, Thomas E

    2017-01-15

    Suicide rates among veterans are disproportionately high compared to rates among the general population. Veterans may experience a number of negative emotions (e.g., anger, self-directed hostility, shame, guilt) during periods of postwar adjustment and reintegration into civilian life that may uniquely confer risk for suicide. Mechanisms of these associations, however, are less well studied. The purpose of the present study was to examine the relationship between negative emotions and suicide risk in veterans through the theoretical framework of the interpersonal theory of suicide. A large sample of veterans (N = 541) completed measures assessing their negative emotions, perceived burdensomeness, thwarted belongingness, and suicide risk. Self-directed hostility and shame related indirectly to suicide risk through both perceived burdensomeness and thwarted belongingness. Thwarted belongingness accounted for the association between anger and suicide risk, whereas perceived burdensomeness accounted for the relationship between guilt and suicide risk. This study had a cross-sectional design and relied solely on self-report measures. These findings provide evidence for the role of negative emotions in conferring risk for suicide in veterans. Clinical implications, limitations, and future research directions are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. ERP Correlates of Target-Distracter Differentiation in Repeated Runs of a Continuous Recognition Task with Emotional and Neutral Faces

    Science.gov (United States)

    Treese, Anne-Cecile; Johansson, Mikael; Lindgren, Magnus

    2010-01-01

    The emotional salience of faces has previously been shown to induce memory distortions in recognition memory tasks. This event-related potential (ERP) study used repeated runs of a continuous recognition task with emotional and neutral faces to investigate emotion-induced memory distortions. In the second and third runs, participants made more…

  7. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces

    Directory of Open Access Journals (Sweden)

    Arash eJavanbakht

    2015-06-01

    Full Text Available Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. Fifty-two subjects from a longitudinal prospective study, recruited as children, participated in a brain imaging study at 23-25 years of age using the Emotional Faces Assessment Task (EFAT). Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and mPFC responses to threat vs. happy faces. Also, childhood poverty was associated with decreased functional connectivity between left amygdala and mPFC. This study is unique because it prospectively links childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues.

  8. The influence of perceived parenting styles on socio-emotional development from pre-puberty into puberty

    OpenAIRE

    Ong, Min Yee; Eilander, Janna; Saw, Seang Mei; Xie, Yuhuan; Meaney, Michael J.; Broekman, Birit F. P.

    2017-01-01

    The relative impact of parenting on the socio-emotional development of children has rarely been examined in a longitudinal context. This study examined the association between perceived parenting styles and socio-emotional functioning from childhood to adolescence. We hypothesized that optimal parenting is associated with improvement in socio-emotional functioning from childhood into early adulthood, especially for those with more behavioral problems in childhood. Children between ages 7 and 9 years...

  9. Personality judgments from everyday images of faces

    Directory of Open Access Journals (Sweden)

    Clare AM Sutherland

    2015-10-01

    Full Text Available People readily make personality attributions to images of strangers’ faces. Here we investigated the basis of these personality attributions as made to everyday, naturalistic face images. In a first study, we used 1,000 highly varying ‘ambient image’ face photographs to test the correspondence between personality judgments of the Big Five and dimensions known to underlie a range of facial first impressions: approachability, dominance and youthful-attractiveness. Interestingly, the facial Big Five judgments were found to separate to some extent: judgments of openness, extraversion, emotional stability and agreeableness were mainly linked to facial first impressions of approachability, whereas conscientiousness judgments involved a combination of approachability and dominance. In a second study we used average face images to investigate which main cues are used by perceivers to make impressions of the Big Five, by extracting consistent cues to impressions from the large variation in the original images. When forming impressions of strangers from highly varying, naturalistic face photographs, perceivers mainly seem to rely on broad facial cues to approachability, such as smiling.

  10. Cultural in-group advantage: emotion recognition in African American and European American faces and voices.

    Science.gov (United States)

    Wickline, Virginia B; Bailey, Wendy; Nowicki, Stephen

    2009-03-01

    The authors explored whether there were in-group advantages in emotion recognition of faces and voices by culture or geographic region. Participants were 72 African American students (33 men, 39 women), 102 European American students (30 men, 72 women), 30 African international students (16 men, 14 women), and 30 European international students (15 men, 15 women). The participants determined emotions in African American and European American faces and voices. Results showed an in-group advantage (sometimes by culture, less often by race) in recognizing facial and vocal emotional expressions. African international students were generally less accurate at interpreting American nonverbal stimuli than were European American, African American, and European international peers. Results suggest that, although partly universal, emotional expressions have subtle differences across cultures that persons must learn.

  11. Induction of depressed and elated mood by music influences the perception of facial emotional expressions in healthy subjects.

    Science.gov (United States)

    Bouhuys, A L; Bloem, G M; Groothuis, T G

    1995-04-04

    The judgements of healthy subjects rating the emotional expressions of a set of schematically drawn faces were validated (study 1) to examine the relationship between mood (depressed/elated) and judgement of the emotional expressions of these faces (study 2). Study 1: 30 healthy subjects judged 12 faces with respect to the emotions they express (fear, happiness, anger, sadness, disgust, surprise, rejection and invitation). It was found that a particular face could reflect various emotions. All eight emotions were reflected in the set of faces and the emotions were consensually judged. Moreover, gender differences in judgement could be established. Study 2: In a cross-over design, 24 healthy subjects judged the faces after listening to depressing or elating music. The faces were subdivided into six 'ambiguous' faces (i.e., expressing similar amounts of positive and negative emotions) and six 'clear' faces (i.e., faces showing a preponderance of positive or negative emotions). In addition, these two types of faces were distinguished with respect to the intensity of emotions they express. Eleven subjects who showed substantial differences in experienced depression after listening to the music were selected for further analysis. It was found that, when feeling more depressed, the subjects perceived more rejection/sadness in ambiguous faces (displaying less intensive emotions) and less invitation/happiness in clear faces. In addition, subjects saw more fear in clear faces that express less intensive emotions. Hence, results show a depression-related negative bias in the perception of facial displays.

  12. The Perceived Social Costs and Importance of Seeking Emotional Support in the Workplace: Gender Differences and Similarities.

    Science.gov (United States)

    Cahill, Daniel J.; Sias, Patricia M.

    1997-01-01

    Investigates gender differences and similarities in the perceived social costs and importance of seeking emotional support regarding work-related problems. Finds women perceived such support to be more important than did men. Finds no gender differences regarding perceived social costs associated with seeking support from coworkers. Finds women…

  13. Effects of Face and Background Color on Facial Expression Perception

    Directory of Open Access Journals (Sweden)

    Tetsuto Minami

    2018-06-01

    Full Text Available Detecting others' emotional states from their faces is an essential component of successful social interaction. However, the ability to perceive emotional expressions is reported to be modulated by a number of factors. We have previously found that facial color modulates the judgment of facial expression, while another study has shown that background color plays a modulatory role. Therefore, in this study, we directly compared the effects of face and background color on facial expression judgment within a single experiment. Fear-to-anger morphed faces were presented in face and background color conditions. Our results showed that judgments of facial expressions were influenced by both face and background color. However, facial color effects were significantly greater than background color effects, although the color saturation of faces was lower compared to background colors. These results suggest that facial color is intimately related to the judgment of facial expression, over and above the influence of simple color.

  14. Steroids facing emotions

    NARCIS (Netherlands)

    Putman, P.L.J.

    2006-01-01

    The studies reported in this thesis have been performed to gain a better understanding about motivational mediators of selective attention and memory for emotionally relevant stimuli, and about the roles that some steroid hormones play in regulation of human motivation and emotion. The stimuli used

  15. Time perception and dynamics of facial expressions of emotions.

    Directory of Open Access Journals (Sweden)

    Sophie L Fayolle

    Full Text Available Two experiments were run to examine the effects of dynamic displays of facial expressions of emotions on time judgments. The participants were given a temporal bisection task with emotional facial expressions presented in a dynamic or a static display. Two emotional facial expressions and a neutral expression were tested and compared. Each of the emotional expressions had the same affective valence (unpleasant), but one was high-arousing (expressing anger) and the other low-arousing (expressing sadness). Our results showed that time judgments are highly sensitive to movements in facial expressions and the emotions expressed. Indeed, longer perceived durations were found in response to the dynamic faces and the high-arousing emotional expressions compared to the static faces and low-arousing expressions. In addition, the facial movements amplified the effect of emotions on time perception. Dynamic facial expressions are thus interesting tools for examining variations in temporal judgments in different social contexts.
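    A temporal bisection task of this kind is usually analyzed by fitting a psychometric function to the proportion of "long" responses at each probe duration and reading off the bisection point (the duration judged "long" half the time). The sketch below illustrates that general approach with a cumulative Gaussian; the probe durations and response proportions are invented, not taken from the study.

        # Hedged sketch of a temporal-bisection analysis: fit a cumulative
        # Gaussian to fabricated proportion-"long" data and estimate the
        # bisection point. Durations and proportions are assumptions.
        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import norm

        durations = np.array([200, 300, 400, 500, 600, 700, 800])  # ms, assumed
        p_long = np.array([0.04, 0.12, 0.30, 0.52, 0.74, 0.89, 0.96])

        def psychometric(d, mu, sigma):
            """Cumulative Gaussian: probability of responding 'long' at duration d."""
            return norm.cdf(d, loc=mu, scale=sigma)

        (mu, sigma), _ = curve_fit(psychometric, durations, p_long, p0=[500, 150])
        print(f"bisection point ~ {mu:.0f} ms, spread ~ {sigma:.0f} ms")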

  16. Effects of induced sad mood on facial emotion perception in young and older adults.

    Science.gov (United States)

    Lawrie, Louisa; Jackson, Margaret C; Phillips, Louise H

    2018-02-15

    Older adults perceive less intense negative emotion in facial expressions compared to younger counterparts. Prior research has also demonstrated that mood alters facial emotion perception. Nevertheless, there is little evidence that evaluates the interactive effects of age and mood on emotion perception. This study investigated the effects of sad mood on younger and older adults' perception of emotional and neutral faces. Participants rated the intensity of stimuli while listening to sad music and in silence. Measures of mood were administered. Younger and older participants rated sad faces as displaying stronger sadness when they experienced sad mood. While younger participants showed no influence of sad mood on happiness ratings of happy faces, older adults rated happy faces as conveying less happiness when they experienced sad mood. This study demonstrates how emotion perception can change when a controlled mood induction procedure is applied to alter mood in young and older participants.

  17. Spatiotemporal brain dynamics of emotional face processing modulations induced by the serotonin 1A/2A receptor agonist psilocybin.

    Science.gov (United States)

    Bernasconi, Fosco; Schmidt, André; Pokorny, Thomas; Kometer, Michael; Seifritz, Erich; Vollenweider, Franz X

    2014-12-01

    Emotional face processing is critically modulated by the serotonergic system. For instance, emotional face processing is impaired by acute psilocybin administration, a serotonin (5-HT) 1A and 2A receptor agonist. However, the spatiotemporal brain mechanisms underlying these modulations are poorly understood. Here, we investigated the spatiotemporal brain dynamics underlying psilocybin-induced modulations during emotional face processing. Electrical neuroimaging analyses were applied to visual evoked potentials in response to emotional faces, following psilocybin and placebo administration. Our results indicate a first time period of strength (i.e., Global Field Power) modulation over the 168-189 ms poststimulus interval, induced by psilocybin. A second time period of strength modulation was identified over the 211-242 ms poststimulus interval. Source estimations over these 2 time periods further revealed decreased activity in response to both neutral and fearful faces within limbic areas, including amygdala and parahippocampal gyrus, and the right temporal cortex over the 168-189 ms interval, and reduced activity in response to happy faces within limbic and right temporo-occipital brain areas over the 211-242 ms interval. Our results indicate a selective and temporally dissociable effect of psilocybin on the neuronal correlates of emotional face processing, consistent with a modulation of the top-down control. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. Cognitive emotion regulation in children: Reappraisal of emotional faces modulates neural source activity in a frontoparietal network.

    Science.gov (United States)

    Wessing, Ida; Rehbein, Maimu A; Romer, Georg; Achtergarde, Sandra; Dobel, Christian; Zwitserlood, Pienie; Fürniss, Tilman; Junghöfer, Markus

    2015-06-01

    Emotion regulation has an important role in child development and psychopathology. Reappraisal as a cognitive regulation technique can be used effectively by children. Moreover, an ERP component known to reflect emotional processing called late positive potential (LPP) can be modulated by children using reappraisal, and this modulation is also related to children's emotional adjustment. The present study seeks to elucidate the neural generators of such LPP effects. To this end, children aged 8-14 years reappraised emotional faces, while neural activity in an LPP time window was estimated using magnetoencephalography-based source localization. Additionally, neural activity was correlated with two indexes of emotional adjustment and age. Reappraisal reduced activity in the left dorsolateral prefrontal cortex during down-regulation and enhanced activity in the right parietal cortex during up-regulation. Activity in the visual cortex decreased with increasing age, more adaptive emotion regulation and less anxiety. Results demonstrate that reappraisal changed activity within a frontoparietal network in children. Decreasing activity in the visual cortex with increasing age is suggested to reflect neural maturation. A similar decrease with adaptive emotion regulation and less anxiety implies that better emotional adjustment may be associated with an advance in neural maturation. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Does perceived race affect discrimination and recognition of ambiguous-race faces? A test of the sociocognitive hypothesis.

    Science.gov (United States)

    Rhodes, Gillian; Lie, Hanne C; Ewing, Louise; Evangelista, Emma; Tanaka, James W

    2010-01-01

    Discrimination and recognition are often poorer for other-race than own-race faces. These other-race effects (OREs) have traditionally been attributed to reduced perceptual expertise, resulting from more limited experience, with other-race faces. However, recent findings suggest that sociocognitive factors, such as reduced motivation to individuate other-race faces, may also contribute. If the sociocognitive hypothesis is correct, then it should be possible to alter discrimination and memory performance for identical faces by altering their perceived race. We made identical ambiguous-race morphed faces look either Asian or Caucasian by presenting them in Caucasian or Asian face contexts, respectively. However, this perceived-race manipulation had no effect on either discrimination (Experiment 1) or memory (Experiment 2) for the ambiguous-race faces, despite the presence of the usual OREs in discrimination and recognition of unambiguous Asian and Caucasian faces in our participant population. These results provide no support for the sociocognitive hypothesis. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  20. Emotions facilitate the communication of ambiguous group memberships.

    Science.gov (United States)

    Tskhay, Konstantin O; Rule, Nicholas O

    2015-12-01

    It is well known that emotions intersect with obvious social categories (e.g., race), influencing both how targets are categorized and the emotions that are read from their faces. Here, we examined the influence of emotional expression on the perception of less obvious group memberships for which, in the absence of obvious and stable physical markers, emotion may serve as a major avenue for group categorization and identification. Specifically, we examined whether emotions are embedded in the mental representations of sexual orientation and political affiliation, and whether people may use emotional expressions to communicate these group memberships to others. Using reverse correlation methods, we found that mental representations of gay and liberal faces were characterized by more positive facial expressions than mental representations of straight and conservative faces (Study 1). Furthermore, participants were evaluated as expressing more positive emotions when enacting self-defined "gay" and "liberal" versus "straight" and "conservative" facial expressions in the lab (Study 2). In addition, neutral faces morphed with happiness were perceived as more gay than when morphed with anger, and when compared to unmorphed controls (Study 3). Finally, we found that affect facilitated perceptions of sexual orientation and political affiliation in naturalistic settings (Study 4). Together, these studies suggest that emotion is a defining characteristic of person construal that people tend to use both when signaling their group memberships and when receiving those signals to categorize others. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  1. Is fear in your head? A comparison of instructed and real-life expressions of emotion in the face and body.

    Science.gov (United States)

    Abramson, Lior; Marom, Inbal; Petranker, Rotem; Aviezer, Hillel

    2017-04-01

    The majority of emotion perception studies utilize instructed and stereotypical expressions of faces or bodies. While such stimuli are highly standardized and well-recognized, their resemblance to real-life expressions of emotion remains unknown. Here we examined facial and body expressions of fear and anger during real-life situations and compared their recognition to that of instructed expressions of the same emotions. In order to examine the source of the affective signal, expressions of emotion were presented as faces alone, bodies alone, and naturally, as faces with bodies. The results demonstrated striking deviations between recognition of instructed and real-life stimuli, which differed as a function of the emotion expressed. In real-life fearful expressions of emotion, bodies were far better recognized than faces, a pattern not found with instructed expressions of emotion. Anger reactions were better recognized from the body than from the face in both real-life and instructed stimuli. However, the real-life stimuli were overall better recognized than their instructed counterparts. These results indicate that differences between instructed and real-life expressions of emotion are prevalent and raise caution against an overreliance of researchers on instructed affective stimuli. The findings also demonstrate that in real life, facial expression perception may rely heavily on information from the contextualizing body. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Cortical deficits of emotional face processing in adults with ADHD: its relation to social cognition and executive function.

    Science.gov (United States)

    Ibáñez, Agustin; Petroni, Agustin; Urquina, Hugo; Torrente, Fernando; Torralva, Teresa; Hurtado, Esteban; Guex, Raphael; Blenkmann, Alejandro; Beltrachini, Leandro; Muravchik, Carlos; Baez, Sandra; Cetkovich, Marcelo; Sigman, Mariano; Lischinsky, Alicia; Manes, Facundo

    2011-01-01

    Although it has been shown that adults with attention-deficit hyperactivity disorder (ADHD) have impaired social cognition, no previous study has reported the brain correlates of face valence processing. This study looked for behavioral, neuropsychological, and electrophysiological markers of emotion processing for faces (N170) in adult ADHD compared to controls matched by age, gender, educational level, and handedness. We designed an event-related potential (ERP) study based on a dual valence task (DVT), in which faces and words were presented to test the effects of stimulus type (faces, words, or face-word stimuli) and valence (positive versus negative). Individual signatures of cognitive functioning in participants with ADHD and controls were assessed with a comprehensive neuropsychological evaluation, including executive functioning (EF) and theory of mind (ToM). Compared to controls, the adult ADHD group showed deficits in N170 emotion modulation for facial stimuli. These N170 impairments were observed in the absence of any deficit in facial structural processing, suggesting a specific ADHD impairment in early facial emotion modulation. The cortical current density mapping of N170 yielded a main neural source of N170 at posterior section of fusiform gyrus (maximum at left hemisphere for words and right hemisphere for faces and simultaneous stimuli). Neural generators of N170 (fusiform gyrus) were reduced in ADHD. In those patients, N170 emotion processing was associated with performance on an emotional inference ToM task, and N170 from simultaneous stimuli was associated with EF, especially working memory. This is the first report to reveal an adult ADHD-specific impairment in the cortical modulation of emotion for faces and an association between N170 cortical measures and ToM and EF.

  3. Age Differences in the Complexity of Emotion Perception.

    Science.gov (United States)

    Kim, Seungyoun; Geren, Jennifer L; Knight, Bob G

    2015-01-01

    The current study examined age differences in the number of emotion components used in the judgment of emotion from facial expressions. Fifty-eight younger and 58 older adults were compared on the complexity of perception of emotion from standardized facial expressions that were either clear or ambiguous exemplars of emotion. Using an intra-individual factor analytic approach, results showed that older adults used more emotion components in perceiving emotion in faces than younger adults. Both age groups reported greater emotional complexity for the clear and prototypical emotional stimuli. Age differences in emotional complexity were more pronounced for the ambiguous expressions compared with the clear expressions. These findings demonstrate that older adults showed increased elaboration of emotion, particularly when emotion cues were subtle and provide support for greater emotion differentiation in older adulthood.

  4. Sleep deprivation reduces perceived emotional intelligence and constructive thinking skills.

    Science.gov (United States)

    Killgore, William D S; Kahn-Greene, Ellen T; Lipizzi, Erica L; Newman, Rachel A; Kamimori, Gary H; Balkin, Thomas J

    2008-07-01

    Insufficient sleep can adversely affect a variety of cognitive abilities, ranging from simple alertness to higher-order executive functions. Although the effects of sleep loss on mood and cognition are well documented, there have been no controlled studies examining its effects on perceived emotional intelligence (EQ) and constructive thinking, abilities that require the integration of affect and cognition and are central to adaptive functioning. Twenty-six healthy volunteers completed the Bar-On Emotional Quotient Inventory (EQi) and the Constructive Thinking Inventory (CTI) at rested baseline and again after 55.5 and 58 h of continuous wakefulness, respectively. Relative to baseline, sleep deprivation was associated with lower scores on Total EQ (decreased global emotional intelligence), Intrapersonal functioning (reduced self-regard, assertiveness, sense of independence, and self-actualization), Interpersonal functioning (reduced empathy toward others and quality of interpersonal relationships), Stress Management skills (reduced impulse control and difficulty with delay of gratification), and Behavioral Coping (reduced positive thinking and action orientation). Esoteric Thinking (greater reliance on formal superstitions and magical thinking processes) was increased. These findings are consistent with the neurobehavioral model suggesting that sleep loss produces temporary changes in cerebral metabolism, cognition, emotion, and behavior consistent with mild prefrontal lobe dysfunction.

  5. Ratings of Emotion in Laterally Presented Faces: Sex and handedness effects

    NARCIS (Netherlands)

    van Strien, J.W.; van Beek, S.

    2000-01-01

    Sixteen right-handed participants (8 male and 8 female students) and 16 left-handed participants (8 male and 8 female students) were presented with cartoon faces expressing emotions ranging from extremely positive to extremely negative. A forced-choice paradigm was used in which the participants

  6. Emotion regulation strategies in bipolar II disorder and borderline personality disorder: differences and relationships with perceived parental style.

    Science.gov (United States)

    Fletcher, Kathryn; Parker, Gordon; Bayes, Adam; Paterson, Amelia; McClure, Georgia

    2014-03-01

    Bipolar II disorder (BP II) and Borderline Personality Disorder (BPD) share common features and can be difficult to differentiate, contributing to misdiagnosis and inappropriate treatment. Research contrasting phenomenological features of both conditions is limited. The current study sought to identify differences in emotion regulation strategies in BP II and BPD in addition to examining relationships with perceived parental style. Participants were recruited from a variety of outpatient and community settings. Eligible participants required a clinical diagnosis of BP II or BPD, subsequently confirmed via structured diagnostic interviews assessing DSM-IV criteria. Participants completed a series of self-reported questionnaires assessing emotion regulation strategies and perceived parental style. The sample comprised 48 (n=24 BP II and n=24 BPD) age and gender-matched participants. Those with BPD were significantly more likely to use maladaptive emotion regulation strategies, less likely to use adaptive emotion regulation strategies, and scored significantly higher on the majority of (perceived) dysfunctional parenting sub-scales than participants with BP II. Dysfunctional parenting experiences were related to maladaptive emotion regulation strategies in participants with BP II and BPD, however differential associations were observed across groups. Relatively small sample sizes; lack of a healthy control comparator group; lack of statistical control for differing sociodemographic and clinical characteristics, medication and psychological treatments; no assessment of state or trait anxiety; over-representation of females in both groups limiting generalisability of results; and reliance on self-report measures. Differences in emotion regulation strategies and perceived parental style provide some support for the validity of distinguishing BP II and BPD. Development of intervention strategies targeting the differing forms of emotion regulatory pathology in these groups

  7. Cyber Victimization in High School: Measurement, Overlap with Face-to-Face Victimization, and Associations with Social-Emotional Outcomes

    Science.gov (United States)

    Brown, Christina Flynn; Demaray, Michelle Kilpatrick; Tennant, Jaclyn E.; Jenkins, Lyndsay N.

    2017-01-01

    Cyber victimization is a contemporary problem facing youth and adolescents (Diamanduros, Downs, & Jenkins, 2008; Kowalski & Limber, 2007). It is imperative for researchers and school personnel to understand the associations between cyber victimization and student social-emotional outcomes. This article explores (a) gender differences in…

  8. Treatment of obesity in children: Parent's perceived emotional barriers as predictor of change in body fat.

    Science.gov (United States)

    Steinsbekk, Silje; Odegård, Rønnaug; Wichstrøm, Lars

    2011-01-01

    Research supports the use of family-based interventions in the treatment of obesity in children, but there is a lack of knowledge about what factors affect parents' ability to carry out the lifestyle changes necessary to reduce their child's obesity. The aim of the present study was to examine whether parents' self-efficacy, perceived emotional barriers, subjective norms, and attitudes could predict change in their children's body fat at 6 month and 2 year follow-ups after a family-based treatment of obesity. Body Mass Index Standard Deviation Scores (BMI SDS) were calculated and body fat (dual-energy X-ray absorptiometry) was measured in 99 treatment-seeking children with obesity (ages 7-12; 48 girls, 51 boys; mean BMI SDS = 2.99) at baseline and at the 6 month and 2 year follow-ups. Parental cognitions regarding diet and physical activity were examined by parent-completed questionnaires. Structural equation modeling (SEM) was used to test whether the selected health cognitions could predict treatment outcome. Parental perceived emotional barriers were a significant predictor of change in body fat at 6 month (β = -.32, p = .001) and 2 year (β = -.38, p = .002) follow-up when the initial body fat values were controlled. Self-efficacy, subjective norms and attitudes did not improve the amount of variance explained. Parents' perceived emotional barriers significantly predict change in total body fat in children treated for obesity. In order to increase treatment efficacy, perceived emotional barriers should be addressed. © 2011 Asian Oceanian Association for the Study of Obesity. Published by Elsevier Ltd. All rights reserved.

  9. P2-27: Electrophysiological Correlates of Conscious and Unconscious Processing of Emotional Faces in Individuals with High and Low Autistic Traits

    Directory of Open Access Journals (Sweden)

    Svjetlana Vukusic

    2012-10-01

    Full Text Available LeDoux (1996, The Emotional Brain) has suggested that subconscious presentation of fearful emotional information is relayed to the amygdala along a rapid subcortical route. Rapid emotion processing is important because it alerts other parts of the brain to emotionally salient information. It also produces immediate reflexive responses to threatening stimuli in comparison to slower conscious appraisal, which is of important adaptive survival value. Current theoretical models of autism spectrum disorders (ASD) have linked impairments in the processing of emotional information to amygdala dysfunction. It can be suggested that the impairment in face processing found in autism may be the result of impaired rapid subconscious processing of emotional information, which does not make faces socially salient. Previous studies examined subconscious processing of emotional stimuli with backward masking paradigms, using very brief presentation of emotional face stimuli followed by a mask. We used an event-related potential (ERP) study within a backward masking paradigm with subjects with low and high autistic tendencies as measured by the Autism Spectrum Quotient (AQ) questionnaire. The time course of processing of fearful and happy facial expressions and an emotionally neutral face was investigated during subliminal (16 ms) and supraliminal (166 ms) stimulus presentation. The task consisted of an explicit categorization of emotional and neutral faces. We looked at ERP components N2, P3a, and also N170 for differences between subjects with low (< 19) and high (> 19) AQ scores.

  10. Emotional intelligence, life satisfaction and subjective happiness in female student health professionals: the mediating effect of perceived stress.

    Science.gov (United States)

    Ruiz-Aranda, D; Extremera, N; Pineda-Galán, C

    2014-03-01

    The objective of the present study was to extend previous findings by examining the relationship between emotional intelligence (EI) and well-being indicators (life satisfaction and happiness) in a 12-week follow-up study. In addition, we examined the influence of perceived stress on the relationship between EI and well-being. Female students from the School of Health Sciences (n = 264) completed an ability measure of emotional intelligence. After 12 weeks, participants completed the Perceived Stress Scale, Satisfaction with Life Scale and Subjective Happiness Scale. Participants with higher EI reported less perceived stress and higher levels of life satisfaction and happiness. The results of this study suggest that perceived stress mediates the relationship between EI and well-being indicators, specifically life satisfaction and happiness. These findings suggest an underlying process by which high emotional intelligence may increase well-being in female students in nursing and allied health sciences by reducing the experience of stress. The implications of these findings for future research and for working with health professions to improve well-being outcomes are discussed. © 2013 John Wiley & Sons Ltd.

  11. The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects

    Science.gov (United States)

    Barber, Anjuli L. A.; Randi, Dania; Müller, Corsin A.; Huber, Ludwig

    2016-01-01

    Of all non-human animals, dogs are very likely the best decoders of human behavior. In addition to a high sensitivity to human attentive status and to ostensive cues, they are able to distinguish between individual human faces and even between human facial expressions. However, so far little is known about how they process human faces and to what extent this is influenced by experience. Here we present an eye-tracking study with dogs from two different living environments with varying experience with humans: pet and lab dogs. The dogs were shown pictures of familiar and unfamiliar human faces expressing four different emotions. The results, extracted from several different eye-tracking measurements, revealed pronounced differences in the face processing of pet and lab dogs, thus indicating an influence of the amount of exposure to humans. In addition, there was some evidence for the influence of both the familiarity and the emotional expression of the face, and strong evidence for a left gaze bias. These findings, together with recent evidence for the dog's ability to discriminate human facial expressions, indicate that dogs are sensitive to some emotions expressed in human faces. PMID:27074009

  12. The NMDA antagonist ketamine and the 5-HT agonist psilocybin produce dissociable effects on structural encoding of emotional face expressions.

    Science.gov (United States)

    Schmidt, André; Kometer, Michael; Bachmann, Rosilla; Seifritz, Erich; Vollenweider, Franz

    2013-01-01

    Both glutamate and serotonin (5-HT) play a key role in the pathophysiology of emotional biases. Recent studies indicate that the glutamate N-methyl-D-aspartate (NMDA) receptor antagonist ketamine and the 5-HT receptor agonist psilocybin are implicated in emotion processing. However, as yet, no study has systematically compared their contribution to emotional biases. This study used event-related potentials (ERPs) and signal detection theory to compare the effects of the NMDA (via S-ketamine) and 5-HT (via psilocybin) receptor systems on non-conscious or conscious emotional face processing biases. S-ketamine or psilocybin was administered to two groups of healthy subjects in a double-blind within-subject placebo-controlled design. We behaviorally assessed objective thresholds for non-conscious discrimination in all drug conditions. Electrophysiological responses to fearful, happy, and neutral faces were subsequently recorded with the face-specific P100 and N170 ERP. Both S-ketamine and psilocybin impaired the encoding of fearful faces as expressed by a reduced N170 over parieto-occipital brain regions. In contrast, while S-ketamine also impaired the encoding of happy facial expressions, psilocybin had no effect on the N170 in response to happy faces. This study demonstrates that the NMDA and 5-HT receptor systems differentially contribute to the structural encoding of emotional face expressions as expressed by the N170. These findings suggest that the assessment of early visual evoked responses might allow detecting pharmacologically induced changes in emotional processing biases and thus provides a framework to study the pathophysiology of dysfunctional emotional biases.
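    The signal detection analysis mentioned above is typically based on d-prime, computed from hit and false-alarm rates. The sketch below shows that standard computation on fabricated trial counts; it illustrates the general method, not the study's actual analysis pipeline.

        # Hedged sketch of a d-prime calculation from fabricated trial counts.
        from scipy.stats import norm

        hits, misses = 42, 18                        # "signal present" trials (assumed)
        false_alarms, correct_rejections = 12, 48    # "signal absent" trials (assumed)

        def rate(numerator, denominator):
            """Proportion with a small correction to avoid exact 0 or 1."""
            return (numerator + 0.5) / (denominator + 1.0)

        hit_rate = rate(hits, hits + misses)
        fa_rate = rate(false_alarms, false_alarms + correct_rejections)

        d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
        print(f"hit rate = {hit_rate:.2f}, false-alarm rate = {fa_rate:.2f}, d' = {d_prime:.2f}")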

  13. The recognition of emotional expression in prosopagnosia: decoding whole and part faces.

    Science.gov (United States)

    Stephan, Blossom Christa Maree; Breen, Nora; Caine, Diana

    2006-11-01

    Prosopagnosia is currently viewed within the constraints of two competing theories of face recognition, one highlighting the analysis of features, the other focusing on configural processing of the whole face. This study investigated the role of feature analysis versus whole face configural processing in the recognition of facial expression. A prosopagnosic patient, SC, made expression decisions from whole and incomplete (eyes-only and mouth-only) faces where features had been obscured. SC was impaired at recognizing some (e.g., anger, sadness, and fear), but not all (e.g., happiness) emotional expressions from the whole face. Analyses of his performance on incomplete faces indicated that his recognition of some expressions actually improved relative to his performance on the whole face condition. We argue that in SC, interference from damaged configural processes seems to override an intact ability to utilize part-based or local feature cues.

  14. Dysregulation in cortical reactivity to emotional faces in PTSD patients with high dissociation symptoms

    Directory of Open Access Journals (Sweden)

    Aleksandra Klimova

    2013-09-01

    Full Text Available Background: Predominant dissociation in posttraumatic stress disorder (PTSD) is characterized by restricted affective responses to positive stimuli. To date, no studies have examined neural responses to a range of emotional expressions in PTSD with high dissociative symptoms. Objective: This study tested the hypothesis that PTSD patients with high dissociative symptoms will display increased event-related potential (ERP) amplitudes in early components (N1, P1) to threatening faces (angry, fearful), and reduced later ERP amplitudes (vertex positive potential (VPP), P3) to happy faces compared to PTSD patients with low dissociative symptoms. Methods: Thirty-nine civilians with PTSD were classified as high dissociative (n=16) or low dissociative (n=23) according to their responses on the Clinician Administered Dissociative States Scale. ERPs were recorded, whilst participants viewed emotional (happy, angry, fear) and neutral facial expressions in a passive viewing task. Results: High dissociative PTSD patients displayed significantly increased N120 amplitude to the majority of facial expressions (neutral, happy, and angry) compared to low dissociative PTSD patients under conscious and preconscious conditions. The high dissociative PTSD group had significantly reduced VPP amplitude to happy faces in the conscious condition. Conclusion: High dissociative PTSD patients displayed increased early (preconscious) cortical responses to emotional stimuli, and specific reductions to happy facial expressions in later (conscious), face-specific components compared to low dissociative PTSD patients. Dissociation in PTSD may act to increase initial pre-attentive processing of affective stimuli, and specifically reduce cortical reactivity to happy faces when consciously processing these stimuli.

  15. Integrated color face graphs for plant accident display

    International Nuclear Information System (INIS)

    Hara, Fumio

    1987-01-01

    This paper presents an integrated man-machine interface that uses cartoon-like colored graphs in the form of faces that, through different facial expressions, display the plant condition. This is done by drawing the face on a CRT by nonlinearly transforming 31 variables and coloring the face. This integrated color graphics technique is applied to display the progress of events in the Three Mile Island nuclear power plant accident. Human visual perceptive characteristics are investigated in relation to the perception of the plant accident process, the naturalness of face color change, and the consistency between facial expressions and colors. This paper concludes that colors used in integrated color face graphs must be completely consistent with the emotional feelings perceived from the colors. (author)

  16. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Directory of Open Access Journals (Sweden)

    Teresa A Victor

    Full Text Available Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. Unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  17. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Science.gov (United States)

    Victor, Teresa A; Furey, Maura L; Fromm, Stephen J; Bellgowan, Patrick S F; Öhman, Arne; Drevets, Wayne C

    2012-01-01

    Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. Unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  18. The face is not an empty canvas: how facial expressions interact with facial appearance.

    Science.gov (United States)

    Hess, Ursula; Adams, Reginald B; Kleck, Robert E

    2009-12-12

    Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.

  19. Neutral face classification using personalized appearance models for fast and robust emotion detection.

    Science.gov (United States)

    Chiranjeevi, Pojala; Gopalakrishnan, Viswanath; Moogi, Pratibha

    2015-09-01

    Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, and so on, in the limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in typical applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage, and thereby bypassing those frames from emotion classification, would save computational power. In this paper, we propose a lightweight neutral versus emotion classification engine, which acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at key emotion (KE) points using a statistical texture model, constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the statistical texture model. Robustness to dynamic shift of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information about the directionality of the specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
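    A minimal sketch of the underlying idea, comparing texture patches around key emotion (KE) points against a per-user neutral template, is given below. The patch sizes, similarity threshold and normalized-correlation measure are illustrative assumptions; the paper's statistical texture model and affine handling are not reproduced.

```python
# Sketch of the core idea behind neutral-vs-emotion screening: compare small
# patches around key emotion (KE) points against a per-user neutral template
# using normalized cross-correlation. Patch extraction, KE localisation and
# the paper's statistical texture model are not reproduced here.
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def is_neutral(frame_patches, neutral_patches, threshold=0.8):
    """Declare the frame neutral if KE-point patches match the neutral model."""
    scores = [ncc(f, n) for f, n in zip(frame_patches, neutral_patches)]
    return np.mean(scores) > threshold   # threshold is an illustrative choice

# Usage with dummy 16x16 patches around, say, 5 KE points
rng = np.random.default_rng(0)
neutral = [rng.normal(size=(16, 16)) for _ in range(5)]
frame = [p + rng.normal(scale=0.1, size=(16, 16)) for p in neutral]
print(is_neutral(frame, neutral))        # likely True for a near-neutral frame
```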

  20. Altered Functional Subnetwork During Emotional Face Processing: A Potential Intermediate Phenotype for Schizophrenia.

    Science.gov (United States)

    Cao, Hengyi; Bertolino, Alessandro; Walter, Henrik; Schneider, Michael; Schäfer, Axel; Taurisano, Paolo; Blasi, Giuseppe; Haddad, Leila; Grimm, Oliver; Otto, Kristina; Dixson, Luanna; Erk, Susanne; Mohnke, Sebastian; Heinz, Andreas; Romanczuk-Seiferth, Nina; Mühleisen, Thomas W; Mattheisen, Manuel; Witt, Stephanie H; Cichon, Sven; Noethen, Markus; Rietschel, Marcella; Tost, Heike; Meyer-Lindenberg, Andreas

    2016-06-01

    Although deficits in emotional processing are prominent in schizophrenia, it has been difficult to identify neural mechanisms related to the genetic risk for this highly heritable illness. Prior studies have not found consistent regional activation or connectivity alterations in first-degree relatives compared with healthy controls, suggesting that a more comprehensive search for connectomic biomarkers is warranted. To identify a potential systems-level intermediate phenotype linked to emotion processing in schizophrenia and to examine the psychological association, task specificity, test-retest reliability, and clinical validity of the identified phenotype. The study was performed in university research hospitals from June 1, 2008, through December 31, 2013. We examined 58 unaffected first-degree relatives of patients with schizophrenia and 94 healthy controls with an emotional face-matching functional magnetic resonance imaging paradigm. Test-retest reliability was analyzed with an independent sample of 26 healthy participants. A clinical association study was performed in 31 patients with schizophrenia and 45 healthy controls. Data analysis was performed from January 1 to September 30, 2014. Conventional amygdala activity and seeded connectivity measures, graph-based global and local network connectivity measures, Spearman rank correlation, intraclass correlation, and gray matter volumes. Among the 152 volunteers included in the relative-control sample, 58 were unaffected first-degree relatives of patients with schizophrenia (mean [SD] age, 33.29 [12.56]; 38 were women), and 94 were healthy controls without a first-degree relative with mental illness (mean [SD] age, 32.69 [10.09] years; 55 were women). A graph-theoretical connectivity approach identified significantly decreased connectivity in a subnetwork that primarily included the limbic cortex, visual cortex, and subcortex during emotional face processing (cluster-level P corrected for familywise error =
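    As an illustration of the graph-based connectivity approach mentioned above, the sketch below thresholds a hypothetical region-by-region correlation matrix and computes two standard graph measures with networkx. It is a generic example, not the study's subnetwork statistic.

```python
# Sketch of a graph-based connectivity measure: threshold a region-by-region
# correlation matrix into a graph and compute global efficiency and mean
# clustering. Purely illustrative; the study's specific subnetwork analysis
# is not reproduced.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n_regions = 90
conn = np.corrcoef(rng.normal(size=(n_regions, 200)))  # placeholder FC matrix
np.fill_diagonal(conn, 0)

# Keep only the strongest 10% of connections (proportional threshold)
thresh = np.percentile(np.abs(conn), 90)
adj = (np.abs(conn) >= thresh).astype(int)

G = nx.from_numpy_array(adj)
print("global efficiency:", nx.global_efficiency(G))
print("mean clustering:", nx.average_clustering(G))
```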

  1. Do perceived high performance work systems influence the relationship between emotional labour, burnout and intention to leave? A study of Australian nurses.

    Science.gov (United States)

    Bartram, Timothy; Casimir, Gian; Djurkovic, Nick; Leggat, Sandra G; Stanton, Pauline

    2012-07-01

    The purpose of this article was to explore the relationships between perceived high performance work systems, emotional labour, burnout and intention to leave among nurses in Australia. Previous studies show that emotional labour and burnout are associated with an increase in nurses' intention to leave. There is evidence that high performance work systems are associated with a decrease in turnover. There are no previous studies that examine the relationship between high performance work systems and emotional labour. A cross-sectional, correlational survey. The study was conducted in Australia in 2008 with 183 nurses. Three hypotheses were tested with validated measures of emotional labour, burnout, intention to leave, and perceived high performance work systems. Principal component analysis was used to examine the structure of the measures. The mediation hypothesis was tested using Baron and Kenny's procedure and the moderation hypothesis was tested using hierarchical regression with a product term. Emotional labour is positively associated with both burnout and intention to leave. Burnout mediates the relationship between emotional labour and intention to leave. Perceived high performance work systems negatively moderate the relationship between emotional labour and burnout: they not only reduce the strength of the negative effect of emotional labour on burnout but also have a unique negative effect on intention to leave. Ensuring effective human resource management practice through the implementation of high performance work systems may reduce the burnout associated with emotional labour. This may assist healthcare organizations to reduce nurse turnover. © 2012 Blackwell Publishing Ltd.
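    The mediation and moderation tests named in the abstract (Baron and Kenny's steps and a product term in hierarchical regression) can be sketched as follows. The column names and data file are hypothetical stand-ins for the study's measures.

```python
# Illustrative Baron-and-Kenny-style mediation and product-term moderation
# with statsmodels; column names (emo_labour, burnout, intent_leave, hpws)
# are hypothetical, not the study's actual variable names.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nurse_survey.csv")   # assumed data file with scale scores

# Mediation steps: X -> Y, X -> M, then X + M -> Y
step1 = smf.ols("intent_leave ~ emo_labour", data=df).fit()
step2 = smf.ols("burnout ~ emo_labour", data=df).fit()
step3 = smf.ols("intent_leave ~ emo_labour + burnout", data=df).fit()

# Moderation: mean-center predictors and test the product term
df["emo_c"] = df.emo_labour - df.emo_labour.mean()
df["hpws_c"] = df.hpws - df.hpws.mean()
moderation = smf.ols("burnout ~ emo_c * hpws_c", data=df).fit()
print(moderation.summary().tables[1])  # inspect the emo_c:hpws_c coefficient
```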

  2. Gender differences in facial emotion recognition in persons with chronic schizophrenia.

    Science.gov (United States)

    Weiss, Elisabeth M; Kohler, Christian G; Brensinger, Colleen M; Bilker, Warren B; Loughead, James; Delazer, Margarete; Nolan, Karen A

    2007-03-01

    The aim of the present study was to investigate possible sex differences in the recognition of facial expressions of emotion and to investigate the pattern of classification errors in schizophrenic males and females. Such an approach provides an opportunity to inspect the degree to which males and females differ in perceiving and interpreting the different emotions displayed to them and to analyze which emotions are most susceptible to recognition errors. Fifty-six chronically hospitalized schizophrenic patients (38 men and 18 women) completed the Penn Emotion Recognition Test (ER40), a computerized emotion discrimination test presenting 40 color photographs of evoked happy, sad, angry, fearful and neutral expressions, balanced for poser gender and ethnicity. We found a significant sex difference in the patterns of error rates in the Penn Emotion Recognition Test. Neutral faces were more commonly mistaken as angry in schizophrenic men, whereas schizophrenic women misinterpreted neutral faces more frequently as sad. Moreover, female faces were better recognized overall, but fear was better recognized in same-gender photographs, whereas anger was better recognized in different-gender photographs. The findings of the present study lend support to the notion that sex differences in aggressive behavior could be related to a cognitive style characterized by hostile attributions to neutral faces in schizophrenic men.

  3. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression.

    Science.gov (United States)

    Miskowiak, K W; Glerup, L; Vestbo, C; Harmer, C J; Reinecke, A; Macoveanu, J; Siebner, H R; Kessing, L V; Vinberg, M

    2015-05-01

    Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression. Thirty healthy, never-depressed monozygotic (MZ) twins with a co-twin history of depression (high risk group: n = 13) or without co-twin history of depression (low-risk group: n = 17) were enrolled in a functional magnetic resonance imaging (fMRI) study. During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping strategies. High-risk twins showed increased neural response to happy and fearful faces in dorsal anterior cingulate cortex (ACC), dorsomedial prefrontal cortex (dmPFC), pre-supplementary motor area and occipito-parietal regions compared to low-risk twins. They also displayed stronger negative coupling between amygdala and pregenual ACC, dmPFC and temporo-parietal regions during emotional face processing. These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attention vigilance for fearful faces and were slower at recognizing facial expressions relative to low-risk controls. These effects occurred in the absence of differences between groups in mood, subjective state or coping. Different neural response and functional connectivity within fronto-limbic and occipito-parietal regions during emotional face processing and enhanced fear vigilance may be key endophenotypes for depression.

  4. Initial Orientation of Attention towards Emotional Faces in Children with Attention Deficit Hyperactivity Disorder

    Directory of Open Access Journals (Sweden)

    Mehrnoosh Ahmadi

    2011-09-01

    Full Text Available Objective: Early recognition of negative emotions is considered to be of vital importance. It seems that children with attention deficit hyperactivity disorder have some difficulties recognizing facial emotional expressions, especially negative ones. This study investigated the preference of children with attention deficit hyperactivity disorder for negative (angry, sad) facial expressions compared to normal children. Method: Participants were 35 drug-naive boys with ADHD, aged between 6 and 11 years, and 31 matched healthy children. Visual orientation data were recorded while participants viewed face pairs (negative-neutral pairs) shown for 3000 ms. The number of first fixations made to each expression was considered as an index of initial orientation. Results: Group comparisons revealed no difference between the attention deficit hyperactivity disorder group and their matched healthy counterparts in initial orientation of attention. A tendency towards negative emotions was found within the normal group, while no difference was observed between initial allocation of attention toward negative and neutral expressions in children with ADHD. Conclusion: Children with attention deficit hyperactivity disorder do not have a significant preference for negative facial expressions. In contrast, normal children have a significant preference for negative facial emotions rather than neutral faces.

  5. Perceived Intensity of Emotional Point-Light Displays Is Reduced in Subjects with ASD

    Science.gov (United States)

    Krüger, Britta; Kaletsch, Morten; Pilgramm, Sebastian; Schwippert, Sven-Sören; Hennig, Jürgen; Stark, Rudolf; Lis, Stefanie; Gallhofer, Bernd; Sammer, Gebhard; Zentgraf, Karen; Munzert, Jörn

    2018-01-01

    One major characteristic of autism spectrum disorder (ASD) is problems with social interaction and communication. The present study explored ASD-related alterations in perceiving emotions expressed via body movements. 16 participants with ASD and 16 healthy controls observed video scenes of human interactions conveyed by point-light displays. They…

  6. Association of Irritability and Anxiety With the Neural Mechanisms of Implicit Face Emotion Processing in Youths With Psychopathology.

    Science.gov (United States)

    Stoddard, Joel; Tseng, Wan-Ling; Kim, Pilyoung; Chen, Gang; Yi, Jennifer; Donahue, Laura; Brotman, Melissa A; Towbin, Kenneth E; Pine, Daniel S; Leibenluft, Ellen

    2017-01-01

    Psychiatric comorbidity complicates clinical care and confounds efforts to elucidate the pathophysiology of commonly occurring symptoms in youths. To our knowledge, few studies have simultaneously assessed the effect of 2 continuously distributed traits on brain-behavior relationships in children with psychopathology. To determine shared and unique effects of 2 major dimensions of child psychopathology, irritability and anxiety, on neural responses to facial emotions during functional magnetic resonance imaging. Cross-sectional functional magnetic resonance imaging study in a large, well-characterized clinical sample at a research clinic at the National Institute of Mental Health. The referred sample included youths ages 8 to 17 years, 93 youths with anxiety, disruptive mood dysregulation, and/or attention-deficit/hyperactivity disorders and 22 healthy youths. The child's irritability and anxiety were rated by both parent and child on the Affective Reactivity Index and Screen for Child Anxiety Related Disorders, respectively. Using functional magnetic resonance imaging, neural response was measured across the brain during gender labeling of varying intensities of angry, happy, or fearful face emotions. In mixed-effects analyses, the shared and unique effects of irritability and anxiety were tested on amygdala functional connectivity and activation to face emotions. The mean (SD) age of participants was 13.2 (2.6) years; of the 115 included, 64 were male. Irritability and/or anxiety influenced amygdala connectivity to the prefrontal and temporal cortex. Specifically, irritability and anxiety jointly influenced left amygdala to left medial prefrontal cortex connectivity during face emotion viewing (F4,888 = 9.20; P differences in neural response to face emotions in several areas (F2, 888 ≥ 13.45; all P emotion dysregulation when very anxious and irritable youth process threat-related faces. Activation in the ventral visual circuitry suggests a mechanism
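    A hedged sketch of the kind of mixed-effects test described above, examining joint effects of irritability and anxiety on amygdala connectivity, is shown below. The long-format column names and input file are assumptions, not the study's actual variables or model specification.

```python
# Sketch of testing joint irritability and anxiety effects on condition-level
# amygdala connectivity with a linear mixed model. The long-format columns
# (connectivity, irritability, anxiety, emotion, subject) are hypothetical
# stand-ins for the study's measures.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("amygdala_connectivity_long.csv")   # assumed input

model = smf.mixedlm(
    "connectivity ~ irritability * anxiety * emotion",  # fixed effects
    data=df,
    groups=df["subject"],                               # random intercept per subject
).fit()
print(model.summary())
```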

  7. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions

    Science.gov (United States)

    Kujala, Miiamaaria V.; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to the conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people’s perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects’ personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans but not dogs higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expression in a similar manner, and the perception of both species is influenced by psychological factors of the evaluators. Especially empathy affects both the speed and intensity of rating dogs’ emotional facial expressions. PMID:28114335

  8. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions.

    Directory of Open Access Journals (Sweden)

    Miiamaaria V Kujala

    Full Text Available Facial expressions are important for humans in communicating emotions to the conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans but not dogs higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expression in a similar manner, and the perception of both species is influenced by psychological factors of the evaluators. Especially empathy affects both the speed and intensity of rating dogs' emotional facial expressions.

  9. Evidence for a supra-modal representation of emotion from cross-modal adaptation.

    Science.gov (United States)

    Pye, Annie; Bestelmeyer, Patricia E G

    2015-01-01

    Successful social interaction hinges on accurate perception of emotional signals. These signals are typically conveyed multi-modally by the face and voice. Previous research has demonstrated uni-modal contrastive aftereffects for emotionally expressive faces or voices. Here we were interested in whether these aftereffects transfer across modality as theoretical models predict. We show that adaptation to facial expressions elicits significant auditory aftereffects. Adaptation to angry facial expressions caused ambiguous vocal stimuli drawn from an anger-fear morphed continuum to be perceived as less angry and more fearful relative to adaptation to fearful faces. In a second experiment, we demonstrate that these aftereffects are not dependent on learned face-voice congruence, i.e. adaptation to one facial identity transferred to an unmatched voice identity. Taken together, our findings provide support for a supra-modal representation of emotion and suggest further that identity and emotion may be processed independently from one another, at least at the supra-modal level of the processing hierarchy. Copyright © 2014 Elsevier B.V. All rights reserved.
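    The aftereffect reported above can be quantified as a shift in the categorization boundary along the morph continuum. The sketch below fits a logistic psychometric function to hypothetical response proportions per adaptation condition; the data values, and the choice of a logistic rather than the authors' exact fitting procedure, are assumptions.

```python
# Sketch of quantifying a cross-modal aftereffect: fit a logistic psychometric
# function to the proportion of "fear" responses along an anger-fear voice
# morph continuum, separately after face-anger and face-fear adaptation, and
# compare the 50% points. Data values below are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

morph = np.linspace(0, 1, 7)                       # 0 = anger, 1 = fear
p_fear_after_anger = np.array([.05, .10, .30, .60, .85, .95, .98])
p_fear_after_fear  = np.array([.02, .05, .15, .40, .70, .90, .97])

(x0_anger, _), _ = curve_fit(logistic, morph, p_fear_after_anger, p0=[0.5, 10])
(x0_fear, _), _ = curve_fit(logistic, morph, p_fear_after_fear, p0=[0.5, 10])

# A positive shift means more "fear" responses after adapting to angry faces
print("aftereffect shift (morph units):", x0_fear - x0_anger)
```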

  10. Prediction of Perceived Empathy Based on Emotional Schemas and Resilience in Mothers with Physically-Disabled Children

    Directory of Open Access Journals (Sweden)

    Dana Mohammad Aminzadeh

    2017-07-01

    Conclusion: According to the results, high resiliency and positive emotional schemas, such as having superior values and validation, are predictors of perceived empathy in mothers of disabled children. This means that mothers of children with disabilities who have more resiliency, and who interpret difficult situations positively, are able to communicate more effectively with their surroundings. Perceived empathy is one of the factors that has a significant impact on the development of personal relationships between individuals and reflects a person's mental health. It can therefore be considered alongside resiliency and emotional schemas, so that therapeutic interventions are implemented with respect to these two variables.

  11. Emotional experience with dyslexia and self-esteem: the protective role of perceived family support in late adulthood.

    Science.gov (United States)

    Carawan, Lena W; Nalavany, Blace A; Jenkins, Carol

    2016-01-01

    Despite the growing body of evidence that suggests dyslexia persists through the life span, there is a dearth of research that explores the complicating factor of dyslexia in late adulthood. Based upon stress and coping theory, this study examined whether perceived family support buffers the impact of negative emotional experience with dyslexia on self-esteem. Adults aged 21 years and older with diagnosed or self-reported dyslexia were participants in a web-based survey. A total of 224 individuals completed the survey. These findings are from the 50 participants who reported being 60 years or older. Completed measures included their perception of family support, emotional experience with dyslexia, self-esteem, and demographic variables. Preliminary analysis revealed that negative emotional experience with dyslexia negatively impacts self-esteem. Hierarchical moderated regression analysis demonstrated that positive perceived family support significantly buffers, mitigates, and protects against the effects of negative emotional experiences with dyslexia on self-esteem in individuals with dyslexia in late adulthood. In this study, family support promoted self-esteem because, as a protective dynamic, it helped older adults cope with the emotional distress associated with dyslexia. Implications of these findings are discussed.

  12. Cross-modal perception (face and voice) in emotions. ERPs and behavioural measures

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2007-04-01

    Full Text Available Emotion decoding constitutes a case of multimodal processing of cues from multiple channels. Previous behavioural and neuropsychological studies indicated that, when we have to decode emotions on the basis of multiple perceptual cues, a cross-modal integration takes place. The present study investigates the simultaneous processing of emotional tone of voice and emotional facial expression by event-related potentials (ERPs), across an ample range of different emotions (happiness, sadness, fear, anger, surprise, and disgust). Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual patterns (emotional facial expressions) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Subjects (N=30) were required to process the stimuli and to indicate their comprehension (by stimpad). ERP variations and behavioural data (response times, RTs) were submitted to repeated measures analysis of variance (ANOVA). We considered two time intervals (150-250 and 250-350 ms post-stimulus) in order to explore the ERP variations. ANOVA showed two different ERP effects, a negative deflection (N2), more anterior-distributed (Fz), and a positive deflection (P2), more posterior-distributed, with different cognitive functions. In the first case, N2 may be considered a marker of the emotional content (sensitive to the type of emotion), whereas P2 may represent a cross-modal integration marker, as it varied as a function of the congruous/incongruous condition, showing a higher peak for congruous stimuli than for incongruous stimuli. Finally, an RT reduction was found for some emotion types in the congruous condition (i.e. sadness) and an inverted effect for other emotions (i.e. fear, anger, and surprise).
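    Extracting the N2 and P2 measures in the two analysis windows can be sketched as a simple mean-amplitude computation over an epochs array, as below. The sampling rate, epoch limits and array shapes are assumed for illustration; they are not taken from the study.

```python
# Sketch of extracting mean ERP amplitudes in two analysis windows
# (150-250 ms and 250-350 ms post-stimulus) from an epochs array of shape
# (n_trials, n_channels, n_times). Sampling rate and epoch limits are assumed.
import numpy as np

sfreq = 500.0                    # Hz, assumed
tmin = -0.2                      # epoch starts 200 ms before stimulus, assumed
epochs = np.random.randn(120, 32, int(1.0 * sfreq))   # placeholder data

times = tmin + np.arange(epochs.shape[-1]) / sfreq

def mean_amplitude(data, t_start, t_end):
    """Average voltage in a latency window, per trial and channel."""
    window = (times >= t_start) & (times < t_end)
    return data[:, :, window].mean(axis=-1)

n2 = mean_amplitude(epochs, 0.150, 0.250)   # candidate N2 window
p2 = mean_amplitude(epochs, 0.250, 0.350)   # candidate P2 window
print(n2.shape, p2.shape)                   # (trials, channels) each
```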

  13. The Effects of Anxiety on the Recognition of Multisensory Emotional Cues with Different Cultural Familiarity

    Directory of Open Access Journals (Sweden)

    Ai Koizumi

    2011-10-01

    Full Text Available Anxious individuals have been shown to interpret others' facial expressions negatively. However, whether this negative interpretation bias depends on the modality and familiarity of emotional cues remains largely unknown. We examined whether trait-anxiety affects recognition of multisensory emotional cues (i.e., face and voice), which were expressed by actors from either the same or different cultural background as the participants (i.e., familiar in-group and unfamiliar out-group). The dynamic face and voice cues of the same actors were synchronized, and conveyed either congruent (e.g., happy face and voice) or incongruent emotions (e.g., happy face and angry voice). Participants were to indicate the perceived emotion in one of the cues, while ignoring the other. The results showed that when recognizing emotions of in-group actors, highly anxious individuals, compared with low anxious ones, were more likely to interpret others' emotions in a negative manner, putting more weight on the to-be-ignored angry cues. This interpretation bias was found regardless of the cue modality. However, when recognizing emotions of out-group actors, low and high anxious individuals showed no difference in the interpretation of emotions irrespective of modality. These results suggest that trait-anxiety affects recognition of emotional expressions in a modality-independent yet cultural-familiarity-dependent manner.

  14. Capacity limitations to extract the mean emotion from multiple facial expressions depend on emotion variance.

    Science.gov (United States)

    Ji, Luyan; Pourtois, Gilles

    2018-04-20

    We examined the processing capacity and the role of emotion variance in ensemble representation for multiple facial expressions shown concurrently. A standard set size manipulation was used, whereby the sets consisted of 4, 8, or 16 morphed faces each uniquely varying along a happy-angry continuum (Experiment 1) or a neutral-happy/angry continuum (Experiments 2 & 3). Across the three experiments, we reduced the amount of emotion variance in the sets to explore the boundaries of this process. Participants judged the perceived average emotion from each set on a continuous scale. We computed and compared objective and subjective difference scores, using the morph units and post-experiment ratings, respectively. Results of the subjective scores were more consistent than the objective ones across the first two experiments where the variance was relatively large, and revealed each time that increasing set size led to a poorer averaging ability, suggesting capacity limitations in establishing ensemble representations for multiple facial expressions. However, when the emotion variance in the sets was reduced in Experiment 3, both subjective and objective scores remained unaffected by set size, suggesting that the emotion averaging process was unlimited in these conditions. Collectively, these results suggest that extracting mean emotion from a set composed of multiple faces depends on both structural (attentional) and stimulus-related effects. Copyright © 2018 Elsevier Ltd. All rights reserved.
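    The objective difference score described above can be illustrated with a small sketch: each face has a position in morph units, and the objective error is the distance between the participant's continuous judgment and the set's true mean. All values below are invented.

```python
# Sketch of the objective scoring idea: each face in a set has a position in
# morph units along the emotion continuum; the objective error is the
# difference between the participant's continuous judgment and the set's
# true mean. Values are invented; variance is manipulated by scaling.
import numpy as np

rng = np.random.default_rng(42)

def make_set(set_size, variance):
    """Draw morph units for one trial's faces around a random set mean."""
    centre = rng.uniform(30, 70)
    return np.clip(centre + rng.normal(scale=variance, size=set_size), 0, 100)

trial_faces = make_set(set_size=8, variance=15.0)   # high-variance condition
true_mean = trial_faces.mean()
response = 55.0                                     # hypothetical slider response

objective_error = abs(response - true_mean)
print(f"set mean = {true_mean:.1f}, objective error = {objective_error:.1f}")
```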

  15. Linear Representation of Emotions in Whole Persons by Combining Facial and Bodily Expressions in the Extrastriate Body Area

    Science.gov (United States)

    Yang, Xiaoli; Xu, Junhai; Cao, Linjing; Li, Xianglin; Wang, Peiyuan; Wang, Bin; Liu, Baolin

    2018-01-01

    Our human brain can rapidly and effortlessly perceive a person’s emotional state by integrating the isolated emotional faces and bodies into a whole. Behavioral studies have suggested that the human brain encodes whole persons in a holistic rather than part-based manner. Neuroimaging studies have also shown that body-selective areas prefer whole persons to the sum of their parts. The body-selective areas played a crucial role in representing the relationships between emotions expressed by different parts. However, it remains unclear in which regions the perception of whole persons is represented by a combination of faces and bodies, and to what extent the combination can be influenced by the whole person’s emotions. In the present study, functional magnetic resonance imaging data were collected when participants performed an emotion distinction task. Multi-voxel pattern analysis was conducted to examine how the whole person-evoked responses were associated with the face- and body-evoked responses in several specific brain areas. We found that in the extrastriate body area (EBA), the whole person patterns were most closely correlated with weighted sums of face and body patterns, using different weights for happy expressions but equal weights for angry and fearful ones. These results were unique for the EBA. Our findings tentatively support the idea that the whole person patterns are represented in a part-based manner in the EBA, and modulated by emotions. These data will further our understanding of the neural mechanism underlying perceiving emotional persons. PMID:29375348
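    The weighted-sum idea can be sketched as a small least-squares problem: estimate the face and body weights that best reconstruct the whole-person pattern in a region, then assess the fit. The voxel patterns below are placeholders, not data from the study.

```python
# Sketch of the weighted-sum idea: for one ROI and one emotion, find weights
# (w_face, w_body) such that w_face*face_pattern + w_body*body_pattern best
# approximates the whole-person pattern, then check the quality of the fit.
# Patterns below are placeholder voxel vectors, not real data.
import numpy as np

rng = np.random.default_rng(7)
n_voxels = 500
face = rng.normal(size=n_voxels)
body = rng.normal(size=n_voxels)
whole = 0.5 * face + 0.5 * body + rng.normal(scale=0.3, size=n_voxels)

# Least-squares solution for the two weights
X = np.column_stack([face, body])
weights, *_ = np.linalg.lstsq(X, whole, rcond=None)
fit = np.corrcoef(X @ weights, whole)[0, 1]

print("estimated weights:", np.round(weights, 2), "fit r =", round(fit, 2))
```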

  16. Perceived social pressure not to experience negative emotion is linked to selective attention for negative information.

    Science.gov (United States)

    Bastian, Brock; Pe, Madeline Lee; Kuppens, Peter

    2017-02-01

    Social norms and values may be important predictors of how people engage with and regulate their negative emotional experiences. Previous research has shown that social expectancies (the perceived social pressure not to feel negative emotion (NE)) exacerbate feelings of sadness. In the current research, we examined whether social expectancies may be linked to how people process emotional information. Using a modified classical flanker task involving emotional rather than non-emotional stimuli, we found that, for those who experienced low levels of NE, social expectancies were linked to the selective avoidance of negative emotional information. Those who experienced high levels of NE did not show a selective avoidance of negative emotional information. The findings suggest that, for people who experience many NEs, social expectancies may lead to discrepancies between how they think they ought to feel and the kind of emotional information they pay attention to.

  17. Psychopathic traits are associated with reduced attention to the eyes of emotional faces among adult male non-offenders

    Directory of Open Access Journals (Sweden)

    Steven Mark Gillespie

    2015-10-01

    Full Text Available Psychopathic traits are linked with impairments in emotional facial expression recognition. These impairments may, in part, reflect reduced attention to the eyes of emotional faces. Although reduced attention to the eyes has been noted among children with conduct problems and callous-unemotional traits, similar findings are yet to be found in relation to psychopathic traits among adult male participants. Here we investigated the relationship of primary (selfish, uncaring) and secondary (impulsive, antisocial) psychopathic traits with attention to the eyes among adult male non-offenders during an emotion recognition task. We measured the number of fixations, and overall dwell time, on the eyes and the mouth of male and female faces showing the six basic emotions at varying levels of intensity. We found no relationship of primary or secondary psychopathic traits with recognition accuracy. However, primary psychopathic traits were associated with a reduced number of fixations, and lower overall dwell time, on the eyes relative to the mouth across expressions, intensity, and sex. Furthermore, the relationship of primary psychopathic traits with attention to the eyes of angry and fearful faces was influenced by the sex and intensity of the expression. We also showed that a greater number of fixations on the eyes, relative to the mouth, was associated with increased accuracy for angry and fearful expression recognition. These results are the first to show effects of psychopathic traits on attention to the eyes of emotional faces in an adult male sample, and may support amygdala-based accounts of psychopathy. These findings may also have methodological implications for clinical studies of emotion recognition.
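    Deriving the fixation counts and dwell times used above from raw eye-tracking output can be sketched as a simple area-of-interest (AOI) assignment. The column names and AOI rectangles below are assumed examples, not the study's actual stimulus coordinates.

```python
# Sketch of deriving per-trial fixation counts and dwell time for eye vs.
# mouth areas of interest (AOIs) from an eye-tracker fixation table. Column
# names (x, y, duration_ms, trial) and AOI rectangles are assumed examples.
import pandas as pd

fix = pd.read_csv("fixations.csv")          # assumed fixation report

AOIS = {                                     # (x_min, x_max, y_min, y_max) in px
    "eyes":  (300, 700, 200, 350),
    "mouth": (380, 620, 550, 680),
}

def label_aoi(row):
    for name, (x0, x1, y0, y1) in AOIS.items():
        if x0 <= row.x <= x1 and y0 <= row.y <= y1:
            return name
    return "other"

fix["aoi"] = fix.apply(label_aoi, axis=1)

per_trial = fix.groupby(["trial", "aoi"]).agg(
    n_fixations=("duration_ms", "size"),
    dwell_ms=("duration_ms", "sum"),
).reset_index()
print(per_trial.head())
```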

  18. Emotional labour and work engagement among nurses: examining perceived compassion, leadership and work ethic as stress buffers.

    Science.gov (United States)

    Mauno, Saija; Ruokolainen, Mervi; Kinnunen, Ulla; De Bloom, Jessica

    2016-05-01

    The study examined whether three resources, that is, compassion, transformational leadership and work ethic feasibility, buffer against the negative effects of emotional labour on work engagement. Emotional labour is a common job stressor among nurses, but little is known about whether certain personal and work resources buffer against it in relation to work engagement. Revealing buffers of emotional labour would help organizations to design tailored interventions. Cross-sectional online survey conducted in 2014. Participants were 3466 Finnish nurses. Hypotheses were tested via hierarchical moderated regression analyses. Higher emotional labour related to lower engagement. Two interaction effects were found. First, work ethic feasibility buffered against emotional labour: the nurses who perceived work ethic feasibility as high in a situation of high emotional labour, scored higher on engagement compared with those nurses who in this stress situation perceived work ethic feasibility to be low. Second, high compassion was detrimental to engagement in the presence of high emotional labour. Transformational leadership did not act as a buffer but showed a positive relationship with engagement. Work ethic feasibility (being able to work according to high ethical standards) is an important resource in nursing as it protects an employee against the negative effects of emotional labour and as it also directly promotes engagement. However, compassion may not always be beneficial in nursing, especially if co-occurring with high job stress. Transformational leadership has potential to improve engagement in nursing although it may not operate as a stress buffer. © 2016 John Wiley & Sons Ltd.

  19. The Effect of Perceived Abusive Management of Workers to Their Performance and the Role of Emotional Intelligence on This Effect

    Directory of Open Access Journals (Sweden)

    Mahmut AKIN

    2014-06-01

    Full Text Available This research evaluated the effect of workers' perceived abusive management on their performance and the role of emotional intelligence in this effect. The study was conducted with 253 civil servants in Kırşehir. Worker performance was treated as two-dimensional, comprising task performance and contextual performance. According to the results, emotional intelligence diminished or eliminated the effects of abusive management on worker performance. Additionally, emotional intelligence was positively related to task and contextual performance, whereas abusive management was negatively related to both. Task and contextual performance were positively related to each other, and emotional intelligence was negatively related to perceived abusive management.

  20. Visual and associated affective processing of face information in schizophrenia: A selective review.

    Science.gov (United States)

    Chen, Yue; Ekstrom, Tor

    Perception of facial features is crucial in social life. In past decades, extensive research showed that the ability to perceive facial emotion expression was compromised in schizophrenia patients. Given that face perception involves visual/cognitive and affective processing, the roles of these two processing domains in the compromised face perception in schizophrenia were studied and discussed, but not clearly defined. One particular issue was whether face-specific processing is implicated in this psychiatric disorder. Recent investigations have probed into the components of face perception processes such as visual detection, identity recognition, emotion expression discrimination and working memory conveyed from faces. Recent investigations have further assessed the associations between face processing and basic visual processing and between face processing and social cognitive processing such as Theory of Mind. In this selective review, we discuss the investigative findings relevant to the issues of cognitive and affective association and face-specific processing. We highlight the implications of multiple processing domains and face-specific processes as potential mechanisms underlying compromised face perception in schizophrenia. These findings suggest a need for a domain-specific therapeutic approach to the improvement of face perception in schizophrenia.

  1. Victimisation through bullying and cyberbullying: Emotional intelligence, severity of victimisation and technology use in different types of victims.

    Science.gov (United States)

    Beltrán-Catalán, María; Zych, Izabela; Ortega-Ruiz, Rosario; Llorent, Vicente J

    2018-05-01

    Bullying and cyberbullying are global public health problems. However, very few studies have described the prevalence, similarities and differences among face-to-face victims, cybervictims and students who are victimised through both bullying and cyberbullying. This study was conducted to describe these different patterns of victimisation and the severity of victimisation, emotional intelligence and technology use in different types of victims. A total of 2,139 secondary school students from 22 schools, randomly selected from all provinces of Andalusia, Spain, participated in this study. Information about bullying, cyberbullying, social networking site use and perceived emotional intelligence was collected. Face-to-face victimisation only is the most common type of victimisation, followed by mixed victimisation. Cybervictimisation only is rare. Mixed victims score higher in severity of bullying and present higher emotional attention than face-to-face victims. Most victims of cyberbullying are also face-to-face victims. A holistic approach that focuses on different problems at the same time seems to be needed to tackle these behaviours.

  2. Memory for faces with emotional expressions in Alzheimer's disease and healthy older participants: positivity effect is not only due to familiarity.

    Science.gov (United States)

    Sava, Alina-Alexandra; Krolak-Salmon, Pierre; Delphin-Combe, Floriane; Cloarec, Morgane; Chainay, Hanna

    2017-01-01

    Young individuals are better at memorizing faces initially seen with emotional rather than neutral expressions. Healthy older participants and Alzheimer's disease (AD) patients show better memory for faces with positive expressions. The socioemotional selectivity theory postulates that this positivity effect in memory reflects a general age-related preference for positive stimuli, subserving emotion regulation. Another explanation might be that older participants use compensatory strategies, often considering happy faces as previously seen. The question of whether this effect exists in tasks that do not permit such compensatory strategies is still open. Thus, we compared the performance of healthy participants and AD patients for positive, neutral, and negative faces in such tasks. Healthy older participants and AD patients showed a positivity effect in memory, but there was no difference between emotional and neutral faces in young participants. Our results suggest that the positivity effect in memory is not entirely due to the sense of familiarity for smiling faces.

  3. How group-based emotions are shaped by collective emotions: evidence for emotional transfer and emotional burden.

    Science.gov (United States)

    Goldenberg, Amit; Saguy, Tamar; Halperin, Eran

    2014-10-01

    Extensive research has established the pivotal role that group-based emotions play in shaping intergroup processes. The underlying implicit assumption in previous work has been that these emotions reflect what the rest of the group feels (i.e., collective emotions). However, one can experience an emotion in the name of her or his group, which is inconsistent with what the collective feels. The current research investigated this phenomenon of emotional nonconformity. Particularly, we proposed that when a certain emotional reaction is perceived as appropriate, but the collective is perceived as not experiencing this emotion, people would experience stronger levels of group-based emotion, placing their emotional experience farther away from that of the collective. We provided evidence for this process across 2 different emotions: group-based guilt and group-based anger (Studies 1 and 2) and across different intergroup contexts (Israeli-Palestinian relations in Israel, and Black-White relations in the United States). In Studies 3 and 4, we demonstrate that this process is moderated by the perceived appropriateness of the collective emotional response. Studies 4 and 5 further provided evidence for the mechanisms underlying this effect, pointing to a process of emotional burden (i.e., feeling responsible for carrying the emotion in the name of the group) and of emotional transfer (i.e., transferring negative feelings one has toward the ingroup, toward the event itself). This work brings to light processes that were yet to be studied regarding the relationship between group members, their perception of their group, and the emotional processes that connect them. 2014 APA, all rights reserved

  4. The part-whole perception of emotion.

    Science.gov (United States)

    Glazer, Trip

    2018-02-01

    A clever argument purports to show that we can directly perceive the emotions of others: (1) some emotional expressions are parts of the emotions they express; (2) in perceiving a part of something, one can perceive the whole; (3) therefore, in perceiving some emotional expressions, one can perceive the emotions they express. My aim in this paper is to assess the extent to which contemporary theories of emotion support the first premise of this argument. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Gender differences in human single neuron responses to male emotional faces.

    Science.gov (United States)

    Newhoff, Morgan; Treiman, David M; Smith, Kris A; Steinmetz, Peter N

    2015-01-01

    Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 (6 male) epileptic patients in four brain areas: amygdala (236 neurons), hippocampus (n = 270), anterior cingulate cortex (n = 256), and ventromedial prefrontal cortex (n = 174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n = 15∕66) of neurons in men were significantly affected by facial emotion, vs. 8% (n = 6∕76) of neurons in women. A Fisher's exact test comparing the two ratios found a highly significant difference between the two (p differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.
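    The proportion comparison reported in the abstract (15/66 emotion-modulated left-amygdala neurons in men versus 6/76 in women) can be reproduced from those counts with Fisher's exact test, as sketched below.

```python
# Reproducing the reported proportion comparison (15/66 emotion-modulated
# left-amygdala neurons in men vs. 6/76 in women) with Fisher's exact test.
from scipy.stats import fisher_exact

table = [[15, 66 - 15],    # men:   modulated, not modulated
         [6, 76 - 6]]      # women: modulated, not modulated

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```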

  6. On the link between perceived parental rearing behaviors and self-conscious emotions in adolescents

    NARCIS (Netherlands)

    Meesters, C.M.G.; Muris, P.E.H.M.; Dibbets, P.; Cima, M.; Lemmens, L.H.J.M.

    2017-01-01

    This study examined relationships between the self-conscious emotions of guilt and shame in both clinical (N = 104) and non-clinical (N = 477) (young) adolescents aged 11-18 years, who completed a questionnaire to assess perceived parental rearing behaviors (EMBU-C) and a scenario-based instrument

  7. Improved emotional conflict control triggered by the processing priority of negative emotion.

    Science.gov (United States)

    Yang, Qian; Wang, Xiangpeng; Yin, Shouhang; Zhao, Xiaoyue; Tan, Jinfeng; Chen, Antao

    2016-04-18

    The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated the effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas involved in emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas were negatively coupled with the dorsolateral prefrontal cortex (DLPFC). However, a significant emotional conflict was observed in the positive face context; this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN), and here the DLPFC was negatively coupled mainly with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operates differently for negative and positive faces: it is implemented more efficiently in the negative face condition, whereas it is more devoted to inhibiting internal interference in the positive face condition. This study thus provides a plausible mechanism of emotional conflict resolution in which the rapid pathway for negative emotion processing efficiently triggers control mechanisms to preventively resolve emotional conflict.

  8. Expectations of emotions during testimony: the role of communicator and perceiver characteristics.

    Science.gov (United States)

    Bederian-Gardner, Daniel; Goldfarb, Deborah

    2014-01-01

    This study investigated the influence of communicator (child victim) and perceiver (adult participant) characteristics on expectations about witnesses' emotional displays during testimony. In total, 191 adults were asked whether or not they expected child victims who were testifying about sexual abuse to display sadness, fear, anger, disgust, happiness, or a neutral demeanor, and how intensely the adults expected each emotion to be displayed. In describing the victims, child age (5 vs. 13 years old) and child gender (female vs. male) were factorially combined as within-subject factors. Results included that victim gender predicted expectations of fear, and victim age predicted expectations of anger and disgust. There was a significant interaction of victim age and victim gender for expectations of sadness. Of participants who expected multiple emotions, a combination of negative and neutral emotions was expected more from 13-year-old female victims than from 5-year-old female victims. Child victim empathy predicted ratings of how intensely sad and fearful the child victim would look. Implications of these findings for psychological research and the legal system are discussed. Copyright © 2014 John Wiley & Sons, Ltd.

  9. Apparent height and body mass index influence perceived leadership ability in three-dimensional faces.

    Science.gov (United States)

    Re, Daniel E; Dzhelyova, Milena; Holzleitner, Iris J; Tigue, Cara C; Feinberg, David R; Perrett, David I

    2012-01-01

    Facial appearance has a well-documented effect on perceived leadership ability. Face judgments of leadership ability predict political election outcomes across the world, and similar judgments of business CEOs predict company profits. Body height is also associated with leadership ability, with taller people attaining positions of leadership more than their shorter counterparts in both politics and in the corporate world. Previous studies have found some face characteristics that are associated with leadership judgments, however there have been no studies with three-dimensional faces. We assessed which facial characteristics drive leadership judgments in three-dimensional faces. We found a perceptual relationship between height and leadership ability. We also found that facial maturity correlated with leadership judgments, and that faces of people with an unhealthily high body mass index received lower leadership ratings. We conclude that face attributes associated with body size and maturity alter leadership perception, and may influence real-world democratic leadership selection.

  10. Early (N170) activation of face-specific cortex by face-like objects

    OpenAIRE

    Hadjikhani, Nouchine; Kveraga, Kestutis; Naik, Paulami; Ahlfors, Seppo P.

    2009-01-01

    The tendency to perceive faces in random patterns exhibiting configural properties of faces is an example of pareidolia. Perception of ‘real’ faces has been associated with a cortical response signal arising at about 170ms after stimulus onset; but what happens when non-face objects are perceived as faces? Using magnetoencephalography (MEG), we found that objects incidentally perceived as faces evoked an early (165ms) activation in the ventral fusiform cortex, at a time and location similar t...

  11. Perceived Academic Control and Academic Emotions Predict Undergraduate University Student Success: Examining Effects on Dropout Intention and Achievement.

    Science.gov (United States)

    Respondek, Lisa; Seufert, Tina; Stupnisky, Robert; Nett, Ulrike E

    2017-01-01

    The present study addressed concerns over the high risk of university students' academic failure. It examined how perceived academic control and academic emotions predict undergraduate students' academic success, conceptualized as both low dropout intention and high achievement (indicated by GPA). A cross-sectional survey was administered to 883 undergraduate students across all disciplines of a German STEM orientated university. The study additionally compared freshman students ( N = 597) vs. second-year students ( N = 286). Using structural equation modeling, for the overall sample of undergraduate students we found that perceived academic control positively predicted enjoyment and achievement, as well as negatively predicted boredom and anxiety. The prediction of dropout intention by perceived academic control was fully mediated via anxiety. When taking perceived academic control into account, we found no specific impact of enjoyment or boredom on the intention to dropout and no specific impact of all three academic emotions on achievement. The multi-group analysis showed, however, that perceived academic control, enjoyment, and boredom among second-year students had a direct relationship with dropout intention. A major contribution of the present study was demonstrating the important roles of perceived academic control and anxiety in undergraduate students' academic success. Concerning corresponding institutional support and future research, the results suggested distinguishing incoming from advanced undergraduate students.

  12. Configuration perception and face memory, and face context effects in developmental prosopagnosia.

    Science.gov (United States)

    Huis in 't Veld, Elisabeth; Van den Stock, Jan; de Gelder, Beatrice

    2012-01-01

    This study addresses two central and controversial issues in developmental prosopagnosia (DP), configuration- versus feature-based face processing and the influence of affective information from either facial or bodily expressions on face recognition. A sample of 10 DPs and 10 controls were tested with a previously developed face and object recognition and memory battery (Facial Expressive Action Stimulus Test, FEAST), a task measuring the influence of emotional faces and bodies on face identity matching (Face-Body Compound task), and an emotionally expressive face memory task (Emotional Face Memory task, FaMe-E). We show that DPs were impaired in upright, but not inverted, face matching but they performed at the level of controls on part-to-whole matching. Second, DPs showed impaired memory for both neutral and emotional faces and scored within the normal range on the Face-Body Compound task. Third, configural perception but not feature-based processing was significantly associated with memory performance. Taken together the results indicate that DPs have a deficit in configural processing at the perception stage that may underlie the memory impairment.

  13. Individualism, Collectivism, and Client Expression of Different Emotions: Their Relations to Perceived Counselor Effectiveness

    Science.gov (United States)

    Seo, Young Seok

    2011-01-01

    This study examined how individualism, collectivism, and counselor emphasis of different client emotions were related to perceived counselor effectiveness. Data were collected from 192 (122 women and 70 men) Korean students attending a large university in South Korea and from 170 (115 women and 55 men) American students attending a large…

  14. On the link between perceived parental rearing behaviors and self-conscious emotions in adolescents

    NARCIS (Netherlands)

    Meesters, Cor; Muris, Peter; Dibbets, Pauline; Cima, Maaike; Lemmens, Lotte H.J.M.

    2017-01-01

    This study examined relationships between the self-conscious emotions of guilt and shame in both clinical (N = 104) and non-clinical (N = 477) (young) adolescents aged 11–18 years, who completed a questionnaire to assess perceived parental rearing behaviors (EMBU-C) and a scenario-based instrument

  15. Emotional Autonomy and Perceived Parenting Styles: Relational Analysis in the Hong Kong Cultural Context

    Science.gov (United States)

    Chan, Kwok Wai; Chan, Siu Mui

    2009-01-01

    Three hundred and seven teacher education students of a Hong Kong university were administered two questionnaires, one measuring emotional autonomy (EAS) and the other measuring perceived parenting styles (PAQ) of their parents. It was found that the Hong Kong teacher education students tended to be autonomous and they characterized their parents…

  16. Effects of news frames on perceived risk, emotions, and learning.

    Directory of Open Access Journals (Sweden)

    Christine Otieno

    Full Text Available The media play a key role in forming opinions by influencing people's understanding and perception of a topic. People gather information about topics of interest from the internet and print media, which employ various news frames to attract attention. One example of a common news frame is the human-interest frame, which emotionalizes and dramatizes information and often accentuates individual affectedness. Our study investigated effects of human-interest frames compared to a neutral-text condition with respect to perceived risk, emotions, and knowledge acquisition, and tested whether these effects can be "generalized" to common variants of the human-interest frame. Ninety-one participants read either one variant of the human-interest frame or a neutrally formulated version of a newspaper article describing the effects of invasive species in general and the Asian ladybug (an invasive species) in particular. The framing was achieved by varying the opening and concluding paragraphs (about invasive species), as well as the headline. The core text (about the Asian ladybug) was the same across all conditions. All outcome variables on framing effects referred to this common core text. We found that all versions of the human-interest frame increased perceived risk and the strength of negative emotions compared to the neutral text. Furthermore, participants in the human-interest frame condition displayed better (quantitative) learning outcomes but also biased knowledge, highlighting a potential dilemma: Human-interest frames may increase learning, but they also lead to a rather unbalanced view of the given topic on a "deeper level".

  17. Effects of News Frames on Perceived Risk, Emotions, and Learning

    Science.gov (United States)

    Otieno, Christine; Spada, Hans; Renkl, Alexander

    2013-01-01

    The media play a key role in forming opinions by influencing people's understanding and perception of a topic. People gather information about topics of interest from the internet and print media, which employ various news frames to attract attention. One example of a common news frame is the human-interest frame, which emotionalizes and dramatizes information and often accentuates individual affectedness. Our study investigated effects of human-interest frames compared to a neutral-text condition with respect to perceived risk, emotions, and knowledge acquisition, and tested whether these effects can be "generalized" to common variants of the human-interest frame. Ninety-one participants read either one variant of the human-interest frame or a neutrally formulated version of a newspaper article describing the effects of invasive species in general and the Asian ladybug (an invasive species) in particular. The framing was achieved by varying the opening and concluding paragraphs (about invasive species), as well as the headline. The core text (about the Asian ladybug) was the same across all conditions. All outcome variables on framing effects referred to this common core text. We found that all versions of the human-interest frame increased perceived risk and the strength of negative emotions compared to the neutral text. Furthermore, participants in the human-interest frame condition displayed better (quantitative) learning outcomes but also biased knowledge, highlighting a potential dilemma: Human-interest frames may increase learning, but they also lead to a rather unbalanced view of the given topic on a “deeper level”. PMID:24223999

  18. Single trial classification for the categories of perceived emotional facial expressions: an event-related fMRI study

    Science.gov (United States)

    Song, Sutao; Huang, Yuxia; Long, Zhiying; Zhang, Jiacai; Chen, Gongxiang; Wang, Shuqing

    2016-03-01

    Recently, several studies have successfully applied multivariate pattern analysis methods to predict the categories of emotions. These studies have mainly focused on self-experienced emotions, such as the emotional states elicited by music or movies. In fact, most of our social interactions involve perception of emotional information from the expressions of other people, and it is an important basic skill for humans to recognize the emotional facial expressions of other people in a short time. In this study, we aimed to determine the discriminability of perceived emotional facial expressions. In a rapid event-related fMRI design, subjects were instructed to classify four categories of facial expressions (happy, disgust, angry and neutral) by pressing different buttons, and each facial expression stimulus lasted for 2 s. All participants performed 5 fMRI runs. One multivariate pattern analysis method, the support vector machine, was trained to predict the categories of facial expressions. For feature selection, ninety masks defined from the anatomical automatic labeling (AAL) atlas were first generated and each was treated as the input to the classifier; then, the most stable AAL areas were selected according to prediction accuracies and comprised the final feature sets. Results showed that, for the 6 pair-wise classification conditions, accuracy, sensitivity and specificity were all above chance prediction, among which happy vs. neutral and angry vs. disgust achieved the lowest results. These results suggest that specific neural signatures of perceived emotional facial expressions may exist, and that happy vs. neutral and angry vs. disgust might be more similar in information representation in the brain.
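
    A minimal sketch of the kind of pair-wise decoding pipeline this record describes, assuming trial-wise activity patterns have already been extracted for one AAL mask; the array names, the linear-kernel SVM, and the leave-one-run-out scheme are illustrative assumptions rather than the authors' exact implementation.

```python
# Hypothetical sketch of the pair-wise SVM decoding described above. Assumes
# `patterns` is an (n_trials, n_voxels) array of fMRI activity for one AAL
# mask, `labels` holds the expression category of each trial, and `runs`
# marks which of the 5 fMRI runs each trial came from.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def pairwise_accuracy(patterns, labels, runs, cond_a, cond_b):
    """Leave-one-run-out SVM accuracy for one pair of expression categories."""
    keep = np.isin(labels, [cond_a, cond_b])
    X, y, groups = patterns[keep], labels[keep], runs[keep]
    clf = SVC(kernel="linear", C=1.0)
    scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
    return scores.mean()

# e.g. accuracy for the 'happy' vs. 'neutral' contrast within one mask:
# acc = pairwise_accuracy(patterns, labels, runs, "happy", "neutral")
```

    Looping such a call over the ninety masks and the six category pairs would yield the per-mask accuracy table from which the most stable regions could then be selected.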

  19. Neural Correlates of Task-Irrelevant First and Second Language Emotion Words — Evidence from the Face-Word Stroop Task

    Directory of Open Access Journals (Sweden)

    Lin Fan

    2016-11-01

    Full Text Available Emotionally valenced words have thus far not been empirically examined in a bilingual population with the emotional face-word Stroop paradigm. Chinese-English bilinguals were asked to identify the facial expressions of emotion with their first (L1) or second (L2) language task-irrelevant emotion words superimposed on the face pictures. We attempted to examine how the emotional content of words modulates behavioral performance and cerebral functioning in the bilinguals' two languages. The results indicated that there were significant congruency effects for both L1 and L2 emotion words, and that identifiable differences in the magnitude of the Stroop effect between the two languages were also observed, suggesting that L1 is more capable of activating the emotional response to word stimuli. For the event-related potentials (ERPs) data, an N350-550 effect was observed only in the L1 task, with greater negativity for incongruent than congruent trials. The size of the N350-550 effect differed across languages, whereas no identifiable language distinction was observed in the effect of the conflict slow potential (conflict SP). Finally, a more pronounced negative amplitude at 230-330 ms was observed in L1 than in L2, but only for incongruent trials. This negativity, likened to an orthographic decoding N250, may reflect the extent of attention to emotion word processing at the word-form level, while the N350-550 reflects a complicated set of processes in conflict processing. Overall, the face-word congruency effect reflected an identifiable language distinction at 230-330 and 350-550 ms, which provides supporting evidence for theoretical proposals assuming attenuated emotionality of L2 processing.

  20. Predicting the Filial Behaviors of Chinese-Malaysian Adolescents from Perceived Parental Investments, Filial Emotions, and Parental Warmth and Support

    Science.gov (United States)

    Cheah, Charissa S. L.; Ozdemir, Sevgi Bayram; Leung, Christy Y. Y.

    2012-01-01

    The present study examined the mediating role of perceived parental warmth and support in predicting Chinese Malaysian adolescents' filial behaviors from their age, perceived parental investments, and positive filial emotions toward their parents. The effects of these predictors were examined separately for mothers and fathers. Participants…

  1. Emotional Intelligence in medical practice

    Directory of Open Access Journals (Sweden)

    Abu Hasan Sarkar

    2016-08-01

    Full Text Available Emotional Intelligence is the ability to perceive, express, understand and regulate one’s inner emotions and the emotions of others. It is considered to be a ‘must have’ competence in the workplace. Several scientific studies have proven that the application of emotional intelligence is effective in improving the teaching-learning process and that it leads to organizational growth; however, only limited work has been carried out to assess its effectiveness in the practice of medicine, especially in India. Various scales have been developed to measure emotional intelligence but they are not universally applicable because emotional intelligence depends upon culture and personal background among other factors. In recent years in India, conflicts between patients and doctors have had serious, sometimes fatal, consequences for the physician. Behavior, when faced with a potential conflict-like situation, depends to a great extent on the emotional intelligence of the physician. Emotional intelligence of medical students and medical professionals can be honed through exposure to the medical humanities which are known to promote patient-centered care. Building better physician-patient relationships might help in averting doctor-patient conflict.

  2. Neural Substrates of Social Emotion Regulation: A fMRI Study on Imitation and Expressive Suppression to Dynamic Facial Signals

    Directory of Open Access Journals (Sweden)

    Pascal eVrticka

    2013-02-01

    Full Text Available Emotion regulation is crucial for successfully engaging in social interactions. Yet, little is known about the neural mechanisms controlling behavioral responses to emotional expressions perceived in the face of other people, which constitute a key element of interpersonal communication. Here, we investigated brain systems involved in social emotion perception and regulation, using functional magnetic resonance imaging (fMRI) in 20 healthy participants who saw dynamic facial expressions of either happiness or sadness, and were asked to either imitate the expression or to suppress any expression on their own face (in addition to a gender judgment control task). fMRI results revealed higher activity in regions associated with emotion (e.g., the insula), motor function (e.g., motor cortex), and theory of mind during imitation. Activity in dorsal cingulate cortex was also increased during imitation, possibly reflecting greater action monitoring or conflict with own feeling states. In addition, premotor regions were more strongly activated during both imitation and suppression, suggesting a recruitment of motor control for both the production and inhibition of emotion expressions. Expressive suppression produced increases in dorsolateral and lateral prefrontal cortex typically related to cognitive control. These results suggest that voluntary imitation and expressive suppression modulate brain responses to emotional signals perceived from faces, by up- and down-regulating activity in distributed subcortical and cortical networks that are particularly involved in emotion, action monitoring, and cognitive control.

  3. The ties to unbind: Age-related differences in feature (un)binding in working memory for emotional faces

    Directory of Open Access Journals (Sweden)

    Didem ePehlivanoglu

    2014-04-01

    Full Text Available In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyperbinding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face's emotional expression (expression condition), or both identity and expression of the face (binding condition). Both age groups performed more slowly and with lower accuracy in the expression condition than in the binding condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory, over and beyond age-related differences observed in perceptual processing (0-Back) and attention/short-term memory (1-Back). Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task, and were further magnified in the 2-Back task, indicating independent age-related differences in attention/short-term memory and working memory. Pupil dilation data confirmed that the attention/short-term memory version of the task (1-Back) is more effortful for older adults than younger adults.

  4. PERVALE-S: a new cognitive task to assess deaf people’s ability to perceive basic and social emotions

    Science.gov (United States)

    Mestre, José M.; Larrán, Cristina; Herrero, Joaquín; Guil, Rocío; de la Torre, Gabriel G.

    2015-01-01

    A poorly understood aspect of deaf people (DP) is how their emotional information is processed. Verbal ability is key to improving emotional knowledge in people. Nevertheless, DP are unable to distinguish intonation, intensity, and the rhythm of language due to lack of hearing. Some DP have acquired both lip-reading abilities and sign language, but others have developed only sign language. PERVALE-S was developed to assess the ability of DP to perceive both social and basic emotions. PERVALE-S presents different sets of visual images of a real deaf person expressing both basic and social emotions, according to the normative standard of emotional expressions in Spanish Sign Language. Emotional expression stimuli were presented at two different levels of intensity (1: low; and 2: high) because DP do not distinguish an object in the same way as hearing people (HP) do. Then, participants had to click on the more suitable emotional expression. PERVALE-S contains video instructions (given by a sign language interpreter) to improve DP's understanding of how to use the software. DP had to watch the videos before answering the items. To test PERVALE-S, a sample of 56 individuals was recruited (18 signers, 8 lip-readers, and 30 HP). Participants also performed a personality test (High School Personality Questionnaire, adapted) and a fluid intelligence (Gf) measure (RAPM). Moreover, all deaf participants were rated by four teachers for the deaf. Results: there were no significant differences between deaf and HP in performance on PERVALE-S. Confusion matrices revealed that embarrassment, envy, and jealousy were the most poorly perceived emotions. Age was related only to social-emotional tasks (but not to basic emotional tasks). Emotional perception ability was related mainly to warmth and consciousness, but negatively related to tension. Meanwhile, Gf was related only to social-emotional tasks. There were no gender differences. PMID:26300828

  5. Gender Differences in Human Single Neuron Responses to Male Emotional Faces

    Directory of Open Access Journals (Sweden)

    Morgan eNewhoff

    2015-09-01

    Full Text Available Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 (6 male) epileptic patients in four brain areas: amygdala (236 neurons), hippocampus (n=270), anterior cingulate cortex (n=256), and ventromedial prefrontal cortex (n=174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n=15/66) of neurons in men were significantly affected by facial emotion, versus 8% (n=6/76) of neurons in women. A Fisher's exact test comparing the two ratios found a highly significant difference between the two (p<0.01). These results show specific differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.
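
    The group comparison reported above reduces to a 2 x 2 contingency table of emotion-responsive versus non-responsive neurons; a short SciPy illustration of that Fisher's exact test, using the counts quoted in the abstract, might look like this.

```python
# Re-computation of the proportion comparison quoted above: 15 of 66
# emotion-responsive amygdala neurons in men vs. 6 of 76 in women.
from scipy.stats import fisher_exact

table = [[15, 66 - 15],   # men: responsive, non-responsive
         [6, 76 - 6]]     # women: responsive, non-responsive
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```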

  6. Elevated responses to constant facial emotions in different faces in the human amygdala: an fMRI study of facial identity and expression

    Directory of Open Access Journals (Sweden)

    Weiller Cornelius

    2004-11-01

    Full Text Available Abstract Background Human faces provide important signals in social interactions by conveying two main types of information, individual identity and emotional expression. The ability to readily assess both the variability and the consistency among emotional expressions in different individuals is central to one's own interpretation of the imminent environment. A factorial design was used to systematically test the interaction of either constant or variable emotional expressions with constant or variable facial identities in areas involved in face processing using functional magnetic resonance imaging. Results Previous studies suggest a predominant role of the amygdala in the assessment of emotional variability. Here we extend this view by showing that this structure activated to faces with changing identities that display constant emotional expressions. Within this condition, amygdala activation was dependent on the type and intensity of displayed emotion, with significant responses to fearful expressions and, to a lesser extent, to neutral and happy expressions. In contrast, the lateral fusiform gyrus showed a binary pattern of increased activation to changing stimulus features, while it was also differentially responsive to the intensity of displayed emotion when processing different facial identities. Conclusions These results suggest that the amygdala might serve to detect constant facial emotions in different individuals, complementing its established role for detecting emotional variability.

  7. How Substance Users With ADHD Perceive the Relationship Between Substance Use and Emotional Functioning.

    Science.gov (United States)

    Mitchell, John T; Weisner, Thomas S; Jensen, Peter S; Murray, Desiree W; Molina, Brooke S G; Arnold, Eugene L; Hechtman, Lily; Swanson, James M; Hinshaw, Stephen P; Victor, Elizabeth C; Kollins, Scott H; Wells, Karen C; Belendiuk, Katherine A; Blonde, Andrew; Nguyen, Celeste; Ambriz, Lizeth; Nguyen, Jenny L

    2017-02-01

    Although substance use (SU) is elevated in ADHD and both are associated with disrupted emotional functioning, little is known about how emotions and SU interact in ADHD. We used a mixed qualitative-quantitative approach to explore this relationship. Narrative comments were coded for 67 persistent (50 ADHD, 17 local normative comparison group [LNCG]) and 25 desistent (20 ADHD, 5 LNCG) substance users from the Multimodal Treatment Study of Children with ADHD (MTA) adult follow-up (21.7-26.7 years old). SU persisters perceived that SU positively affects emotional states and that positive emotional effects outweigh negative effects. No ADHD group effects emerged. Qualitative analysis identified perceptions that cannabis enhanced positive mood for ADHD and LNCG SU persisters, and improved negative mood and ADHD symptoms for ADHD SU persisters. Perceptions about SU broadly and mood do not differentiate ADHD and non-ADHD SU persisters. However, perceptions that cannabis is therapeutic may inform ADHD-related risk for cannabis use.

  8. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents

    Directory of Open Access Journals (Sweden)

    Bianca G. van den Bulk

    2016-10-01

    Full Text Available Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N = 25), adolescents with CSA-related PTSD (N = 19) and healthy controls (N = 26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD groups reported more anxiety to fearful and neutral faces than adolescents from the control group, and adolescents from the CSA-related PTSD group reacted slower compared to the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala.

  9. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents.

    Science.gov (United States)

    van den Bulk, Bianca G; Somerville, Leah H; van Hoof, Marie-José; van Lang, Natasja D J; van der Wee, Nic J A; Crone, Eveline A; Vermeiren, Robert R J M

    2016-10-01

    Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N=25), adolescents with CSA-related PTSD (N=19) and healthy controls (N=26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD group reported more anxiety to fearful and neutral faces than adolescents from the control group and adolescents from the CSA-related PTSD group reacted slower compared to the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Do bodily expressions compete with facial expressions? Time course of integration of emotional signals from the face and the body.

    Science.gov (United States)

    Gu, Yuanyuan; Mai, Xiaoqin; Luo, Yue-jia

    2013-01-01

    The decoding of social signals from nonverbal cues plays a vital role in the social interactions of socially gregarious animals such as humans. Because nonverbal emotional signals from the face and body are normally seen together, it is important to investigate the mechanism underlying the integration of emotional signals from these two sources. We conducted a study in which the time course of the integration of facial and bodily expressions was examined via analysis of event-related potentials (ERPs) while the focus of attention was manipulated. Distinctive integrating features were found during multiple stages of processing. In the first stage, threatening information from the body was extracted automatically and rapidly, as evidenced by enhanced P1 amplitudes when the subjects viewed compound face-body images with fearful bodies compared with happy bodies. In the second stage, incongruency between emotional information from the face and the body was detected and captured by N2. Incongruent compound images elicited larger N2s than did congruent compound images. The focus of attention modulated the third stage of integration. When the subjects' attention was focused on the face, images with congruent emotional signals elicited larger P3s than did images with incongruent signals, suggesting more sustained attention and elaboration of congruent emotional information extracted from the face and body. On the other hand, when the subjects' attention was focused on the body, images with fearful bodies elicited larger P3s than did images with happy bodies, indicating more sustained attention and elaboration of threatening information from the body during evaluative processes.

  11. An examination of personality, emotional intelligence, coping, gender and subjective well-being with perceived stress (trait and state) in undergraduate students.

    OpenAIRE

    Osborne, Shona Elizabeth

    2009-01-01

    This multivariate study aimed to further understand student stress. Associations between personality, emotional intelligence, coping and subjective well-being with perceived stress (trait and state) were examined in 238 undergraduate students, using self-report measures. Gender differences in these variables were also investigated. The results showed that students low in emotional stability, extraversion, emotional intelligence, subjective well-being and those with a tendency to use emotion...

  12. Reading the face of a leader: women with low facial masculinity are perceived as competitive

    OpenAIRE

    Silberzahn, Raphael; Menges, Jochen

    2016-01-01

    In competitive settings, people prefer leaders with masculine faces. But is facial masculinity a trait that is similarly desired in men and women leaders? Across three studies, we discovered that people indeed prefer men and women leaders who have faces with masculine traits. But surprisingly, we find that people also prefer women with low facial masculinity as leaders in competitive contexts (Study 1). Our findings indicate that low facial masculinity in women, but not in men is perceived to...

  13. Abstract representations of associated emotions in the human brain.

    Science.gov (United States)

    Kim, Junsuk; Schultz, Johannes; Rohe, Tim; Wallraven, Christian; Lee, Seong-Whan; Bülthoff, Heinrich H

    2015-04-08

    Emotions can be aroused by various kinds of stimulus modalities. Recent neuroimaging studies indicate that several brain regions represent emotions at an abstract level, i.e., independently from the sensory cues from which they are perceived (e.g., face, body, or voice stimuli). If emotions are indeed represented at such an abstract level, then these abstract representations should also be activated by the memory of an emotional event. We tested this hypothesis by asking human participants to learn associations between emotional stimuli (videos of faces or bodies) and non-emotional stimuli (fractals). After successful learning, fMRI signals were recorded during the presentations of emotional stimuli and emotion-associated fractals. We tested whether emotions could be decoded from fMRI signals evoked by the fractal stimuli using a classifier trained on the responses to the emotional stimuli (and vice versa). This was implemented as a whole-brain searchlight, multivoxel activation pattern analysis, which revealed successful emotion decoding in four brain regions: posterior cingulate cortex (PCC), precuneus, MPFC, and angular gyrus. The same analysis run only on responses to emotional stimuli revealed clusters in PCC, precuneus, and MPFC. Multidimensional scaling analysis of the activation patterns revealed clear clustering of responses by emotion across stimulus types. Our results suggest that PCC, precuneus, and MPFC contain representations of emotions that can be evoked by stimuli that carry emotional information themselves or by stimuli that evoke memories of emotional stimuli, while angular gyrus is more likely to take part in emotional memory retrieval. Copyright © 2015 the authors 0270-6474/15/355655-09$15.00/0.
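
    A minimal sketch of the cross-decoding logic described above (train on responses to emotional stimuli, test on responses to the associated fractals, and vice versa); it omits the searchlight loop over voxel neighbourhoods, and all variable names are illustrative assumptions rather than the authors' code.

```python
# Minimal cross-decoding sketch: train a classifier on activation patterns
# evoked by emotional stimuli, test it on patterns evoked by the associated
# fractals, then average with the reverse direction. Names are placeholders.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def cross_decode(X_train, y_train, X_test, y_test):
    """Train on one stimulus type, test on the other; return accuracy."""
    clf = make_pipeline(StandardScaler(), LinearSVC())
    clf.fit(X_train, y_train)
    return clf.score(X_test, y_test)

# Symmetric cross-decoding accuracy, averaged over both directions:
# acc = 0.5 * (cross_decode(X_emotion, y_emotion, X_fractal, y_fractal)
#              + cross_decode(X_fractal, y_fractal, X_emotion, y_emotion))
```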

  14. Does perceived discrimination affect health? Longitudinal relationships between work discrimination and women's physical and emotional health.

    Science.gov (United States)

    Pavalko, Eliza K; Mossakowski, Krysia N; Hamilton, Vanessa J

    2003-03-01

    This study uses longitudinal data to examine the causal relationships between perceived work discrimination and women's physical and emotional health. Using data on 1,778 employed women in the National Longitudinal Survey of Mature Women, we investigate the structural and individual characteristics that predict later perceptions of discrimination and the effects of those perceptions on subsequent health. We find that perceptions of discrimination are influenced by job attitudes, prior experiences of discrimination, and work contexts, but prior health is not related to later perceptions. However, perceptions of discrimination do impact subsequent health, and these effects remain significant after controlling for prior emotional health, physical health limitations, discrimination, and job characteristics. Overall, the results provide even stronger support for the health impact of workplace discrimination and suggest a need for further longitudinal analyses of causes and consequences of perceived discrimination.

  15. Dissociable neural effects of stimulus valence and preceding context during the inhibition of responses to emotional faces.

    Science.gov (United States)

    Schulz, Kurt P; Clerkin, Suzanne M; Halperin, Jeffrey M; Newcorn, Jeffrey H; Tang, Cheuk Y; Fan, Jin

    2009-09-01

    Socially appropriate behavior requires the concurrent inhibition of actions that are inappropriate in the context. This self-regulatory function requires an interaction of inhibitory and emotional processes that recruits brain regions beyond those engaged by either process alone. In this study, we isolated brain activity associated with response inhibition and emotional processing in 24 healthy adults using event-related functional magnetic resonance imaging (fMRI) and a go/no-go task that independently manipulated the context preceding no-go trials (i.e., the number of go trials) and the valence (i.e., happy, sad, and neutral) of the face stimuli used as trial cues. Parallel quadratic trends were seen in correct inhibitions on no-go trials preceded by increasing numbers of go trials and in the associated activation for correct no-go trials in the inferior frontal gyrus pars opercularis, pars triangularis, and pars orbitalis, temporoparietal junction, superior parietal lobule, and temporal sensory association cortices. Conversely, the comparison of happy versus neutral faces and sad versus neutral faces revealed valence-dependent activation in the amygdala, anterior insula cortex, and posterior midcingulate cortex. Further, an interaction between inhibition and emotion was seen in valence-dependent variations in the quadratic trend in no-go activation in the right inferior frontal gyrus and left posterior insula cortex. These results suggest that the inhibition of responses to emotional cues involves the interaction of partly dissociable limbic and frontoparietal networks that encode emotional cues and use these cues to exert inhibitory control over the motor, attention, and sensory functions needed to perform the task, respectively. 2008 Wiley-Liss, Inc.

  16. Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?

    Science.gov (United States)

    Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K

    2017-12-01

    Disgust is a core emotion that evolved to detect and avoid the ingestion of poisonous food as well as contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or whether individual differences in trait body odor disgust may influence the processing of disgust-related information. In this preregistered study, we tested whether the classification of dynamic facial expressions as happy or disgusted, and the emotional evaluation of these facial expressions, would be affected by individual differences in body odor disgust sensitivity, and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect recognition of emotion in dynamic faces even when body odor disgust sensitivity was used as a moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs for faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect, and suggest that further studies should focus on reproducibility, specifying the experimental circumstances under which odor effects on facial expressions may be present versus absent.
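
    A hedged sketch of the kind of default Bayesian t-test this record refers to, here comparing hypothetical per-participant emotion ratings between the valeric-acid and no-odor contexts; it assumes the pingouin package, whose ttest function reports a JZS Bayes factor (BF10), and the data below are placeholders rather than values from the study.

```python
# Default (JZS) Bayes factor for a paired comparison of emotion ratings in the
# valeric-acid vs. no-odor context. Assumes the pingouin package; the rating
# arrays below are random placeholders, not data from the study.
import numpy as np
import pingouin as pg

rng = np.random.default_rng(0)
ratings_valeric = rng.normal(5.0, 1.0, 30)   # hypothetical per-participant means
ratings_no_odor = rng.normal(5.1, 1.0, 30)   # hypothetical per-participant means

res = pg.ttest(ratings_valeric, ratings_no_odor, paired=True)
# A BF10 well below 1 (e.g. < 1/3) is conventionally read as evidence for the null.
print(res[["T", "p-val", "BF10"]])
```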

  17. The influence of coping styles and perceived control on emotional distress in persons at risk for a hereditary heart disease

    NARCIS (Netherlands)

    Hoedemaekers, Ehy; Jaspers, Jan P. C.; Van Tintelen, J. Peter

    2007-01-01

    This prospective study investigates the influence of two coping styles (monitoring and blunting) and perceived control (health locus of control and mastery) on emotional distress in persons at risk of a hereditary cardiac disease. Emotional distress in people at risk for a hereditary cardiac

  18. The Moderator Role of Perceived Emotional Intelligence in the Relationship between Sources of Stress and Mental Health in Teachers.

    Science.gov (United States)

    Pulido-Martos, Manuel; Lopez-Zafra, Esther; Estévez-López, Fernando; Augusto-Landa, José María

    2016-03-03

    This study analyzes the role of Perceived Emotional Intelligence (PEI) on sources of job stress and mental health in 250 elementary school teachers from Jaén (Spain). The aim of the study was two-fold: (1) to analyze the associations between Perceived Emotional Intelligence (PEI), sources of occupational stress and mental health; and (2) to determine whether PEI moderates the relationship between sources of occupational stress and mental health. An initial sample of 250 teachers was assessed. Three questionnaires, the Trait Meta-Mood Scale, the Sources of Stress Scale in Teachers and the Medical Outcomes Study 36-item Short Form Health Survey, were used to evaluate PEI, sources of occupational stress and mental health, respectively. Teachers with higher levels of emotional attention reported lower levels of mental health (r = -.30), and PEI moderated the relationship between sources of occupational stress and emotional role. Specifically, each significant interaction (i.e., deficiencies x attention, adaptation x attention, and adaptation x clarity) made a small and unique contribution to the explanation of emotional role (all p < .05, all sr² ≈ .02). Finally, our results imply that PEI is an important moderator of the effect of teachers' occupational stressors on mental health.

  19. Facial Expression Aftereffect Revealed by Adaption to Emotion-Invisible Dynamic Bubbled Faces

    Science.gov (United States)

    Luo, Chengwen; Wang, Qingyun; Schyns, Philippe G.; Kingdom, Frederick A. A.; Xu, Hong

    2015-01-01

    Visual adaptation is a powerful tool to probe the short-term plasticity of the visual system. Adapting to local features such as oriented lines can distort our judgment of subsequently presented lines, the tilt aftereffect. The tilt aftereffect is believed to be processed at low levels of the visual cortex, such as V1. Adaptation to faces, on the other hand, can produce significant aftereffects in high-level traits such as identity, expression, and ethnicity. However, whether face adaptation necessitates awareness of face features is debatable. In the current study, we investigated whether facial expression aftereffects (FEAE) can be generated by partially visible faces. We first generated partially visible faces using the bubbles technique, in which the face was seen through randomly positioned circular apertures, and selected the bubbled faces for which the subjects were unable to identify happy or sad expressions. When the subjects adapted to static displays of these partial faces, no significant FEAE was found. However, when the subjects adapted to a dynamic video display of a series of different partial faces, a significant FEAE was observed. In both conditions, subjects could not identify the facial expression in the individual adapting faces. These results suggest that our visual system is able to integrate unrecognizable partial faces over a short period of time and that the integrated percept affects our judgment of subsequently presented faces. We conclude that FEAE can be generated by partial faces with few facial expression cues, implying that our cognitive system fills in the missing parts during adaptation, or that subcortical structures are activated by the bubbled faces without conscious recognition of emotion during adaptation. PMID:26717572

  20. [Emotional intelligence and oscillatory responses on the emotional facial expressions].

    Science.gov (United States)

    Kniazev, G G; Mitrofanova, L G; Bocharov, A V

    2013-01-01

    Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18-30 years. Participants were instructed to evaluate the emotional expression (angry, happy and neutral) of each presented face on an analog scale ranging from -100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This showed up both in their subjective evaluation of the stimuli and in a stronger EEG theta synchronization at an earlier (between 100 and 500 ms after face presentation) processing stage. Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500-870 ms), event-related theta synchronization in high emotional intelligence subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon presentation of angry faces. This suggests the existence of a mechanism that can selectively increase positive emotions and reduce negative emotions.

  1. Slow motion in films and video clips: Music influences perceived duration and emotion, autonomic physiological activation and pupillary responses.

    Science.gov (United States)

    Wöllner, Clemens; Hammerschmidt, David; Albrecht, Henning

    2018-01-01

    Slow motion scenes are ubiquitous in screen-based audiovisual media and are typically accompanied by emotional music. The strong effects of slow motion on observers are hypothetically related to heightened emotional states in which time seems to pass more slowly. These states are simulated in films and video clips, and seem to resemble such experiences in daily life. The current study investigated time perception and emotional responses to media clips containing decelerated human motion, with or without music, using psychometric and psychophysiological testing methods. Participants were presented with slow-motion scenes taken from commercial films, ballet and sports footage, as well as the same scenes converted to real-time. Results reveal that slow-motion scenes, compared to adapted real-time scenes, led to systematic underestimations of duration, lower perceived arousal but higher valence, lower respiration rates and smaller pupillary diameters. The presence of music compared to visual-only presentations strongly affected results in terms of higher accuracy in duration estimates, higher perceived arousal and valence, higher physiological activation and larger pupillary diameters, indicating higher arousal. Video genre additionally affected responses. These findings suggest that perceiving slow motion is not related to states of high arousal, but rather affects cognitive dimensions of perceived time and valence. Music influences these experiences profoundly, thus strengthening the impact of stretched time in audiovisual media.

  2. Oxytocin effects on emotional response to others' faces via serotonin system in autism: A pilot study.

    Science.gov (United States)

    Fukai, Mina; Hirosawa, Tetsu; Kikuchi, Mitsuru; Ouchi, Yasuomi; Takahashi, Tetsuya; Yoshimura, Yuko; Miyagishi, Yoshiaki; Kosaka, Hirotaka; Yokokura, Masamichi; Yoshikawa, Etsuji; Bunai, Tomoyasu; Minabe, Yoshio

    2017-09-30

    The oxytocin (OT)-related serotonergic system is thought to play an important role in the etiology and social symptoms of autism spectrum disorder (ASD). However, no evidence exists for a relation between the prosocial effect of chronic OT administration and the brain serotonergic system. Ten male subjects with ASD were administered OT for 8-10 weeks in an open-label, single-arm, non-randomized, uncontrolled manner. Before and during the OT treatment, positron emission tomography was used with the (11C)-3-amino-4-(2-[(dimethylamino)methyl]phenylthio)benzonitrile (11C-DASB) radiotracer. Then binding of the serotonin transporter (11C-DASB BPND) was estimated. The main outcome measures were changes in 11C-DASB BPND and changes in the emotional response to others' faces. No significant change was found in the emotional response to others' faces after the 8-10 week OT treatment. However, the increased serotonin transporter (SERT) level in the striatum after treatment was correlated significantly with increased negative emotional response to human faces. This study revealed a relation between changes in the serotonergic system and in prosociality after chronic OT administration. Additional studies must be conducted to verify the chronic OT effects on social behavior via the serotonergic system. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  3. Face puzzle—two new video-based tasks for measuring explicit and implicit aspects of facial emotion recognition

    Science.gov (United States)

    Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel

    2013-01-01

    Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures were further in favor of the tasks' external validity. Between group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive

  4. Effects of Acute Alcohol Consumption on the Processing of Emotion in Faces: Implications for Understanding Alcohol-Related Aggression

    Science.gov (United States)

    Attwood, Angela S.; Munafò, Marcus R.

    2016-01-01

    The negative consequences of chronic alcohol abuse are well known, but heavy episodic consumption ("binge drinking") is also associated with significant personal and societal harms. Aggressive tendencies are increased after alcohol but the mechanisms underlying these changes are not fully understood. While effects on behavioural control are likely to be important, other effects may be involved given the widespread action of alcohol. Altered processing of social signals is associated with changes in social behaviours, including aggression, but until recently there has been little research investigating the effects of acute alcohol consumption on these outcomes. Recent work investigating the effects of acute alcohol on emotional face processing has suggested reduced sensitivity to submissive signals (sad faces) and increased perceptual bias towards provocative signals (angry faces) after alcohol consumption, which may play a role in alcohol-related aggression. Here we discuss a putative mechanism that may explain how alcohol consumption influences emotional processing and subsequent aggressive responding, via disruption of OFC-amygdala connectivity. While the importance of emotional processing on social behaviours is well established, research into acute alcohol consumption and emotional processing is still in its infancy. Further research is needed and we outline a research agenda to address gaps in the literature. PMID:24920135

  5. Manipulating Greek musical modes and tempo affects perceived musical emotion in musicians and nonmusicians

    Directory of Open Access Journals (Sweden)

    D. Ramos

    2011-02-01

    Full Text Available The combined influence of tempo and mode on emotional responses to music was studied by crossing 7 changes in mode with 3 changes in tempo. Twenty-four musicians aged 19 to 25 years (12 males and 12 females) and 24 nonmusicians aged 17 to 25 years (12 males and 12 females) were required to perform two tasks: (1) listening to different musical excerpts, and (2) associating an emotion to them such as happiness, serenity, fear, anger, or sadness. ANOVA showed that increasing the tempo strongly affected the arousal (F(2,116) = 268.62, mean square error (MSE) = 0.6676, P < 0.001) and, to a lesser extent, the valence of emotional responses (F(6,348) = 8.71, MSE = 0.6196, P < 0.001). Changes in modes modulated the affective valence of the perceived emotions (F(6,348) = 4.24, MSE = 0.6764, P < 0.001). Some interactive effects were found between tempo and mode (F(1,58) = 115.6, MSE = 0.6428, P < 0.001), but, in most cases, the two parameters had additive effects. This finding demonstrates that small changes in the pitch structures of modes modulate the emotions associated with the pieces, confirming the cognitive foundation of emotional responses to music.
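
    A hedged sketch of a tempo x mode repeated-measures ANOVA on arousal ratings, in the spirit of the analysis summarized above; it assumes a long-format table with one row per participant, tempo level and mode, and all file and column names are illustrative.

```python
# Two-way repeated-measures ANOVA (tempo x mode) on arousal ratings. Assumes a
# long-format DataFrame with columns 'participant', 'tempo', 'mode', 'arousal';
# both the file name and column names are illustrative.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# df = pd.read_csv("ratings_long.csv")

def tempo_mode_anova(df: pd.DataFrame):
    return AnovaRM(df, depvar="arousal", subject="participant",
                   within=["tempo", "mode"], aggregate_func="mean").fit()

# print(tempo_mode_anova(df))  # F, df and p-values for tempo, mode, tempo:mode
```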

  6. Audiovisual integration of emotional signals from music improvisation does not depend on temporal correspondence.

    Science.gov (United States)

    Petrini, Karin; McAleer, Phil; Pollick, Frank

    2010-04-06

    In the present study we applied a paradigm often used in face-voice affect perception to solo music improvisation to examine how the emotional valence of sound and gesture are integrated when perceiving an emotion. Three brief excerpts expressing emotion produced by a drummer and three by a saxophonist were selected. From these bimodal congruent displays the audio-only, visual-only, and audiovisually incongruent conditions (obtained by combining the two signals both within and between instruments) were derived. In Experiment 1 twenty musical novices judged the perceived emotion and rated the strength of each emotion. The results indicate that sound dominated the visual signal in the perception of affective expression, though this was more evident for the saxophone. In Experiment 2 a further sixteen musical novices were asked to either pay attention to the musicians' movements or to the sound when judging the perceived emotions. The results showed no effect of visual information when judging the sound. On the contrary, when judging the emotional content of the visual information, a worsening in performance was obtained for the incongruent condition that combined different emotional auditory and visual information for the same instrument. The effect of emotionally discordant information thus became evident only when the auditory and visual signals belonged to the same categorical event despite their temporal mismatch. This suggests that the integration of emotional information may be reinforced by its semantic attributes but might be independent from temporal features. Copyright 2010 Elsevier B.V. All rights reserved.

  7. Early (M170) activation of face-specific cortex by face-like objects.

    Science.gov (United States)

    Hadjikhani, Nouchine; Kveraga, Kestutis; Naik, Paulami; Ahlfors, Seppo P

    2009-03-04

    The tendency to perceive faces in random patterns exhibiting configural properties of faces is an example of pareidolia. Perception of 'real' faces has been associated with a cortical response signal arising at approximately 170 ms after stimulus onset, but what happens when nonface objects are perceived as faces? Using magnetoencephalography, we found that objects incidentally perceived as faces evoked an early (165 ms) activation in the ventral fusiform cortex, at a time and location similar to that evoked by faces, whereas common objects did not evoke such activation. An earlier peak at 130 ms was also seen for images of real faces only. Our findings suggest that face perception evoked by face-like objects is a relatively early process, and not a late reinterpretation cognitive phenomenon.

  8. Face validity and reliability of a pictorial instrument for assessing fundamental movement skill perceived competence in young children.

    Science.gov (United States)

    Barnett, Lisa M; Ridgers, Nicola D; Zask, Avigdor; Salmon, Jo

    2015-01-01

    To determine the reliability and face validity of an instrument to assess young children's perceived fundamental movement skill competence. Validation and reliability study. A pictorial instrument based on the Test of Gross Motor Development-2 assessed perceived locomotor (six skills) and object control (six skills) competence using the format and item structure from the physical competence subscale of the Pictorial Scale of Perceived Competence and Acceptance for Young Children. Sample 1 completed object control items in May (n=32) and locomotor items in October 2012 (n=23) at two time points seven days apart. Children were asked at the end of the test-retest about their understanding of what was happening in each picture to determine face validity. Sample 2 (n=58) completed 12 items in November 2012 on a single occasion to test internal reliability only. Sample 1 children were aged 5-7 years (M=6.0, SD=0.8) at the object control assessment and 5-8 years at the locomotor assessment (M=6.5, SD=0.9). Sample 2 children were aged 6-8 years (M=7.2, SD=0.73). Intra-class correlations assessed in Sample 1 children were excellent for object control (intra-class correlation=0.78), locomotor (intra-class correlation=0.82) and all 12 skills (intra-class correlation=0.83). Face validity was acceptable. Internal consistency was adequate in both samples for each subscale and all 12 skills (alpha range 0.60-0.81). This study has provided preliminary evidence for instrument reliability and face validity. This enables future alignment between the measurement of perceived and actual fundamental movement skill competence in young children. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
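
    Illustrative versions of the two reliability statistics reported above: Cronbach's alpha for internal consistency across the 12 pictorial items, and a test-retest intra-class correlation. The alpha function uses the standard formula; the ICC call assumes the pingouin package, and all data-frame and column names are hypothetical.

```python
# Internal consistency (Cronbach's alpha) over the 12 pictorial items, plus a
# test-retest intra-class correlation. The ICC call assumes the pingouin
# package; data frames and column names are hypothetical.
import pandas as pd
import pingouin as pg

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: one row per child, one column per perceived-competence item."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Test-retest ICC from long-format data with columns
# 'child', 'occasion' (time 1 vs. time 2) and 'score':
# icc = pg.intraclass_corr(data=long_df, targets="child",
#                          raters="occasion", ratings="score")
```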

  9. Imagining sex and adapting to it: different aftereffects after perceiving versus imagining faces.

    Science.gov (United States)

    D'Ascenzo, Stefania; Tommasi, Luca; Laeng, Bruno

    2014-03-01

    A prolonged exposure (i.e., perceptual adaptation) to a male or a female face can produce changes (i.e., aftereffects) in the subsequent gender attribution of a neutral or average face, so that it appears respectively more female or more male. Studies using imagery adaptation and its aftereffects have yielded conflicting results. In the present study we used an adaptation paradigm with both imagined and perceived faces as adaptors, and assessed the aftereffects in judged masculinity/femininity when viewing an androgynous test face. We monitored eye movements and pupillary responses as a way to confirm whether participants did actively engage in visual imagery. The results indicated that both perceptual and imagery adaptation produce aftereffects, but that they run in opposite directions: a contrast effect with perception (e.g., after visual exposure to a female face, the androgynous appears as more male) and an assimilation effect with imagery (e.g., after imaginative exposure to a female face, the androgynous face appears as more female). The pupillary responses revealed dilations consistent with increased cognitive effort during the imagery phase, suggesting that the assimilation aftereffect occurred in the presence of an active and effortful mental imagery process, as also witnessed by the pattern of eye movements recorded during the imagery adaptation phase. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Emotion processing for arousal and neutral content in Alzheimer's disease.

    Science.gov (United States)

    Satler, Corina; Uribe, Carlos; Conde, Carlos; Da-Silva, Sergio Leme; Tomaz, Carlos

    2010-02-01

    Objective. To assess the ability of Alzheimer's disease (AD) patients to perceive emotional information and to assign subjective emotional rating scores to audiovisual presentations. Materials and Methods. 24 subjects (14 with AD, matched to controls for age and educational level) were studied. After neuropsychological assessment, they watched a Neutral story and then a story with Emotional content. Results. Recall scores for both stories were significantly lower in AD (Neutral and Emotional: P = .001). The control group assigned different emotional scores to each version of the test, P = .001, while ratings of the AD group did not differ, P = .32. Linear regression analyses determined the best predictors of emotional rating and recognition memory for each group among the neuropsychological test battery. Conclusions. AD patients show changes in emotional processing on declarative memory and a preserved ability to express emotions in the face of arousal content. The present findings suggest that these impairments are due to general cognitive decline.

  11. Emotional face recognition deficits and medication effects in pre-manifest through stage-II Huntington's disease.

    Science.gov (United States)

    Labuschagne, Izelle; Jones, Rebecca; Callaghan, Jenny; Whitehead, Daisy; Dumas, Eve M; Say, Miranda J; Hart, Ellen P; Justo, Damian; Coleman, Allison; Dar Santos, Rachelle C; Frost, Chris; Craufurd, David; Tabrizi, Sarah J; Stout, Julie C

    2013-05-15

    Facial emotion recognition impairments have been reported in Huntington's disease (HD). However, the nature of the impairments across the spectrum of HD remains unclear. We report on emotion recognition data from 344 participants comprising premanifest HD (PreHD) and early HD patients, and controls. In a test of recognition of facial emotions, we examined responses to six basic emotional expressions and neutral expressions. In addition, and within the early HD sample, we tested for differences in emotion recognition performance between those 'on' vs. 'off' neuroleptic or selective serotonin reuptake inhibitor (SSRI) medications. The PreHD groups showed significantly impaired recognition, compared to controls, of fearful, angry and surprised faces, whereas the early HD groups were significantly impaired across all emotions including neutral expressions. In early HD, neuroleptic use was associated with worse facial emotion recognition, whereas SSRI use was associated with better facial emotion recognition. The findings suggest that emotion recognition impairments exist across the HD spectrum, but are relatively more widespread in manifest HD than in the premanifest period. Commonly prescribed medications to treat HD-related symptoms also appear to affect emotion recognition. These findings have important implications for interpersonal communication and medication usage in HD. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  12. PERVALE-S: a new cognitive task to assess deaf people’s ability to perceive basic and social emotions

    Directory of Open Access Journals (Sweden)

    JOSE MIGUEL MESTRE

    2015-08-01

    Full Text Available PERVALE-S was developed to assess the ability of DP to perceive both social and basic emotions. PERVALE-S presents different sets of visual images of a real deaf person expressing both basic and social emotions, according to the normative standard of emotional expressions in Spanish sign language. Emotional expression stimuli were presented at two different levels of intensity (1: low; and 2: high) because DP do not distinguish the same range of frequency adverbs as hearing people (HP) do. Then, participants had to click on the most suitable emotional expression. PERVALE-S contains video instructions of a sign language interpreter to improve DP's understanding about how to use the software. DP had to watch the videos before answering the items. To test PERVALE-S, a sample of 56 individuals was recruited (18 signers, 8 lip-readers, and 30 hearing people). Participants also responded to an adapted personality test (HSPQ) and a fluid intelligence measure (RAPM). Moreover, four teachers from a deaf center rated all deaf participants. Results: there were no significant differences between DP and HP in performance in PERVALE-S. Confusion matrices revealed that embarrassment, envy, and jealousy were the emotions perceived worst by participants (DP and HP). There were no significant differences in emotional perception performance among lip-readers, signers, and hearing participants. Regarding emotional perception ability (EPA), basic emotion performance was positively related to consciousness, and negatively to tension. Social emotion performance was positively related to age and fluid intelligence, and negatively related to dominance. When an adapted instrument for assessing EPA is developed without language implications, the performance of DP and HP is closer. This instrument could be of experimental interest for eliminating language influences in EPA.
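    The confusion-matrix analysis mentioned in the results can be reproduced with standard tooling. The sketch below is illustrative only: the emotion labels, response format, and data are assumptions, not details taken from PERVALE-S.

```python
# Illustrative sketch: building a confusion matrix for an emotion-labelling
# task. Labels and responses are invented; PERVALE-S's item format is not
# reproduced here.
import numpy as np
from sklearn.metrics import confusion_matrix

EMOTIONS = ["joy", "anger", "embarrassment", "envy", "jealousy"]

# True label of each presented clip vs. the label the participant clicked.
true_labels = ["envy", "jealousy", "anger", "embarrassment", "joy", "envy"]
responses   = ["jealousy", "envy", "anger", "jealousy", "joy", "envy"]

cm = confusion_matrix(true_labels, responses, labels=EMOTIONS)
# Row i, column j = how often emotion i was answered as emotion j;
# off-diagonal mass on rows such as "envy" or "jealousy" reflects the kind
# of confusions reported for both deaf and hearing participants.
per_emotion_accuracy = cm.diagonal() / cm.sum(axis=1)
print(cm)
print(dict(zip(EMOTIONS, np.round(per_emotion_accuracy, 2))))
```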

  13. What's good for the goose is not good for the gander: Age and gender differences in scanning emotion faces.

    Science.gov (United States)

    Sullivan, Susan; Campbell, Anna; Hutton, Sam B; Ruffman, Ted

    2017-05-01

    Research indicates that older adults' (≥60 years) emotion recognition is worse than that of young adults, that young and older men's emotion recognition is worse than that of young and older women (respectively), and that older adults look at mouths relative to eyes more than young adults do. Nevertheless, previous research has not compared older men's and women's looking at emotion faces, so the present study had two aims: (a) to examine whether the tendency to look at mouths is stronger amongst older men compared with older women and (b) to examine whether men's mouth looking correlates with better emotion recognition. We examined the emotion recognition abilities and spontaneous gaze patterns of young (n = 60) and older (n = 58) males and females as they labelled emotion faces. Older men spontaneously looked more to mouths than older women, and older men's looking at mouths correlated with their emotion recognition, whereas women's looking at eyes correlated with their emotion recognition. The findings are discussed in relation to a growing body of research suggesting both age and gender differences in response to emotional stimuli and the differential efficacy of mouth and eyes looking for men and women. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. Early (N170) activation of face-specific cortex by face-like objects

    Science.gov (United States)

    Hadjikhani, Nouchine; Kveraga, Kestutis; Naik, Paulami; Ahlfors, Seppo P.

    2009-01-01

    The tendency to perceive faces in random patterns exhibiting configural properties of faces is an example of pareidolia. Perception of ‘real’ faces has been associated with a cortical response signal arising at about 170ms after stimulus onset; but what happens when non-face objects are perceived as faces? Using magnetoencephalography (MEG), we found that objects incidentally perceived as faces evoked an early (165ms) activation in the ventral fusiform cortex, at a time and location similar to that evoked by faces, whereas common objects did not evoke such activation. An earlier peak at 130 ms was also seen for images of real faces only. Our findings suggest that face perception evoked by face-like objects is a relatively early process, and not a late re-interpretation cognitive phenomenon. PMID:19218867

  15. Dispositional fear, negative affectivity, and neuroimaging response to visually suppressed emotional faces.

    Science.gov (United States)

    Vizueta, Nathalie; Patrick, Christopher J; Jiang, Yi; Thomas, Kathleen M; He, Sheng

    2012-01-02

    "Invisible" stimulus paradigms provide a method for investigating basic affective processing in clinical and non-clinical populations. Neuroimaging studies utilizing continuous flash suppression (CFS) have shown increased amygdala response to invisible fearful versus neutral faces. The current study used CFS in conjunction with functional MRI to test for differences in brain reactivity to visible and invisible emotional faces in relation to two distinct trait dimensions relevant to psychopathology: negative affectivity (NA) and fearfulness. Subjects consisted of college students (N=31) assessed for fear/fearlessness along with dispositional NA. The main brain regions of interest included the fusiform face area (FFA), superior temporal sulcus (STS), and amygdala. Higher NA, but not trait fear, was associated with enhanced response to fearful versus neutral faces in STS and right amygdala (but not FFA), within the invisible condition specifically. The finding that NA rather than fearfulness predicted degree of amygdala reactivity to suppressed faces implicates the input subdivision of the amygdala in the observed effects. Given the central role of NA in anxiety and mood disorders, the current data also support use of the CFS methodology for investigating the neurobiology of these disorders. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Explaining Emotional Attachment to a Protected Area by Visitors' Perceived Importance of Seeing Wildlife, Behavioral Connections with Nature and Sociodemographics

    NARCIS (Netherlands)

    Huigen, Paulus P.P.; Haartsen, Tialda; Folmer, Akke

    2013-01-01

    Recently, the interest in understanding emotional bonds with protected nature areas has been growing. The role of wildlife in emotional bonds with places has until now not been the focus of many studies. The aim of our paper is to explore relations between the perceived importance of seeing wildlife

  17. Facial Emotion Recognition Performance Differentiates Between Behavioral Variant Frontotemporal Dementia and Major Depressive Disorder.

    Science.gov (United States)

    Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Kressig, Reto W; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    Misdiagnosis of early behavioral variant frontotemporal dementia (bvFTD) with major depressive disorder (MDD) is not uncommon due to overlapping symptoms. The aim of this study was to improve the discrimination between these disorders using a novel facial emotion perception task. In this prospective cohort study (July 2013-March 2016), we compared 25 patients meeting Rascovsky diagnostic criteria for bvFTD, 20 patients meeting DSM-IV criteria for MDD, 21 patients meeting McKhann diagnostic criteria for Alzheimer's disease dementia, and 31 healthy participants on a novel emotion intensity rating task comprising morphed low-intensity facial stimuli. Participants were asked to rate the intensity of morphed faces on the congruent basic emotion (eg, rating on sadness when sad face is shown) and on the 5 incongruent basic emotions (eg, rating on each of the other basic emotions when sad face is shown). While bvFTD patients underrated congruent emotions, Alzheimer's disease dementia patients perceived emotions similarly to healthy participants, indicating no impact of cognitive impairment on rating scores. Our congruent and incongruent facial emotion intensity rating task allows a detailed assessment of facial emotion perception in patient populations. By using this simple task, we achieved an almost complete discrimination between bvFTD and MDD, potentially helping improve the diagnostic certainty in early bvFTD. © Copyright 2018 Physicians Postgraduate Press, Inc.

  18. Neural markers of emotional face perception across psychotic disorders and general population.

    Science.gov (United States)

    Sabharwal, Amri; Kotov, Roman; Szekely, Akos; Leung, Hoi-Chung; Barch, Deanna M; Mohanty, Aprajita

    2017-07-01

    There is considerable variation in negative and positive symptoms of psychosis, global functioning, and emotional face perception (EFP), not only in schizophrenia but also in other psychotic disorders and healthy individuals. However, EFP impairment and its association with worse symptoms and global functioning have been examined largely in the domain of schizophrenia. The present study adopted a dimensional approach to examine the association of behavioral and neural measures of EFP with symptoms of psychosis and global functioning across individuals with schizophrenia spectrum (SZ; N = 28) and other psychotic (OP; N = 29) disorders, and never-psychotic participants (NP; N = 21). Behavioral and functional MRI data were recorded as participants matched emotional expressions of faces and geometrical shapes. Lower accuracy and increased activity in early visual regions, hippocampus, and amygdala during emotion versus shape matching were associated with higher negative, but not positive, symptoms and lower global functioning, across all participants. This association remained even after controlling for group-related (SZ, OP, and NP) variance, dysphoria, and antipsychotic medication status, except in amygdala. Furthermore, negative symptoms mediated the relationship between behavioral and brain EFP measures and global functioning. This study provides some of the first evidence supporting the specific relationship of EFP measures with negative symptoms and global functioning across psychotic and never-psychotic samples, and transdiagnostically across different psychotic disorders. Present findings help bridge the gap between basic EFP-related neuroscience research and clinical research in psychosis, and highlight EFP as a potential symptom-specific marker that tracks global functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
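    For the mediation step reported here (negative symptoms mediating the link between EFP measures and global functioning), one common estimation strategy is a bootstrapped indirect effect. The sketch below shows that strategy on simulated stand-in variables; it is not the study's actual pipeline.

```python
# Hypothetical sketch of a simple mediation analysis (X -> M -> Y) with a
# bootstrapped indirect effect. Variables are simulated stand-ins for
# EFP accuracy (X), negative symptoms (M), and global functioning (Y).
import numpy as np

rng = np.random.default_rng(0)
n = 78
x = rng.normal(size=n)                                   # EFP accuracy
m = -0.5 * x + rng.normal(scale=0.8, size=n)             # negative symptoms
y = -0.6 * m + 0.1 * x + rng.normal(scale=0.8, size=n)   # global functioning

def indirect_effect(x, m, y):
    # Path a: regress M on X; path b: partial effect of M on Y controlling X.
    a = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), m, rcond=None)[0][1]
    b = np.linalg.lstsq(np.column_stack([np.ones_like(x), x, m]), y, rcond=None)[0][2]
    return a * b

boot = np.array([
    indirect_effect(x[idx], m[idx], y[idx])
    for idx in (rng.integers(0, n, size=n) for _ in range(2000))
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {indirect_effect(x, m, y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```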

  19. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    Science.gov (United States)

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  20. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    OpenAIRE

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybr...

  2. Sad music induces pleasant emotion

    Science.gov (United States)

    Kawakami, Ai; Furukawa, Kiyoshi; Katahira, Kentaro; Okanoya, Kazuo

    2013-01-01

    In general, sad music is thought to cause us to experience sadness, which is considered an unpleasant emotion. As a result, the question arises as to why we listen to sad music if it evokes sadness. One possible answer to this question is that we may actually feel positive emotions when we listen to sad music. This suggestion may appear to be counterintuitive; however, in this study, by dividing musical emotion into perceived emotion and felt emotion, we investigated this potential emotional response to music. We hypothesized that felt and perceived emotion may not actually coincide in this respect: sad music would be perceived as sad, but the experience of listening to sad music would evoke positive emotions. A total of 44 participants listened to musical excerpts and provided data on perceived and felt emotions by rating 62 descriptive words or phrases related to emotions on a scale that ranged from 0 (not at all) to 4 (very much). The results revealed that the sad music was perceived to be more tragic, whereas the actual experiences of the participants listening to the sad music induced them to feel more romantic, more blithe, and less tragic emotions than they actually perceived with respect to the same music. Thus, the participants experienced ambivalent emotions when they listened to the sad music. After considering the possible reasons that listeners were induced to experience emotional ambivalence by the sad music, we concluded that the formulation of a new model would be essential for examining the emotions induced by music and that this new model must entertain the possibility that what we experience when listening to music is vicarious emotion. PMID:23785342

  3. Task demands modulate decision and eye movement responses in the chimeric face test: examining the right hemisphere processing account

    Directory of Open Access Journals (Sweden)

    Jason eCoronel

    2014-03-01

    Full Text Available A large and growing body of work, conducted in both brain-intact and brain-damaged populations, has used the free viewing chimeric face test as a measure of hemispheric dominance for the extraction of emotional information from faces. These studies generally show that normal right-handed individuals tend to perceive chimeric faces as more emotional if the emotional expression is presented on the half of the face to the viewer’s left (left hemiface). However, the mechanisms underlying this lateralized bias remain unclear. Here, we examine the extent to which this bias is driven by right hemisphere processing advantages versus default scanning biases in a unique way -- by changing task demands. In particular, we compare the original task with one in which right-hemisphere-biased processing cannot provide a decision advantage. Our behavioral and eye-movement data are inconsistent with the predictions of a default scanning bias account and support the idea that the left hemiface bias found in the chimeric face test is largely due to strategic use of right hemisphere processing mechanisms.
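    A left-hemiface bias in the free-viewing chimeric face test is typically scored as a simple laterality index over trial-level choices. The snippet below shows one plausible scoring convention on invented data; it is not necessarily the authors' exact procedure.

```python
# Illustrative laterality-index computation for a chimeric face test.
# Each trial records whether the chimera judged "more emotional" had the
# emotional half on the viewer's left. Data are invented.
import numpy as np

# 1 = chose the left-hemiface-emotional chimera, 0 = chose the other one.
choices = np.array([1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1])

p_left = choices.mean()
# One common convention: index in [-1, 1], positive = leftward bias.
laterality_index = 2 * p_left - 1
print(f"proportion left-hemiface choices = {p_left:.2f}, "
      f"laterality index = {laterality_index:+.2f}")
```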

  4. Economic Disadvantage, Perceived Family Life Quality, and Emotional Well-Being in Chinese Adolescents: A Longitudinal Study

    Science.gov (United States)

    Shek, Daniel T. L.

    2008-01-01

    Over three consecutive years, Chinese secondary school students experiencing and not experiencing economic disadvantage (n = 280 and 2,187, respectively) responded to measures of perceived family life quality (parenting attributes and parent-child relational quality) and emotional well-being (hopelessness, mastery, life satisfaction and…

  5. The Impact of Students' Perceived Emotional Intelligence, Social Attitudes and Teacher Expectations on Academic Performance

    Science.gov (United States)

    Jimenez-Morales, M. Isabel; Lopez-Zafra, Esther

    2013-01-01

    Introduction: The aim of this study is to analyze the role that Perceived Emotional Intelligence and social competences have on academic performance. Furthermore, we analyze the role of teacher's expectancies on performance in secondary school students. Method: One hundred ninety three students (50.7% male and 49.3 % female) from the first and…

  6. Major depressive disorder alters perception of emotional body movements

    Directory of Open Access Journals (Sweden)

    Morten eKaletsch

    2014-01-01

    Full Text Available Much recent research has shown an association between mood disorders and altered emotion perception. However, these studies were conducted mainly with stimuli such as faces. This is the first study to examine possible differences in how people with major depressive disorder (MDD) and healthy controls perceive emotions expressed via body movements. 30 patients with MDD and 30 healthy controls observed video scenes of human interactions conveyed by point-light displays (PLDs). They rated the depicted emotions and judged their confidence in their rating. Results showed that patients with MDD rated the depicted interactions more negatively than healthy controls. They also rated interactions with negative emotionality as being more intense and were more confident in their ratings. It is concluded that patients with MDD exhibit altered emotion perception compared to healthy controls when rating emotions expressed via body movements depicted in PLDs.

  7. Differences in neural and cognitive response to emotional faces in middle-aged dizygotic twins at familial risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Svendsen, A M B; Harmer, C J

    2017-01-01

    BACKGROUND: Negative bias and aberrant neural processing of emotional faces are trait-marks of depression but findings in healthy high-risk groups are conflicting. METHODS: Healthy middle-aged dizygotic twins (N = 42) underwent functional magnetic resonance imaging (fMRI): 22 twins had a co-twin history of depression (high-risk) and 20 were without co-twin history of depression (low-risk). During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task ... the amygdala and ventral prefrontal cortex and pregenual anterior cingulate. This was accompanied by greater fear-specific fronto-temporal response and reduced fronto-occipital response to all emotional faces relative to baseline. The risk groups showed no differences in mood, subjective state or coping...

  8. Matching faces with emotional expressions

    Directory of Open Access Journals (Sweden)

    Wenfeng eChen

    2011-08-01

    Full Text Available There is some evidence that faces with a happy expression are recognized better than faces with other expressions. However, little is known about whether this happy face advantage also applies to perceptual face matching, and whether similar differences exist among other expressions. Using a sequential matching paradigm, we systematically compared the effects of seven basic facial expressions on identity recognition. Identity matching was quickest when a pair of faces had an identical happy/sad/neutral expression, poorer when they had a fearful/surprised/angry expression, and poorest when they had a disgusted expression. Faces with a happy/sad/fearful/surprised expression were matched faster than those with an angry/disgusted expression when the second face in a pair had a neutral expression. These results demonstrate that effects of facial expression on identity recognition are not limited to happy faces when a learned face is immediately tested. The results suggest different influences of expression in perceptual matching and long-term recognition memory.

  9. Visual Scanning Patterns and Executive Function in Relation to Facial Emotion Recognition in Aging

    Science.gov (United States)

    Circelli, Karishma S.; Clark, Uraina S.; Cronin-Golomb, Alice

    2012-01-01

    Objective The ability to perceive facial emotion varies with age. Relative to younger adults (YA), older adults (OA) are less accurate at identifying fear, anger, and sadness, and more accurate at identifying disgust. Because different emotions are conveyed by different parts of the face, changes in visual scanning patterns may account for age-related variability. We investigated the relation between scanning patterns and recognition of facial emotions. Additionally, as frontal-lobe changes with age may affect scanning patterns and emotion recognition, we examined correlations between scanning parameters and performance on executive function tests. Methods We recorded eye movements from 16 OA (mean age 68.9) and 16 YA (mean age 19.2) while they categorized facial expressions and non-face control images (landscapes), and administered standard tests of executive function. Results OA were less accurate than YA at identifying fear. Performance on executive function tests correlated with recognition of sad expressions and with scanning patterns for fearful, sad, and surprised expressions. Conclusion We report significant age-related differences in visual scanning that are specific to faces. The observed relation between scanning patterns and executive function supports the hypothesis that frontal-lobe changes with age may underlie some changes in emotion recognition. PMID:22616800

  10. Training approach-avoidance of smiling faces affects emotional vulnerability in socially anxious individuals

    Science.gov (United States)

    Rinck, Mike; Telli, Sibel; Kampmann, Isabel L.; Woud, Marcella L.; Kerstholt, Merel; te Velthuis, Sarai; Wittkowski, Matthias; Becker, Eni S.

    2013-01-01

    Previous research revealed an automatic behavioral bias in high socially anxious individuals (HSAs): although their explicit evaluations of smiling faces are positive, they show automatic avoidance of these faces. This is reflected by faster pushing than pulling of smiling faces in an Approach-Avoidance Task (AAT; Heuer et al., 2007). The current study addressed the causal role of this avoidance bias for social anxiety. To this end, we used the AAT to train HSAs, either to approach smiling faces or to avoid them. We examined whether such an AAT training could change HSAs' automatic avoidance tendencies, and if yes, whether AAT effects would generalize to a new approach task with new facial stimuli, and to mood and anxiety in a social threat situation (a video-recorded self-presentation). We found that HSAs trained to approach smiling faces did indeed approach female faces faster after the training than HSAs trained to avoid smiling faces. Moreover, approach-faces training reduced emotional vulnerability: it led to more positive mood and lower anxiety after the self-presentation than avoid-faces training. These results suggest that automatic approach-avoidance tendencies have a causal role in social anxiety, and that they can be modified by a simple computerized training. This may open new avenues in the therapy of social phobia. PMID:23970862
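    The avoidance bias measured by the AAT is usually quantified as a pull-minus-push reaction-time difference. The sketch below computes such a bias score per participant; the column names and median-based scoring are assumptions for illustration, not the exact procedure of Heuer et al. or of this training study.

```python
# Hypothetical AAT bias score: median pull RT minus median push RT for
# smiling faces. Positive values = relative avoidance (faster pushing than
# pulling). The data layout is invented for illustration.
import pandas as pd

trials = pd.DataFrame({
    "participant": [1, 1, 1, 1, 2, 2, 2, 2],
    "movement":    ["push", "pull", "push", "pull", "push", "pull", "push", "pull"],
    "rt_ms":       [620, 700, 640, 690, 655, 650, 660, 645],
})

medians = trials.groupby(["participant", "movement"])["rt_ms"].median().unstack()
bias = medians["pull"] - medians["push"]   # > 0: avoidance of smiling faces
print(bias)
```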

  11. Training Approach-Avoidance of Smiling Faces Affects Emotional Vulnerability in Socially Anxious Individuals

    Directory of Open Access Journals (Sweden)

    Mike eRinck

    2013-08-01

    Full Text Available Previous research revealed an automatic behavioral bias in high socially anxious individuals (HSAs): Although their explicit evaluations of smiling faces are positive, they show automatic avoidance of these faces. This is reflected by faster pushing than pulling of smiling faces in an Approach-Avoidance Task (AAT; Heuer, Rinck, & Becker, 2007). The current study addressed the causal role of this avoidance bias for social anxiety. To this end, we used the AAT to train HSAs, either to approach smiling faces or to avoid them. We examined whether such an AAT training could change HSAs’ automatic avoidance tendencies, and if yes, whether AAT effects would generalize to a new approach task with new facial stimuli, and to mood and anxiety in a social threat situation (a video-recorded self-presentation). We found that HSAs trained to approach smiling faces did indeed approach female faces faster after the training than HSAs trained to avoid smiling faces. Moreover, approach-faces training reduced emotional vulnerability: It led to more positive mood and lower anxiety after the self-presentation than avoid-faces training. These results suggest that automatic approach-avoidance tendencies have a causal role in social anxiety, and that they can be modified by a simple computerized training. This may open new avenues in the therapy of social phobia.

  12. Unconsciously Triggered Emotional Conflict by Emotional Facial Expressions

    Science.gov (United States)

    Chen, Antao; Cui, Qian; Zhang, Qinglin

    2013-01-01

    The present study investigated whether emotional conflict and emotional conflict adaptation could be triggered by unconscious emotional information as assessed in a backward-masked affective priming task. Participants were instructed to identify the valence of a face (e.g., happy or sad) preceded by a masked happy or sad face. The results of two experiments revealed the emotional conflict effect but no emotional conflict adaptation effect. This demonstrates that emotional conflict can be triggered by unconsciously presented emotional information, but participants may not adjust their subsequent performance trial-by-trial to reduce this conflict. PMID:23409084

  13. Love withdrawal predicts electrocortical responses to emotional faces with performance feedback: a follow-up and extension.

    Science.gov (United States)

    Huffmeijer, Renske; Bakermans-Kranenburg, Marian J; Alink, Lenneke R A; van IJzendoorn, Marinus H

    2014-06-02

    Parental use of love withdrawal is thought to affect children's later psychological functioning because it creates a link between children's performance and relational consequences. In addition, recent studies have begun to show that experiences of love withdrawal also relate to the neural processing of socio-emotional information relevant to a performance-relational consequence link, and can moderate effects of oxytocin on social information processing and behavior. The current study follows up on our previous results by attempting to confirm and extend previous findings indicating that experiences of maternal love withdrawal are related to electrocortical responses to emotional faces presented with performance feedback. More maternal love withdrawal was related to enhanced early processing of facial feedback stimuli (reflected in more positive VPP amplitudes, and confirming previous findings). However, attentional engagement with and processing of the stimuli at a later stage were diminished in those reporting higher maternal love withdrawal (reflected in less positive LPP amplitudes, and diverging from previous findings). Maternal love withdrawal affects the processing of emotional faces presented with performance feedback differently in different stages of neural processing.

  14. Association of Maternal Interaction with Emotional Regulation in 4 and 9 Month Infants During the Still Face Paradigm

    Science.gov (United States)

    Lowe, Jean R.; MacLean, Peggy C.; Duncan, Andrea F.; Aragón, Crystal; Schrader, Ronald M.; Caprihan, Arvind; Phillips, John P.

    2013-01-01

    This study used the Still Face Paradigm to investigate the relationship of maternal interaction to infants’ emotion regulation responses. Seventy infant-mother dyads were seen at 4 months and 25 of these same dyads were re-evaluated at 9 months. Maternal interactions were coded for attention seeking and contingent responding. Emotional regulation was described by infant stress reaction and overall positive affect. Results indicated that at both 4 and 9 months mothers who used more contingent responding interactions had infants who showed more positive affect. In contrast, mothers who used more attention-seeking play had infants who showed less positive affect after the Still Face Paradigm. Patterns of stress reaction were reversed, as mothers who used more attention-seeking play had infants with less negative affect. Implications for intervention and emotional regulation patterns over time are discussed. PMID:22217393

  15. Perception of emotion in facial stimuli: The interaction of ADRA2A and COMT genotypes, and sex.

    Science.gov (United States)

    Tamm, Gerly; Kreegipuu, Kairi; Harro, Jaanus

    2016-01-04

    Emotional facial stimuli are important social signals that are essential to perceive and recognize in order to make appropriate decisions and responses in everyday communication. The ability to voluntarily guide attention to perceive and recognize emotions, and to react to them, varies largely across individuals, and has a strong genetic component (Friedman et al., 2008). Two key genetic variants of the catecholamine system that have been related to emotion perception and attention are the catechol-O-methyl transferase genetic variant (COMT Val158Met) and the α2A-receptor gene promoter polymorphism (ADRA2A C-1291G), respectively. So far, the interaction of the two with sex in emotion perception has not been studied. A multilevel modeling method was applied to study how COMT Val158Met, ADRA2A C-1291G and sex are associated with measures of emotion perception in a large sample of young adults. Participants (n=506) completed emotion recognition and behavioral emotion detection tasks. It was found that COMT Val158Met genotype in combination with ADRA2A C-1291G and sex predicts emotion detection, and perception of valence and arousal. In simple visual detection, the ADRA2A C-1291G G-allele leads to slower detection of a highly arousing face (scheming), which is modulated by each additional COMT Val158Met Met-allele and by male sex, both predicting faster responses. The combination of G-allele, Met-allele and male sex also predicts higher perceived negativity in sad faces. No effects of C-1291G, Val158Met, or sex were found on verbal emotion recognition. Applying the findings to study the interplay between catechol-O-methyl transferase activity and α2A-receptors in emotion perception disorders (such as ADHD, autism and schizophrenia) in men and women would be the next step towards understanding individual differences in emotion perception. Copyright © 2015 Elsevier Inc. All rights reserved.
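    A gene-by-gene-by-sex interaction of this kind is commonly tested with a mixed-effects model. The sketch below shows one such specification in statsmodels; the column names, file name, and random-effects structure are assumptions chosen for illustration, not details taken from the paper.

```python
# Illustrative mixed-effects specification for a COMT x ADRA2A x sex
# interaction on detection reaction times. Column names ("rt", "comt",
# "adra2a", "sex", "subject") and the input file are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("emotion_detection_trials.csv")  # hypothetical trial-level data

model = smf.mixedlm(
    "rt ~ C(comt) * C(adra2a) * C(sex)",  # full three-way interaction
    data=df,
    groups=df["subject"],                 # random intercept per participant
)
result = model.fit()
print(result.summary())
```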

  16. The Role of Perceived In-group Moral Superiority in Reparative Intentions and Approach Motivation

    Directory of Open Access Journals (Sweden)

    Zsolt P. Szabó

    2017-05-01

    Full Text Available Three studies examined how members of a national group react to in-group wrongdoings. We expected that perceived in-group moral superiority would lead to unwillingness to repair the aggression. We also expected that internal-focused emotions such as group-based guilt and group-based shame would predict specific, misdeed-related reparative intentions but not general approach motivation toward the victim groups. In Study 1, facing the in-group’s recent aggression, participants who believed that the Hungarians have been more moral throughout their history than members of other nations used more exonerating cognitions, experienced less in-group critical emotions and showed less willingness to provide reparations for the members of the victim group. Study 2 and Study 3 confirmed most findings of Study 1. Perceived in-group moral superiority directly or indirectly reduced willingness to provide either general or specific reparations, while internally focused in-group critical emotions predicted specific misdeed-related reparative intentions but not general approach motivation. The role of emotional attachment to the in-group is considered.

  17. The Role of Perceived In-group Moral Superiority in Reparative Intentions and Approach Motivation.

    Science.gov (United States)

    Szabó, Zsolt P; Mészáros, Noémi Z; Csertő, István

    2017-01-01

    Three studies examined how members of a national group react to in-group wrongdoings. We expected that perceived in-group moral superiority would lead to unwillingness to repair the aggression. We also expected that internal-focused emotions such as group-based guilt and group-based shame would predict specific, misdeed-related reparative intentions but not general approach motivation toward the victim groups. In Study 1, facing the in-group's recent aggression, participants who believed that the Hungarians have been more moral throughout their history than members of other nations used more exonerating cognitions, experienced less in-group critical emotions and showed less willingness to provide reparations for the members of the victim group. Study 2 and Study 3 confirmed most findings of Study 1. Perceived in-group moral superiority directly or indirectly reduced willingness to provide either general or specific reparations, while internally focused in-group critical emotions predicted specific misdeed-related reparative intentions but not general approach motivation. The role of emotional attachment to the in-group is considered.

  18. Neglect in human communication: quantifying the cost of cell-phone interruptions in face to face dialogs.

    Science.gov (United States)

    Lopez-Rosenfeld, Matías; Calero, Cecilia I; Fernandez Slezak, Diego; Garbulsky, Gerry; Bergman, Mariano; Trevisan, Marcos; Sigman, Mariano

    2015-01-01

    There is a prevailing belief that interruptions using cellular phones during face to face interactions may affect severely how people relate and perceive each other. We set out to determine this cost quantitatively through an experiment performed in dyads, in a large audience in a TEDx event. One of the two participants (the speaker) narrates a story vividly. The listener is asked to deliberately ignore the speaker during part of the story (for instance, attending to their cell-phone). The speaker is not aware of this treatment. We show that total amount of attention is the major factor driving subjective beliefs about the story and the conversational partner. The effects are mostly independent on how attention is distributed in time. All social parameters of human communication are affected by attention time with a sole exception: the perceived emotion of the story. Interruptions during day-to-day communication between peers are extremely frequent. Our data should provide a note of caution, by indicating that they have a major effect on the perception people have about what they say (whether it is interesting or not . . .) and about the virtues of the people around them.

  19. Neglect in human communication: quantifying the cost of cell-phone interruptions in face to face dialogs.

    Directory of Open Access Journals (Sweden)

    Matías Lopez-Rosenfeld

    Full Text Available There is a prevailing belief that interruptions using cellular phones during face to face interactions may affect severely how people relate and perceive each other. We set out to determine this cost quantitatively through an experiment performed in dyads, in a large audience in a TEDx event. One of the two participants (the speaker) narrates a story vividly. The listener is asked to deliberately ignore the speaker during part of the story (for instance, attending to their cell-phone). The speaker is not aware of this treatment. We show that total amount of attention is the major factor driving subjective beliefs about the story and the conversational partner. The effects are mostly independent on how attention is distributed in time. All social parameters of human communication are affected by attention time with a sole exception: the perceived emotion of the story. Interruptions during day-to-day communication between peers are extremely frequent. Our data should provide a note of caution, by indicating that they have a major effect on the perception people have about what they say (whether it is interesting or not . . .) and about the virtues of the people around them.

  20. Early life stress and trauma and enhanced limbic activation to emotionally valenced faces in depressed and healthy children.

    Science.gov (United States)

    Suzuki, Hideo; Luby, Joan L; Botteron, Kelly N; Dietrich, Rachel; McAvoy, Mark P; Barch, Deanna M

    2014-07-01

    Previous studies have examined the relationships between structural brain characteristics and early life stress in adults. However, there is limited evidence for functional brain variation associated with early life stress in children. We hypothesized that early life stress and trauma would be associated with increased functional brain activation response to negative emotional faces in children with and without a history of depression. Psychiatric diagnosis and life events in children (starting at age 3-5 years) were assessed in a longitudinal study. A follow-up magnetic resonance imaging (MRI) study acquired data (N = 115 at ages 7-12, 51% girls) on functional brain response to fearful, sad, and happy faces relative to neutral faces. We used a region-of-interest mask within cortico-limbic areas and conducted regression analyses and repeated-measures analysis of covariance. Greater activation responses to fearful, sad, and happy faces in the amygdala and its neighboring regions were found in children with greater life stress. Moreover, an association between life stress and left hippocampal and globus pallidus activity depended on children's diagnostic status. Finally, all children with greater life trauma showed greater bilateral amygdala and cingulate activity specific to sad faces but not the other emotional faces, although right amygdala activity was moderated by psychiatric status. These findings suggest that limbic hyperactivity may be a biomarker of early life stress and trauma in children and may have implications in the risk trajectory for depression and other stress-related disorders. However, this pattern varied based on emotion type and history of psychopathology. Copyright © 2014 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  1. Perceived threat and perceived neglect: Couples' underlying concerns during conflict.

    Science.gov (United States)

    Sanford, Keith

    2010-06-01

    The Couples Underlying Concern Inventory assesses 2 fundamental types of distress that couples experience during interpersonal conflict. Perceived threat involves a perception that one's partner is blaming and controlling the self. Perceived neglect involves a perception that one's partner is failing to make desired contributions or investments. Scales measuring these 2 underlying concerns were developed in Study 1, where a sample of 1,224 married people rated a pool of 57 words describing oneself and perceptions of a partner during a specific episode of conflict. Factor analysis identified 2 dimensions, and 2 brief 8-item scales were created. In Study 2, a sample of 2,315 married people completed the resulting 16-item inventory along with 10 self-report scales measuring types of emotion, cognition, and behavior during conflict. A 2-dimensional factor structure was confirmed, and measurement invariance was demonstrated across 4 racial/ethnic groups. Both perceived threat and perceived neglect correlated with relationship satisfaction and conflict communication. More importantly, each concern was associated with a different, and theoretically expected, set of variables regarding self emotion, emotion perceived in a partner, and cognition during conflict.
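    The scale-construction step described above (factor-analyzing a word pool and retaining two 8-item scales) can be prototyped as follows. Only the two-factor extraction is taken from the abstract; the simulated data, the lack of rotation, and the item-selection rule are assumptions.

```python
# Illustrative two-factor extraction over item ratings, loosely mirroring
# the scale-construction step described above. The ratings matrix is
# simulated; the item pool, rotation, and cutoffs are assumptions.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_respondents, n_items = 1224, 57

# Simulate two latent concerns (threat, neglect) driving the item ratings.
latent = rng.normal(size=(n_respondents, 2))
loadings = rng.normal(scale=0.6, size=(2, n_items))
ratings = latent @ loadings + rng.normal(scale=1.0, size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=2, random_state=0).fit(ratings)
# Keep the 8 items loading most strongly on each factor (one simple rule).
for k in range(2):
    top_items = np.argsort(-np.abs(fa.components_[k]))[:8]
    print(f"factor {k + 1}: items {sorted(top_items.tolist())}")
```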

  2. Emotion Processing for Arousal and Neutral Content in Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Corina Satler

    2009-01-01

    Full Text Available Objective. To assess the ability of Alzheimer's disease (AD) patients to perceive emotional information and to assign subjective emotional rating scores to audiovisual presentations. Materials and Methods. 24 subjects (14 with AD, matched to controls for age and educational levels) were studied. After neuropsychological assessment, they watched a Neutral story and then a story with Emotional content. Results. Recall scores for both stories were significantly lower in AD (Neutral and Emotional: P=.001). CG assigned different emotional scores for each version of the test, P=.001, while ratings of AD did not differ, P=.32. Linear regression analyses determined the best predictors of emotional rating and recognition memory for each group among the neuropsychological test battery. Conclusions. AD patients show changes in emotional processing on declarative memory and a preserved ability to express emotions in the face of arousing content. The present findings suggest that these impairments are due to general cognitive decline.

  3. Immediacy Bias in Emotion Perception: Current Emotions Seem More Intense than Previous Emotions

    Science.gov (United States)

    Van Boven, Leaf; White, Katherine; Huber, Michaela

    2009-01-01

    People tend to perceive immediate emotions as more intense than previous emotions. This "immediacy bias" in emotion perception occurred for exposure to emotional but not neutral stimuli (Study 1), when emotional stimuli were separated by both shorter (2 s; Studies 1 and 2) and longer (20 min; Studies 3, 4, and 5) delays, and for emotional…

  4. Familiar face + novel face = familiar face? Representational bias in the perception of morphed faces in chimpanzees

    Directory of Open Access Journals (Sweden)

    Yoshi-Taka Matsuda

    2016-08-01

    Full Text Available Highly social animals possess a well-developed ability to distinguish the faces of familiar from novel conspecifics to induce distinct behaviors for maintaining society. However, the behaviors of animals when they encounter ambiguous faces of familiar yet novel conspecifics, e.g., strangers with faces resembling known individuals, have not been well characterised. Using a morphing technique and preferential-looking paradigm, we address this question via the chimpanzee's facial-recognition abilities. We presented eight subjects with three types of stimuli: (1) familiar faces, (2) novel faces and (3) intermediate morphed faces that were 50% familiar and 50% novel faces of conspecifics. We found that chimpanzees spent more time looking at novel faces and scanned novel faces more extensively than familiar or intermediate faces. Interestingly, chimpanzees looked at intermediate faces in a manner similar to familiar faces with regards to the fixation duration, fixation count, and saccade length for facial scanning, even though the participant was encountering the intermediate faces for the first time. We excluded the possibility that subjects merely detected and avoided traces of morphing in the intermediate faces. These findings suggest a feeling-of-familiarity bias: chimpanzees perceive familiarity with an intermediate face by detecting traces of a known individual, as 50% alternation is sufficient to perceive familiarity.

  5. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Glerup, L; Vestbo, C

    2015-01-01

    BACKGROUND: Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression. METHOD: Thirty ... while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping strategies. RESULTS: High-risk twins showed increased neural response to happy and fearful faces ... processing. These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attention vigilance for fearful faces and were slower at recognizing facial expressions relative to low-risk twins...

  6. Facing Complaining Customer and Suppressed Emotion at Worksite Related to Sleep Disturbance in Korea.

    Science.gov (United States)

    Lim, Sung Shil; Lee, Wanhyung; Hong, Kwanyoung; Jeung, Dayee; Chang, Sei Jin; Yoon, Jin Ha

    2016-11-01

    This study aimed to investigate the effect of facing complaining customers and suppressing emotion at the worksite on sleep disturbance among the working population. We enrolled 13,066 paid workers (male = 6,839, female = 6,227) from the Working Condition Survey (2011). The odds ratio (OR) and 95% confidence intervals (CI) for sleep disturbance occurrence were calculated using multiple logistic regression models. Workers who always engaged complaining customers had a significantly higher risk for sleep disturbance than those who rarely did (OR [95% CI]: 5.46 [3.43-8.68] in male and 5.59 [3.30-9.46] in female workers). The OR (95% CI) for sleep disturbance was 1.78 (1.16-2.73) and 1.63 (1.02-2.63) for male and female workers, respectively, who always suppressed their emotions at the workplace compared with those who rarely did. Compared to those who both rarely engaged complaining customers and rarely suppressed their emotions at work, the OR (CI) for sleep disturbance was 9.66 (4.34-20.80) and 10.17 (4.46-22.07) for men and women always exposed to both factors, respectively. Sleep disturbance was affected by interactions of both emotional demands (engaging complaining customers and suppressing emotions at the workplace). The level of emotional demand, including engaging complaining customers and suppressing emotions at the workplace, is significantly associated with sleep disturbance among the Korean working population.
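    The odds ratios and confidence intervals reported here come from multiple logistic regression. The sketch below shows the standard way to obtain them with statsmodels; the variable names, file name, and binary coding of the exposures are assumptions made for the example.

```python
# Illustrative logistic regression producing odds ratios and 95% CIs for
# sleep disturbance. Column names and the binary coding of the exposures
# are placeholders; the survey's actual covariates are not modeled here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("working_condition_survey.csv")  # hypothetical file

model = smf.logit(
    "sleep_disturbance ~ engages_complaining_customers + suppresses_emotions + C(sex)",
    data=df,
)
fit = model.fit()

odds_ratios = pd.DataFrame({
    "OR": np.exp(fit.params),
    "CI_lower": np.exp(fit.conf_int()[0]),
    "CI_upper": np.exp(fit.conf_int()[1]),
})
print(odds_ratios)
```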

  7. Cultural similarities and differences in perceiving and recognizing facial expressions of basic emotions.

    Science.gov (United States)

    Yan, Xiaoqian; Andrews, Timothy J; Young, Andrew W

    2016-03-01

    The ability to recognize facial expressions of basic emotions is often considered a universal human ability. However, recent studies have suggested that this commonality has been overestimated and that people from different cultures use different facial signals to represent expressions (Jack, Blais, Scheepers, Schyns, & Caldara, 2009; Jack, Caldara, & Schyns, 2012). We investigated this possibility by examining similarities and differences in the perception and categorization of facial expressions between Chinese and white British participants using whole-face and partial-face images. Our results showed no cultural difference in the patterns of perceptual similarity of expressions from whole-face images. When categorizing the same expressions, however, both British and Chinese participants were slightly more accurate with whole-face images of their own ethnic group. To further investigate potential strategy differences, we repeated the perceptual similarity and categorization tasks with presentation of only the upper or lower half of each face. Again, the perceptual similarity of facial expressions was similar between Chinese and British participants for both the upper and lower face regions. However, participants were slightly better at categorizing facial expressions of their own ethnic group for the lower face regions, indicating that the way in which culture shapes the categorization of facial expressions is largely driven by differences in information decoding from this part of the face. (c) 2016 APA, all rights reserved.

  8. The measurement of perceived Emotional Intelligence for Spanish adolescents with social anxiety disorder symptoms

    Directory of Open Access Journals (Sweden)

    Mª del Mar Diaz-Castela

    2013-05-01

    Full Text Available Emotional Intelligence (EI) is a concept that has been discussed for decades in Psychology but has received very little empirical study until recently. With this growing interest, its accompanying concept, Perceived Emotional Intelligence (PEI), has also received more attention. It is due to this growing interest in PEI that this paper explores two important aspects of the PEI: the measurement of PEI and the implications PEI may have for adolescent anxiety disorder symptomology. This study explores a well-known questionnaire of PEI, namely the Trait Meta-Mood Scale questionnaire (TMMS). The Spanish shortened version of the Trait Meta-Mood Scale questionnaire (TMMS-24) and a series of well-known questionnaires of Social Anxiety Disorder symptomology were administered to 425 Spanish high-school adolescents. The results of this study corroborated that the TMMS-24 has good psychometric properties in adolescents, and that one of its three scales (Emotional Repair) appears to be involved in adolescent SAD symptomology.

  9. Emotional distress among LGBT youth: the influence of perceived discrimination based on sexual orientation.

    Science.gov (United States)

    Almeida, Joanna; Johnson, Renee M; Corliss, Heather L; Molnar, Beth E; Azrael, Deborah

    2009-08-01

    The authors evaluated emotional distress among 9th-12th grade students, and examined whether the association between being lesbian, gay, bisexual, and/or transgendered (i.e., "LGBT") and emotional distress was mediated by perceptions of having been treated badly or discriminated against because others thought they were gay or lesbian. Data come from a school-based survey in Boston, Massachusetts (n = 1,032); 10% were LGBT, 58% were female, and ages ranged from 13 to 19 years. About 45% were Black, 31% were Hispanic, and 14% were White. LGBT youth scored significantly higher on the scale of depressive symptomatology. They were also more likely than heterosexual, non-transgendered youth to report suicidal ideation (30% vs. 6%). Perceived discrimination accounted for increased depressive symptomatology among LGBT males and females, and accounted for an elevated risk of self-harm and suicidal ideation among LGBT males. Perceived discrimination is a likely contributor to emotional distress among LGBT youth.

  10. Emotional Face Identification in Youths with Primary Bipolar Disorder or Primary Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Seymour, Karen E.; Pescosolido, Matthew F.; Reidy, Brooke L.; Galvan, Thania; Kim, Kerri L.; Young, Matthew; Dickstein, Daniel P.

    2013-01-01

    Objective: Bipolar disorder (BD) and attention-deficit/hyperactivity disorder (ADHD) are often comorbid or confounded; therefore, we evaluated emotional face identification to better understand brain/behavior interactions in children and adolescents with either primary BD, primary ADHD, or typically developing controls (TDC). Method: Participants…

  11. Visual attractiveness is leaky: The asymmetrical relationship between face and hair

    Directory of Open Access Journals (Sweden)

    Chihiro eSaegusa

    2015-04-01

    Full Text Available Predicting personality is crucial when communicating with people. It has been revealed that the perceived attractiveness or beauty of the face is a cue. As shown in the well-known 'what is beautiful is good' stereotype, perceived attractiveness is often associated with a desirable personality. Although such research on attractiveness used mainly the face isolated from other body parts, the face is not always seen in isolation in the real world. Rather, it is surrounded by one’s hairstyle, and is perceived as a part of total presence. In human vision, perceptual organization/integration occurs mostly in a bottom-up, task-irrelevant fashion. This raises an intriguing possibility that a task-irrelevant stimulus that is perceptually integrated with a target may influence our affective evaluation. In such a case, there should be a mutual influence between attractiveness perception of the face and surrounding hair, since they are assumed to share strong and unique perceptual organization. In the current study, we examined the influence of a task-irrelevant stimulus on our attractiveness evaluation, using face and hair as stimuli. The results revealed asymmetrical influences in the evaluation of one while ignoring the other. When hair was task-irrelevant, it still affected attractiveness of the face, but only if the hair itself had never been evaluated by the same evaluator. On the other hand, the face affected the hair regardless of whether the face itself was evaluated before. This has intriguing implications for the asymmetry between face and hair, and perceptual integration between them in general. Together with data from a post-hoc questionnaire, it is suggested that both implicit non-selective and explicit selective processes contribute to attractiveness evaluation. The findings provide an understanding of attractiveness perception in real-life situations, as well as a new paradigm to reveal unknown implicit aspects of information integration for

  12. Perceived Family Support and Self-Esteem: The Mediational Role of Emotional Experience in Adults with Dyslexia

    Science.gov (United States)

    Nalavany, Blace A.; Carawan, Lena W.

    2012-01-01

    Although a growing body of literature shows that perceived family support (PFS) influences self-esteem in adults with dyslexia, little empirical attention has been given to the mechanisms through which this effect operates across early, middle, and late adulthood. The present study examined the mediational effect of emotional experience with…

  13. The relationship of parental overprotection, perceived vulnerability, and parenting stress to behavioral, emotional, and social adjustment in children with cancer.

    Science.gov (United States)

    Colletti, Christina J M; Wolfe-Christensen, Cortney; Carpentier, Melissa Y; Page, Melanie C; McNall-Knapp, René Y; Meyer, William H; Chaney, John M; Mullins, Larry L

    2008-08-01

    To examine the relationship of self-reported parental overprotection, perceived child vulnerability, and parenting stress to parent-reported behavioral, emotional, and social adjustment of children currently on treatment for cancer. Parents of 62 children (34 boys, 28 girls) currently on treatment for cancer were recruited from an outpatient pediatric cancer clinic. Children ranged in age from 2 to 12 years; age at diagnosis ranged from 1.33 to 11.83 years. Higher levels of parenting stress, but not parental overprotection or perceived child vulnerability, were associated with poorer behavioral and social adjustment. Higher levels of perceived child vulnerability and parenting stress, but not parental overprotection, were independently associated with poorer emotional adjustment. Specific parenting variables appear to be related to specific adjustment outcomes in children with cancer. Longitudinal follow-up of these children is necessary to determine the developmental trajectory of parent variables and long-term child outcomes.

  14. Subliminal presentation of emotionally negative vs positive primes increases the perceived beauty of target stimuli.

    Science.gov (United States)

    Era, Vanessa; Candidi, Matteo; Aglioti, Salvatore Maria

    2015-11-01

    Emotions have a profound influence on aesthetic experiences. Studies using affective priming procedures demonstrate, for example, that inducing a conscious negative emotional state biases the perception of abstract stimuli towards the sublime (Eskine et al. Emotion 12:1071-1074, 2012. doi: 10.1037/a0027200). Moreover, subliminal happy facial expressions have a positive impact on the aesthetic evaluation of abstract art (Flexas et al. PLoS ONE 8:e80154, 2013). Little is known about how emotion influences aesthetic perception of non-abstract, representational stimuli, especially those that are particularly relevant for social behaviour, like human bodies. Here, we explore whether the subliminal presentation of emotionally charged visual primes modulates the explicit subjective aesthetic judgment of body images. Using a forward/backward masking procedure, we subliminally presented positive and negative, arousal-matched, emotional or neutral primes and measured their effect on the explicit evaluation of perceived beauty (high vs low) and emotion (positive vs negative) evoked by abstract and body images. We found that negative primes increased subjective aesthetic evaluations of target bodies or abstract images in comparison with positive primes. No influence of primes on the emotional dimension of the targets was found, thus ruling out an unspecific arousal effect and strengthening the link between emotional valence and aesthetic appreciation. More specifically, the finding that subliminal negative primes increase beauty ratings compared to subliminal positive primes indicates a clear link between negative emotions and positive aesthetic evaluations, suggesting a possible link between negative emotion and the experience of the sublime in art. The study expands previous research by showing the effect of subliminal negative emotions on the subjective aesthetic evaluation not only of abstract but also of body images.

  15. Emotion and anxiety potentiate the way attention alters visual appearance.

    Science.gov (United States)

    Barbot, Antoine; Carrasco, Marisa

    2018-04-12

    The ability to swiftly detect and prioritize the processing of relevant information around us is critical for the way we interact with our environment. Selective attention is a key mechanism that serves this purpose, improving performance in numerous visual tasks. Reflexively attending to sudden information helps detect impending threat or danger, a possible reason why emotion modulates the way selective attention affects perception. For instance, the sudden appearance of a fearful face potentiates the effects of exogenous (involuntary, stimulus-driven) attention on performance. Internal states such as trait anxiety can also modulate the impact of attention on early visual processing. However, attention does not only improve performance; it also alters the way visual information appears to us, e.g. by enhancing perceived contrast. Here we show that emotion potentiates the effects of exogenous attention on both performance and perceived contrast. Moreover, we found that trait anxiety mediates these effects, with stronger influences of attention and emotion in anxious observers. Finally, changes in performance and appearance correlated with each other, likely reflecting common attentional modulations. Altogether, our findings show that emotion and anxiety interact with selective attention to truly alter how we see.

  16. Repetition Blindness for Faces: A Comparison of Face Identity, Expression, and Gender Judgments

    OpenAIRE

    Murphy, Karen; Ward, Zoe

    2017-01-01

    Repetition blindness (RB) refers to the impairment in reporting two identical targets within a rapid serial visual presentation stream. While numerous studies have demonstrated RB for words and pictures of objects, very few studies have examined RB for faces. This study extended this research by examining RB when the two faces were complete repeats (same emotion and identity), identity repeats (same individual, different emotion), and emotion repeats (different individual, same emotion) for id...

  17. Gaze Cueing by Pareidolia Faces

    Directory of Open Access Journals (Sweden)

    Kohske Takahashi

    2013-12-01

    Full Text Available Visual images that are not faces are sometimes perceived as faces (the pareidolia phenomenon). While the pareidolia phenomenon provides people with a strong impression that a face is present, it is unclear how deeply pareidolia faces are processed as faces. In the present study, we examined whether a shift in spatial attention would be produced by gaze cueing of face-like objects. A robust cueing effect was observed when the face-like objects were perceived as faces. The magnitude of the cueing effect was comparable between the face-like objects and a cartoon face. However, the cueing effect was eliminated when the observer did not perceive the objects as faces. These results demonstrated that pareidolia faces do more than give the impression of the presence of faces; indeed, they trigger an additional face-specific attentional process.

  18. Gaze cueing by pareidolia faces.

    Science.gov (United States)

    Takahashi, Kohske; Watanabe, Katsumi

    2013-01-01

    Visual images that are not faces are sometimes perceived as faces (the pareidolia phenomenon). While the pareidolia phenomenon provides people with a strong impression that a face is present, it is unclear how deeply pareidolia faces are processed as faces. In the present study, we examined whether a shift in spatial attention would be produced by gaze cueing of face-like objects. A robust cueing effect was observed when the face-like objects were perceived as faces. The magnitude of the cueing effect was comparable between the face-like objects and a cartoon face. However, the cueing effect was eliminated when the observer did not perceive the objects as faces. These results demonstrated that pareidolia faces do more than give the impression of the presence of faces; indeed, they trigger an additional face-specific attentional process.

  19. Bush v. Bin Laden: Effect of State Emotion on Perceived Threat is Mediated by Emotion Towards the Threat Agent

    Science.gov (United States)

    2009-07-01

    The authors conducted an experimental study to examine the effects of specific (fear and anger) and global emotional states on threat perception. Emotion towards the threat agent mediated the effect of global negative emotion on perceived threat. The authors discuss the implications of the findings for theories that postulate an effect of emotion on risk perception.

  20. Audio-Visual Integration Modifies Emotional Judgment in Music

    Directory of Open Access Journals (Sweden)

    Shen-Yuan Su

    2011-10-01

    Full Text Available The conventional view that perceived emotion in music is derived mainly from auditory signals has led to neglect of the contribution of visual image. In this study, we manipulated mode (major vs. minor) and examined the influence of a video image on emotional judgment in music. Melodies in either major or minor mode were controlled for tempo and rhythm and played to the participants. We found that Taiwanese participants, like Westerners, judged major melodies as expressing positive, and minor melodies negative, emotions. The major or minor melodies were then paired with video images of the singers, which were either emotionally congruent or incongruent with their modes. Results showed that participants perceived stronger positive or negative emotions with congruent audio-visual stimuli. Compared to listening to music alone, stronger emotions were perceived when an emotionally congruent video image was added and weaker emotions were perceived when an incongruent image was added. We therefore demonstrate that mode is important to perceive the emotional valence in music and that treating musical art as a purely auditory event might lose the enhanced emotional strength perceived in music, since going to a concert may lead to stronger perceived emotion than listening to the CD at home.

  1. Amygdala Hyperactivation During Face Emotion Processing in Unaffected Youth at Risk for Bipolar Disorder

    Science.gov (United States)

    Olsavsky, Aviva K.; Brotman, Melissa A.; Rutenberg, Julia G.; Muhrer, Eli J.; Deveney, Christen M.; Fromm, Stephen J.; Towbin, Kenneth; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Objective: Youth at familial risk for bipolar disorder (BD) show deficits in face emotion processing, but the neural correlates of these deficits have not been examined. This preliminary study tests the hypothesis that, relative to healthy comparison (HC) subjects, both BD subjects and youth at risk for BD (i.e., those with a first-degree BD…

  2. Intranasal Oxytocin Administration Dampens Amygdala Reactivity towards Emotional Faces in Male and Female PTSD Patients.

    Science.gov (United States)

    Koch, Saskia Bj; van Zuiden, Mirjam; Nawijn, Laura; Frijling, Jessie L; Veltman, Dick J; Olff, Miranda

    2016-05-01

    Post-traumatic stress disorder (PTSD) is a disabling psychiatric disorder. As a substantial proportion of PTSD patients respond poorly to currently available psychotherapies, pharmacological interventions boosting treatment response are needed. Because of its anxiolytic and pro-social properties, the neuropeptide oxytocin (OT) has been proposed as a promising strategy for treatment augmentation in PTSD. As a first step to investigate the therapeutic potential of OT in PTSD, we conducted a double-blind, placebo-controlled, cross-over functional MRI study examining OT administration effects (40 IU) on amygdala reactivity toward emotional faces in unmedicated male and female police officers with (n=37, 21 males) and without (n=40, 20 males) PTSD. Trauma-exposed controls were matched to PTSD patients based on age, sex, years of service and educational level. Under placebo, the expected valence-dependent amygdala reactivity (i.e., greater activity toward fearful-angry faces compared with happy-neutral faces) was absent in PTSD patients. OT administration dampened amygdala reactivity toward all emotional faces in male and female PTSD patients, but enhanced amygdala reactivity in healthy male and female trauma-exposed controls, independent of sex and stimulus valence. In PTSD patients, greater anxiety prior to scanning and greater amygdala reactivity during the placebo session were associated with greater reduction of amygdala reactivity after OT administration. Taken together, our results indicate presumably beneficial neurobiological effects of OT administration in male and female PTSD patients. Future studies should investigate OT administration in clinical settings to fully appreciate its therapeutic potential.

  3. Dissociation between Emotional Remapping of Fear and Disgust in Alexithymia.

    Directory of Open Access Journals (Sweden)

    Cristina Scarpazza

    Full Text Available There is growing evidence that individuals are able to understand others' emotions because they "embody" them, i.e., re-experience them by activating a representation of the observed emotion within their own body. One way to study emotion embodiment is provided by a multisensory stimulation paradigm called emotional visual remapping of touch (eVRT), in which the degree of embodiment/remapping of emotions is measured as enhanced detection of near-threshold tactile stimuli on one's own face while viewing different emotional facial expressions. Here, we measured remapping of fear and disgust in participants with low (LA) and high (HA) levels of alexithymia, a personality trait characterized by a difficulty in recognizing emotions. The results showed that fear is remapped in LA but not in HA participants, while disgust is remapped in HA but not in LA participants. To investigate the hypothesis that HA might exhibit increased responses to emotional stimuli producing heightened physical and visceral sensations, i.e., disgust, in a second experiment we investigated participants' interoceptive abilities and the link between interoception and emotional modulations of VRT. The results showed that participants' disgust modulations of VRT correlated with their ability to perceive bodily signals. We suggest that the emotional profile of HA individuals on the eVRT task could be related to their abnormal tendency to be focalized on their internal bodily signals, and to experience emotions in a "physical" way. Finally, we speculated that these results in HA could be due to an enhancement of insular activity during the perception of disgusted faces.

  4. Emotion perception across cultures: the role of cognitive mechanisms.

    Science.gov (United States)

    Engelmann, Jan B; Pogosyan, Marianna

    2013-01-01

    Despite consistently documented cultural differences in the perception of facial expressions of emotion, the role of culture in shaping cognitive mechanisms that are central to emotion perception has received relatively little attention in past research. We review recent developments in cross-cultural psychology that provide particular insights into the modulatory role of culture on cognitive mechanisms involved in interpretations of facial expressions of emotion through two distinct routes: display rules and cognitive styles. Investigations of emotion intensity perception have demonstrated that facial expressions with varying levels of intensity of positive affect are perceived and categorized differently across cultures. Specifically, recent findings indicating significant levels of differentiation between intensity levels of facial expressions among American participants, as well as deviations from clear categorization of high and low intensity expressions among Japanese and Russian participants, suggest that display rules shape mental representations of emotions, such as intensity levels of emotion prototypes. Furthermore, a series of recent studies using eye tracking as a proxy for overt attention during face perception have identified culture-specific cognitive styles, such as the propensity to attend to very specific features of the face. Together, these results suggest a cascade of cultural influences on cognitive mechanisms involved in interpretations of facial expressions of emotion, whereby cultures impart specific behavioral practices that shape the way individuals process information from the environment. These cultural influences lead to differences in cognitive styles due to culture-specific attentional biases and emotion prototypes, which partially account for the gradient of cultural agreements and disagreements obtained in past investigations of emotion perception.

  5. Perceived Emotional Intelligence as a Predictor of Depressive Symptoms after a One Year Follow-Up during Adolescence

    Science.gov (United States)

    Gomez-Baya, Diego; Mendoza, Ramon; Paino, Susana

    2016-01-01

    Research to date has identified various risk factors in the emergence of depressive disorders in adolescence. There are very few studies, however, which have analyzed the role of perceived emotional intelligence in depressive symptoms longitudinally during adolescence. This work aimed to analyze longitudinal relationships between perceived…

  6. Emotional intelligence, perceived stress and academic performance of Sri Lankan medical undergraduates

    Directory of Open Access Journals (Sweden)

    P. Ranasinghe

    2017-02-01

    Full Text Available Abstract Background Previous research has shown that higher Emotional Intelligence (EI) is associated with better academic and work performance. The present study intended to explore the relationship between EI, perceived stress and academic performance and associated factors among medical undergraduates. Methods This descriptive cross-sectional research study was conducted among 471 medical undergraduates of 2nd, 4th and final years of the University of Colombo, Sri Lanka. Students were rated on the self-administered Perceived Stress Scale (PSS) and the Schutte Self-Report Emotional Intelligence Test (SEIT). Examination results were used as the dichotomous outcome variable in a logistic regression analysis. Results Females had higher mean EI scores (p = 0.014). A positive correlation was found between the EI score and the number of extracurricular activities (r = 0.121, p = 0.008). Those who were satisfied regarding their choice to study medicine, and who were planning to do postgraduate studies, had significantly higher EI scores and lower PSS scores (p < 0.001). Among final year undergraduates, those who passed the Clinical Sciences examination in the first attempt had a higher EI score (p < 0.001) and a lower PSS score (p < 0.05). Results of the binary logistic-regression analysis in the entire study population indicated that female gender (OR: 1.98) and being satisfied regarding the choice of the medical undergraduate programme (OR: 3.69) were significantly associated with passing the examinations. However, PSS Score and engagement in extracurricular activities were not associated with ‘Examination Results’. Conclusions Higher EI was associated with better academic performance amongst final year medical students. In addition, a higher EI was observed in those who had a higher level of self-satisfaction. Self-perceived stress was lower in those with a higher EI. Enhancing EI might help to improve academic performance among final year medical…

  7. Emotional intelligence, perceived stress and academic performance of Sri Lankan medical undergraduates.

    Science.gov (United States)

    Ranasinghe, P; Wathurapatha, W S; Mathangasinghe, Y; Ponnamperuma, G

    2017-02-20

    Previous research has shown that higher Emotional Intelligence (EI) is associated with better academic and work performance. The present study intended to explore the relationship between EI, perceived stress and academic performance and associated factors among medical undergraduates. This descriptive cross-sectional research study was conducted among 471 medical undergraduates of 2nd, 4th and final years of the University of Colombo, Sri Lanka. Students were rated on the self-administered Perceived Stress Scale (PSS) and the Schutte Self-Report Emotional Intelligence Test (SEIT). Examination results were used as the dichotomous outcome variable in a logistic regression analysis. Females had higher mean EI scores (p = 0.014). A positive correlation was found between the EI score and the number of extracurricular activities (r = 0.121, p = 0.008). Those who were satisfied regarding their choice to study medicine, and who were planning to do postgraduate studies, had significantly higher EI scores and lower PSS scores (p < 0.001). Among final year undergraduates, those who passed the Clinical Sciences examination in the first attempt had a higher EI score (p < 0.001) and a lower PSS score (p < 0.05). Results of the binary logistic-regression analysis in the entire study population indicated that female gender (OR: 1.98) and being satisfied regarding their choice of the medical undergraduate programme (OR: 3.69) were significantly associated with passing the examinations. However, PSS Score and engagement in extracurricular activities were not associated with 'Examination Results'. Higher EI was associated with better academic performance amongst final year medical students. In addition, a higher EI was observed in those who had a higher level of self-satisfaction. Self-perceived stress was lower in those with a higher EI. Enhancing EI might help to improve academic performance among final year medical students and also help to reduce stress levels and cultivate…
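
    As a concrete illustration of the analysis described above (a dichotomous examination outcome modelled by logistic regression, with effects reported as odds ratios), the following sketch fits such a model in Python on simulated data; the variable names and values are illustrative assumptions, not the study's dataset.

        # Hedged sketch: logistic regression of a pass/fail examination outcome on gender,
        # programme satisfaction, EI and PSS scores, with odds ratios from the fitted coefficients.
        # All data below are simulated placeholders.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 471
        df = pd.DataFrame({
            "passed":    rng.integers(0, 2, n),      # 1 = passed the examination
            "female":    rng.integers(0, 2, n),      # 1 = female
            "satisfied": rng.integers(0, 2, n),      # 1 = satisfied with choice of medicine
            "ei_score":  rng.normal(120, 15, n),     # emotional intelligence total score
            "pss_score": rng.normal(20, 6, n),       # perceived stress total score
        })

        model = smf.logit("passed ~ female + satisfied + ei_score + pss_score", data=df).fit(disp=False)
        odds_ratios = np.exp(model.params)           # exponentiated coefficients = odds ratios
        print(odds_ratios.round(2))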

  8. Effect of emotion and articulation of speech on the Uncanny Valley in virtual characters

    DEFF Research Database (Denmark)

    Tinwell, Angela; Grimshaw, Mark Nicholas; Abdel Nabi, Debbie

    2011-01-01

    This paper presents a study of how exaggerated facial expression in the lower face region affects perception of emotion and the Uncanny Valley phenomenon in realistic, human-like, virtual characters. Characters communicated the six basic emotions, anger, disgust, fear, sadness and surprise… with normal and exaggerated mouth movements. Measures were taken for perceived familiarity and human-likeness. The results showed that an increased intensity of articulation significantly reduced the uncanny for anger, yet increased perception of the uncanny for characters expressing happiness…

  9. How stable is activation in the amygdala and prefrontal cortex in adolescence? A study of emotional face processing across three measurements

    NARCIS (Netherlands)

    van den Bulk, B.G.; Koolschijn, P.C.M.P.; Meens, P.H.F.; van Lang, N.D.J.; van der Wee, N.J.A.; Rombouts, S.A.R.B.; Vermeiren, R.R.J.M.; Crone, E.A.

    2013-01-01

    Prior developmental functional magnetic resonance imaging (fMRI) studies have demonstrated elevated activation patterns in the amygdala and prefrontal cortex (PFC) in response to viewing emotional faces. As adolescence is a time of substantial variability in mood and emotional responsiveness, the…

  10. The impact of face skin tone on perceived facial attractiveness: A study realized with an innovative methodology.

    Science.gov (United States)

    Vera Cruz, Germano

    2017-12-19

    This study aimed to assess the impact of target faces' skin tone and perceivers' skin tone on the participants' attractiveness judgment regarding a symmetrical representative range of target faces as stimuli. Presented with a set of facial features, 240 Mozambican adults rated their attractiveness along a continuous scale. ANOVA and Chi-square were used to analyze the data. The results revealed that the skin tone of the target faces had an impact on the participants' attractiveness judgment. Overall, participants preferred light-skinned faces over dark-skinned ones. This finding is not only consistent with previous results on skin tone preferences, but it is even more powerful because it demonstrates that the light skin tone preference occurs regardless of the symmetry and baseline attractiveness of the stimuli.

  11. A Model of the Perception of Facial Expressions of Emotion by Humans: Research Overview and Perspectives.

    Science.gov (United States)

    Martinez, Aleix; Du, Shichuan

    2012-05-01

    In cognitive science and neuroscience, there have been two leading models describing how humans perceive and classify facial expressions of emotion: the continuous and the categorical model. The continuous model defines each facial expression of emotion as a feature vector in a face space. This model explains, for example, how expressions of emotion can be seen at different intensities. In contrast, the categorical model consists of C classifiers, each tuned to a specific emotion category. This model explains, among other findings, why the images in a morphing sequence between a happy and a surprise face are perceived as either happy or surprise but not something in between. While the continuous model has a more difficult time justifying this latter finding, the categorical model is not as good when it comes to explaining how expressions are recognized at different intensities or modes. Most importantly, both models have problems explaining how one can recognize combinations of emotion categories such as happily surprised versus angrily surprised versus surprise. To resolve these issues, in the past several years, we have worked on a revised model that justifies the results reported in the cognitive science and neuroscience literature. This model consists of C distinct continuous spaces. Multiple (compound) emotion categories can be recognized by linearly combining these C face spaces. The dimensions of these spaces are shown to be mostly configural. According to this model, the major task for the classification of facial expressions of emotion is precise, detailed detection of facial landmarks rather than recognition. We provide an overview of the literature justifying the model, show how the resulting model can be employed to build algorithms for the recognition of facial expression of emotion, and propose research directions for machine learning and computer vision researchers to keep pushing the state of the art in these areas. We also discuss how the model can…
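
    To make the revised model described above more tangible, the following toy sketch treats each basic emotion category as a vector in its own continuous space of (hypothetical) configural features and forms compound categories such as "happily surprised" by linear combination. The feature vectors and weights are invented for illustration and are not taken from the paper.

        # Toy sketch of the "C distinct continuous spaces" idea: each basic emotion is a vector
        # of configural features, and a compound emotion is a weighted linear combination of them.
        import numpy as np

        # Hypothetical 6-dimensional configural feature vectors, one per basic category.
        basic_emotions = {
            "happy":    np.array([0.8, 0.1, 0.0, 0.6, 0.0, 0.2]),
            "surprise": np.array([0.1, 0.9, 0.7, 0.0, 0.5, 0.0]),
            "angry":    np.array([0.0, 0.2, 0.1, 0.1, 0.9, 0.7]),
        }

        def compound(weights):
            """Linearly combine basic-emotion spaces into a compound-emotion representation."""
            return sum(w * basic_emotions[name] for name, w in weights.items())

        happily_surprised = compound({"happy": 0.5, "surprise": 0.5})
        angrily_surprised = compound({"angry": 0.5, "surprise": 0.5})
        print(happily_surprised, angrily_surprised, sep="\n")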

  12. Processing Distracting Non-face Emotional Images: No Evidence of an Age-Related Positivity Effect.

    Science.gov (United States)

    Madill, Mark; Murray, Janice E

    2017-01-01

    Cognitive aging may be accompanied by increased prioritization of social and emotional goals that enhance positive experiences and emotional states. The socioemotional selectivity theory suggests this may be achieved by giving preference to positive information and avoiding or suppressing negative information. Although there is some evidence of a positivity bias in controlled attention tasks, it remains unclear whether a positivity bias extends to the processing of affective stimuli presented outside focused attention. In two experiments, we investigated age-related differences in the effects of to-be-ignored non-face affective images on target processing. In Experiment 1, 27 older (64-90 years) and 25 young adults (19-29 years) made speeded valence judgments about centrally presented positive or negative target images taken from the International Affective Picture System. To-be-ignored distractor images were presented above and below the target image and were either positive, negative, or neutral in valence. The distractors were considered task relevant because they shared emotional characteristics with the target stimuli. Both older and young adults responded more slowly to targets when distractor valence was incongruent with target valence relative to when distractors were neutral. Older adults responded faster to positive than to negative targets but did not show increased interference effects from positive distractors. In Experiment 2, affective distractors were task irrelevant as the target was a three-digit array and did not share emotional characteristics with the distractors. Twenty-six older (63-84 years) and 30 young adults (18-30 years) gave speeded responses on a digit disparity task while ignoring the affective distractors positioned in the periphery. Task performance in either age group was not influenced by the task-irrelevant affective images. In keeping with the socioemotional selectivity theory, these findings suggest that older adults preferentially…

  13. Emotional Intelligence and Social-Emotional Learning: An Overview

    Science.gov (United States)

    Basu, Anamitra; Mermillod, Martial

    2011-01-01

    The term "EI (emotional intelligence)" was first used in 1990 by Salovey and Mayer. EI involves: (1) the ability to perceive accurately, appraise and express emotion; (2) the ability to access and/or generate feelings when they facilitate thought; (3) the ability to understand emotion and emotional knowledge; and (4) the ability to regulate…

  14. Inferior Frontal Gyrus Activation Underlies the Perception of Emotions, While Precuneus Activation Underlies the Feeling of Emotions during Music Listening

    Science.gov (United States)

    Tabei, Ken-ichi

    2015-01-01

    While music triggers many physiological and psychological reactions, the underlying neural basis of perceived and experienced emotions during music listening remains poorly understood. Therefore, using functional magnetic resonance imaging (fMRI), I conducted a comparative study of the different brain areas involved in perceiving and feeling emotions during music listening. I measured fMRI signals while participants assessed the emotional expression of music (perceived emotion) and their emotional responses to music (felt emotion). I found that cortical areas including the prefrontal, auditory, cingulate, and posterior parietal cortices were consistently activated by the perceived and felt emotional tasks. Moreover, activity in the inferior frontal gyrus increased more during the perceived emotion task than during a passive listening task. In addition, the precuneus showed greater activity during the felt emotion task than during a passive listening task. The findings reveal that the bilateral inferior frontal gyri and the precuneus are important areas for the perception of the emotional content of music as well as for the emotional response evoked in the listener. Furthermore, I propose that the precuneus, a brain region associated with self-representation, might be involved in assessing emotional responses. PMID:26504353

  15. Inferior Frontal Gyrus Activation Underlies the Perception of Emotions, While Precuneus Activation Underlies the Feeling of Emotions during Music Listening.

    Science.gov (United States)

    Tabei, Ken-ichi

    2015-01-01

    While music triggers many physiological and psychological reactions, the underlying neural basis of perceived and experienced emotions during music listening remains poorly understood. Therefore, using functional magnetic resonance imaging (fMRI), I conducted a comparative study of the different brain areas involved in perceiving and feeling emotions during music listening. I measured fMRI signals while participants assessed the emotional expression of music (perceived emotion) and their emotional responses to music (felt emotion). I found that cortical areas including the prefrontal, auditory, cingulate, and posterior parietal cortices were consistently activated by the perceived and felt emotional tasks. Moreover, activity in the inferior frontal gyrus increased more during the perceived emotion task than during a passive listening task. In addition, the precuneus showed greater activity during the felt emotion task than during a passive listening task. The findings reveal that the bilateral inferior frontal gyri and the precuneus are important areas for the perception of the emotional content of music as well as for the emotional response evoked in the listener. Furthermore, I propose that the precuneus, a brain region associated with self-representation, might be involved in assessing emotional responses.

  16. Momentary Desire for Sexual Intercourse and Momentary Emotional Intimacy Associated With Perceived Relationship Quality and Physical Intimacy in Heterosexual Emerging Adult Couples.

    Science.gov (United States)

    Shrier, Lydia A; Blood, Emily A

    2015-11-25

    Sexual desire and emotional intimacy are central to relationships, yet little is known about how these feelings vary within and between partners or relate to dyad functioning. We explored magnitude and stability of momentary sexual desire and emotional intimacy in relation to quality and functioning of heterosexual relationships. After reporting perceived relationship quality and physical intimacy enjoyment, members of 18 emerging adult heterosexual couples reported momentary partner-specific sexual desire and emotional intimacy several times a day for two weeks (2,224 reports). Mean and mean squared successive difference (MSSD) characterized magnitude and stability, respectively, of the momentary states. Regression models of relationship outcomes examined influence of the male versus female partner having greater or more stable desire and intimacy. Sexual desire and emotional intimacy magnitude and stability were associated with relationship quality and physical intimacy enjoyment differently for men versus women. Gender-specific differences between partners also predicted relationship outcomes. Men particularly perceived higher relationship quality and enjoyed physical intimacy more when they had higher and more stable sexual desire and their female partners had more stable emotional intimacy. Partner differences in momentary sexual desire and emotional intimacy may contribute to understanding quality and functioning of heterosexual relationships.
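
    The two summary statistics named above can be made explicit: magnitude is the within-person mean of the momentary reports, and stability is indexed (inversely) by the mean squared successive difference, MSSD = mean of (x(t+1) - x(t))^2, with larger values indicating less stable states. A minimal sketch with hypothetical ratings:

        # Magnitude (mean) and stability (mean squared successive difference, MSSD) of a series
        # of momentary reports; a higher MSSD means less stable momentary states.
        import numpy as np

        def mean_and_mssd(reports):
            x = np.asarray(reports, dtype=float)
            mssd = np.mean(np.diff(x) ** 2)   # average squared change between successive reports
            return x.mean(), mssd

        # Hypothetical momentary sexual-desire ratings from one participant.
        desire = [3, 4, 4, 2, 5, 5, 3, 4]
        print(mean_and_mssd(desire))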

  17. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    OpenAIRE

    Rossana eActis-Grosso; Rossana eActis-Grosso; Francesco eBossi; Paola eRicciardelli; Paola eRicciardelli

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or c...

  18. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits

    OpenAIRE

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or ...

  19. Effects of aging on identifying emotions conveyed by point-light walkers.

    Science.gov (United States)

    Spencer, Justine M Y; Sekuler, Allison B; Bennett, Patrick J; Giese, Martin A; Pilz, Karin S

    2016-02-01

    The visual system is able to recognize human motion simply from point lights attached to the major joints of an actor. Moreover, it has been shown that younger adults are able to recognize emotions from such dynamic point-light displays. Previous research has suggested that the ability to perceive emotional stimuli changes with age. For example, it has been shown that older adults are impaired in recognizing emotional expressions from static faces. In addition, it has been shown that older adults have difficulties perceiving visual motion, which might be helpful to recognize emotions from point-light displays. In the current study, 4 experiments were completed in which older and younger adults were asked to identify 3 emotions (happy, sad, and angry) displayed by 4 types of point-light walkers: upright and inverted normal walkers, which contained both local motion and global form information; upright scrambled walkers, which contained only local motion information; and upright random-position walkers, which contained only global form information. Overall, emotion discrimination accuracy was lower in older participants compared with younger participants, specifically when identifying sad and angry point-light walkers. In addition, observers in both age groups were able to recognize emotions from all types of point-light walkers, suggesting that both older and younger adults are able to recognize emotions from point-light walkers on the basis of local motion or global form. (c) 2016 APA, all rights reserved.

  20. Trustworthy-looking face meets brown eyes.

    Directory of Open Access Journals (Sweden)

    Karel Kleisner

    Full Text Available We tested whether eye color influences perception of trustworthiness. Facial photographs of 40 female and 40 male students were rated for perceived trustworthiness. Eye color had a significant effect, the brown-eyed faces being perceived as more trustworthy than the blue-eyed ones. Geometric morphometrics, however, revealed significant correlations between eye color and face shape. Thus, face shape likewise had a significant effect on perceived trustworthiness but only for male faces, the effect for female faces not being significant. To determine whether perception of trustworthiness was being influenced primarily by eye color or by face shape, we recolored the eyes on the same male facial photos and repeated the test procedure. Eye color now had no effect on perceived trustworthiness. We concluded that although the brown-eyed faces were perceived as more trustworthy than the blue-eyed ones, it was not brown eye color per se that caused the stronger perception of trustworthiness but rather the facial features associated with brown eyes.

  1. Seeing mixed emotions: The specificity of emotion perception from static and dynamic facial expressions across cultures

    NARCIS (Netherlands)

    Fang, X.; Sauter, D.A.; van Kleef, G.A.

    2018-01-01

    Although perceivers often agree about the primary emotion that is conveyed by a particular expression, observers may concurrently perceive several additional emotions from a given facial expression. In the present research, we compared the perception of two types of nonintended emotions in Chinese…

  2. Seeing Mixed Emotions: The Specificity of Emotion Perception From Static and Dynamic Facial Expressions Across Cultures.

    Science.gov (United States)

    Fang, Xia; Sauter, Disa A; Van Kleef, Gerben A

    2018-01-01

    Although perceivers often agree about the primary emotion that is conveyed by a particular expression, observers may concurrently perceive several additional emotions from a given facial expression. In the present research, we compared the perception of two types of nonintended emotions in Chinese and Dutch observers viewing facial expressions: emotions which were morphologically similar to the intended emotion and emotions which were morphologically dissimilar to the intended emotion. Findings were consistent across two studies and showed that (a) morphologically similar emotions were endorsed to a greater extent than dissimilar emotions and (b) Chinese observers endorsed nonintended emotions more than did Dutch observers. Furthermore, the difference between Chinese and Dutch observers was more pronounced for the endorsement of morphologically similar emotions than of dissimilar emotions. We also obtained consistent evidence that Dutch observers endorsed nonintended emotions that were congruent with the preceding expressions to a greater degree. These findings suggest that culture and morphological similarity both influence the extent to which perceivers see several emotions in a facial expression.

  3. Seeing Mixed Emotions: The Specificity of Emotion Perception From Static and Dynamic Facial Expressions Across Cultures

    Science.gov (United States)

    Fang, Xia; Sauter, Disa A.; Van Kleef, Gerben A.

    2017-01-01

    Although perceivers often agree about the primary emotion that is conveyed by a particular expression, observers may concurrently perceive several additional emotions from a given facial expression. In the present research, we compared the perception of two types of nonintended emotions in Chinese and Dutch observers viewing facial expressions: emotions which were morphologically similar to the intended emotion and emotions which were morphologically dissimilar to the intended emotion. Findings were consistent across two studies and showed that (a) morphologically similar emotions were endorsed to a greater extent than dissimilar emotions and (b) Chinese observers endorsed nonintended emotions more than did Dutch observers. Furthermore, the difference between Chinese and Dutch observers was more pronounced for the endorsement of morphologically similar emotions than of dissimilar emotions. We also obtained consistent evidence that Dutch observers endorsed nonintended emotions that were congruent with the preceding expressions to a greater degree. These findings suggest that culture and morphological similarity both influence the extent to which perceivers see several emotions in a facial expression. PMID:29386689

  4. Personality, Attentional Biases towards Emotional Faces and Symptoms of Mental Disorders in an Adolescent Sample.

    Science.gov (United States)

    O'Leary-Barrett, Maeve; Pihl, Robert O; Artiges, Eric; Banaschewski, Tobias; Bokde, Arun L W; Büchel, Christian; Flor, Herta; Frouin, Vincent; Garavan, Hugh; Heinz, Andreas; Ittermann, Bernd; Mann, Karl; Paillère-Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Poustka, Luise; Rietschel, Marcella; Robbins, Trevor W; Smolka, Michael N; Ströhle, Andreas; Schumann, Gunter; Conrod, Patricia J

    2015-01-01

    To investigate the role of personality factors and attentional biases towards emotional faces in establishing concurrent and prospective risk for mental disorder diagnosis in adolescence. Data were obtained as part of the IMAGEN study, conducted across 8 European sites, with a community sample of 2257 adolescents. At 14 years, participants completed an emotional variant of the dot-probe task, as well as two personality measures, namely the Substance Use Risk Profile Scale and the revised NEO Personality Inventory. At 14 and 16 years, participants and their parents were interviewed to determine symptoms of mental disorders. Personality traits were general and specific risk indicators for mental disorders at 14 years. Increased specificity was obtained when investigating the likelihood of mental disorders over a 2-year period, with the Substance Use Risk Profile Scale showing incremental validity over the NEO Personality Inventory. Attentional biases to emotional faces did not characterise or predict mental disorders examined in the current sample. Personality traits can indicate concurrent and prospective risk for mental disorders in a community youth sample, and identify at-risk youth beyond the impact of baseline symptoms. This study does not support the hypothesis that attentional biases mediate the relationship between personality and psychopathology in a community sample. Task and sample characteristics that contribute to differing results among studies are discussed.

  5. Gaze Cueing by Pareidolia Faces

    OpenAIRE

    Kohske Takahashi; Katsumi Watanabe

    2013-01-01

    Visual images that are not faces are sometimes perceived as faces (the pareidolia phenomenon). While the pareidolia phenomenon provides people with a strong impression that a face is present, it is unclear how deeply pareidolia faces are processed as faces. In the present study, we examined whether a shift in spatial attention would be produced by gaze cueing of face-like objects. A robust cueing effect was observed when the face-like objects were perceived as faces. The magnitude of the cuei...

  6. Double attention bias for positive and negative emotional faces in clinical depression: evidence from an eye-tracking study.

    Science.gov (United States)

    Duque, Almudena; Vázquez, Carmelo

    2015-03-01

    According to cognitive models, attentional biases in depression play key roles in the onset and subsequent maintenance of the disorder. The present study examines the processing of emotional facial expressions (happy, angry, and sad) in depressed and non-depressed adults. Sixteen unmedicated patients with Major Depressive Disorder (MDD) and 34 never-depressed controls (ND) completed an eye-tracking task to assess different components of visual attention (orienting attention and maintenance of attention) in the processing of emotional faces. Compared to ND, participants with MDD showed a negative attentional bias in attentional maintenance indices (i.e. first fixation duration and total fixation time) for sad faces. This attentional bias was positively associated with the severity of depressive symptoms. Furthermore, the MDD group spent a marginally less amount of time viewing happy faces compared with the ND group. No differences were found between the groups with respect to angry faces and orienting attention indices. The current study is limited by its cross-sectional design. These results support the notion that attentional biases in depression are specific to depression-related information and that they operate in later stages in the deployment of attention. Copyright © 2014 Elsevier Ltd. All rights reserved.
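
    The attentional-maintenance indices mentioned above (first fixation duration and total fixation time on each face type) can be derived from a fixation log; the sketch below assumes a simplified, hypothetical log with one row per fixation and an area-of-interest label, which is not the study's actual data format.

        # First fixation duration and total fixation time per face type, averaged over trials.
        import pandas as pd

        fixations = pd.DataFrame({
            "trial":    [1, 1, 1, 1, 2, 2, 2],
            "aoi":      ["sad", "happy", "sad", "angry", "happy", "sad", "happy"],
            "duration": [320, 180, 250, 150, 400, 210, 260],   # fixation durations in ms
        })

        # Duration of the first fixation on each face type within a trial, averaged over trials.
        first_fix = (fixations.groupby(["trial", "aoi"], sort=False)["duration"].first()
                     .groupby(level="aoi").mean())

        # Summed fixation time on each face type within a trial, averaged over trials.
        total_time = (fixations.groupby(["trial", "aoi"])["duration"].sum()
                      .groupby(level="aoi").mean())

        print(first_fix, total_time, sep="\n")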

  7. Variations in the serotonin-transporter gene are associated with attention bias patterns to positive and negative emotion faces.

    Science.gov (United States)

    Pérez-Edgar, Koraly; Bar-Haim, Yair; McDermott, Jennifer Martin; Gorodetsky, Elena; Hodgkinson, Colin A; Goldman, David; Ernst, Monique; Pine, Daniel S; Fox, Nathan A

    2010-03-01

    Both attention biases to threat and a serotonin-transporter gene polymorphism (5-HTTLPR) have been linked to heightened neural activation to threat and the emergence of anxiety. The short allele of 5-HTTLPR may act via its effect on neurotransmitter availability, while attention biases shape broad patterns of cognitive processing. We examined individual differences in attention bias to emotion faces as a function of 5-HTTLPR genotype. Adolescents (N=117) were classified for presumed SLC6A4 expression based on 5-HTTLPR: low (SS, SL(G), or L(G)L(G)), intermediate (SL(A) or L(A)L(G)), or high (L(A)L(A)). Participants completed the dot-probe task, measuring attention biases toward or away from angry and happy faces. Biases for angry faces increased with the genotype-predicted neurotransmission levels (low>intermediate>high). The reverse pattern was evident for happy faces. The data indicate a linear relation between 5-HTTLPR allelic status and attention biases to emotion, demonstrating a genetic mechanism for biased attention using ecologically valid stimuli that target socioemotional adaptation. Copyright 2009 Elsevier B.V. All rights reserved.
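
    For reference, a dot-probe attention-bias score is conventionally computed as the mean reaction time when the probe replaces the neutral face minus the mean reaction time when it replaces the emotional face, so that positive values indicate attention toward the emotion. A minimal sketch with hypothetical reaction times (not the study's data):

        # Conventional dot-probe bias score: RT(probe at neutral location) - RT(probe at emotional location).
        import numpy as np

        def bias_score(rt_probe_at_neutral, rt_probe_at_emotional):
            return np.mean(rt_probe_at_neutral) - np.mean(rt_probe_at_emotional)

        rt_neutral_location = [512, 498, 530, 505]   # ms, probe replaced the neutral face
        rt_angry_location   = [478, 490, 485, 495]   # ms, probe replaced the angry face
        print(bias_score(rt_neutral_location, rt_angry_location))   # > 0: bias toward angry faces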

  8. Correlation among perceived stress, emotional intelligence, and burnout of resident doctors in a medical college of West Bengal: A mediation analysis.

    Science.gov (United States)

    Mitra, Satabdi; Sarkar, Aditya Prasad; Haldar, Dibakar; Saren, Asit Baren; Lo, Sourav; Sarkar, Gautam Narayan

    2018-01-01

    Perceived stress and burnout are by-products of the powerless responsibility imposed on resident doctors. Emotional intelligence (EI) works as an adapting and coping tool. The objective of this study is to find out the role of work-related perceived stress in burnout and the influence of EI on it. A descriptive cross-sectional study was conducted from February to April 2016 among 63 resident doctors of different departments of Bankura Sammilani Medical College and Hospital. Data were collected through a self-administered questionnaire for background characteristics and work-related variables. The Cohen Perceived Stress Scale, the Trait EI questionnaire, and the Shirom-Melamed Burnout Questionnaire were applied for measuring perceived stress, EI, and burnout, respectively. Statistical analysis was done with SPSS version 22.0, and for mediation analysis Andrew F. Hayes' SPSS macro was adopted. Nonparametric bootstrapping was done assuming a small sample. Of the complete responses, 67%, 22.9%, and 9.8% were from clinical, paraclinical, and preclinical specialties, respectively. Burnout had a significant positive correlation with perceived stress, a negative correlation with EI-well-being, and a positive correlation with EI-self-control and EI-sociability. The physical fatigue factor of burnout had a significant positive correlation with EI-emotionality. Perceived stress had a negative correlation with EI-well-being. On mediation analysis, with EI assumed as the mediator, the total, direct, and indirect effects of perceived stress on burnout were significant, indicating that EI partially mediates the effect of perceived stress on burnout among resident doctors. This necessitates more attention by decision-makers toward this burning problem for the sake of care of caregivers.
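
    The mediation logic described above (perceived stress -> burnout, with EI as the mediator, tested by nonparametric bootstrapping of the indirect effect) can be sketched as follows. This is an illustrative stand-in for the SPSS macro, using simulated data and ordinary least-squares regressions, not the study's analysis code.

        # Nonparametric bootstrap of the indirect (mediated) effect a*b in a simple mediation model.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 63
        stress = rng.normal(0, 1, n)
        ei = -0.4 * stress + rng.normal(0, 1, n)              # simulated mediator
        burnout = 0.5 * stress - 0.3 * ei + rng.normal(0, 1, n)

        def indirect_effect(x, m, y):
            a = np.polyfit(x, m, 1)[0]                        # path a: mediator regressed on predictor
            b = np.linalg.lstsq(np.column_stack([x, m, np.ones_like(x)]), y, rcond=None)[0][1]  # path b
            return a * b

        boot = []
        for _ in range(2000):
            idx = rng.integers(0, n, n)                       # resample participants with replacement
            boot.append(indirect_effect(stress[idx], ei[idx], burnout[idx]))
        ci = np.percentile(boot, [2.5, 97.5])                 # 95% CI excluding 0 suggests mediation
        print(indirect_effect(stress, ei, burnout), ci)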

  9. Differential Interactions between Identity and Emotional Expression in Own and Other-Race Faces: Effects of Familiarity Revealed through Redundancy Gains

    Science.gov (United States)

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

    We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and…

  10. Characterization and recognition of mixed emotional expressions in thermal face image

    Science.gov (United States)

    Saha, Priya; Bhattacharjee, Debotosh; De, Barin K.; Nasipuri, Mita

    2016-05-01

    Facial expressions in infrared imaging have been introduced to solve the problem of illumination, which is an integral constituent of visual imagery. The paper investigates facial skin temperature distribution for mixed thermal facial expressions in our created face database, in which six are basic expressions and the remaining 12 are mixtures of those basic expressions. Temperature analysis has been performed on three facial regions of interest (ROIs): periorbital, supraorbital and mouth. Temperature variability of the ROIs in different expressions has been measured using statistical parameters. The temperature variation measurements in the ROIs of a particular expression form a vector, which is later used in the recognition of mixed facial expressions. Investigations show that facial features in mixed facial expressions can be characterized by positive-emotion-induced facial features and negative-emotion-induced facial features. The supraorbital region is useful for differentiating basic expressions from mixed expressions. Analysis and interpretation of mixed expressions have been conducted with the help of box-and-whisker plots. A facial region containing a mixture of two expressions is generally less temperature-inducing than the corresponding facial region containing basic expressions.
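
    The recognition scheme described above (per-ROI temperature statistics concatenated into an expression-specific vector) might be sketched roughly as follows; the simulated temperature samples, the choice of statistics, and the nearest-neighbour matching are illustrative assumptions rather than the paper's exact pipeline.

        # Per-ROI temperature statistics concatenated into a feature vector, matched to stored
        # expression templates by nearest neighbour. All temperatures are simulated.
        import numpy as np

        ROIS = ["periorbital", "supraorbital", "mouth"]

        def feature_vector(roi_temps):
            """roi_temps: dict mapping ROI name -> array of pixel temperatures (deg C)."""
            feats = []
            for roi in ROIS:
                t = np.asarray(roi_temps[roi], dtype=float)
                feats += [t.mean(), t.std()]                  # simple variability statistics per ROI
            return np.array(feats)

        rng = np.random.default_rng(2)
        templates = {                                         # hypothetical stored expression templates
            "happy": feature_vector({r: rng.normal(34.0, 0.3, 50) for r in ROIS}),
            "happy+surprise": feature_vector({r: rng.normal(34.4, 0.5, 50) for r in ROIS}),
        }
        probe = feature_vector({r: rng.normal(34.3, 0.5, 50) for r in ROIS})
        label = min(templates, key=lambda k: np.linalg.norm(templates[k] - probe))
        print(label)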

  11. Proactive and reactive control depends on emotional valence: a Stroop study with emotional expressions and words.

    Science.gov (United States)

    Kar, Bhoomika Rastogi; Srinivasan, Narayanan; Nehabala, Yagyima; Nigam, Richa

    2018-03-01

    We examined proactive and reactive control effects in the context of task-relevant happy, sad, and angry facial expressions on a face-word Stroop task. Participants identified the emotion expressed by a face that contained a congruent or incongruent emotional word (happy/sad/angry). Proactive control effects were measured in terms of the reduction in Stroop interference (difference between incongruent and congruent trials) as a function of previous trial emotion and previous trial congruence. Reactive control effects were measured in terms of the reduction in Stroop interference as a function of current trial emotion and previous trial congruence. Negative emotions on the previous trial exerted greater influence on proactive control than the positive emotion did. Sad faces in the previous trial resulted in a greater reduction in Stroop interference for happy faces in the current trial. However, angry faces in the current trial showed stronger adaptation effects than happy faces. Thus, both proactive and reactive control mechanisms depend on the emotional valence of task-relevant stimuli.
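
    The measures described above reduce to simple contrasts: Stroop interference is the mean reaction time on incongruent minus congruent trials, and the sequential (proactive) effect is that interference computed separately by previous-trial congruence. A sketch with a hypothetical trial table (not the study's data):

        # Stroop interference by previous-trial congruence (congruency sequence effect).
        import pandas as pd

        trials = pd.DataFrame({
            "rt":        [640, 650, 720, 700, 655, 690, 645, 660, 710],
            "congruent": [True, True, False, False, True, False, True, True, False],
        })
        trials["prev_congruent"] = trials["congruent"].shift(1)
        trials = trials.dropna(subset=["prev_congruent"])

        # Mean RT for each combination of previous-trial and current-trial congruence.
        means = trials.groupby(["prev_congruent", "congruent"])["rt"].mean().unstack()
        interference = means[False] - means[True]             # incongruent minus congruent
        print(interference)   # smaller interference after incongruent trials = conflict adaptation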

  12. Personal and Environmental Resources Mediate the Positivity-Emotional Dysfunction Relationship.

    Science.gov (United States)

    Lehrer, H Matthew; Janus, Katherine C; Gloria, Christian T; Steinhardt, Mary A

    2017-03-01

    We investigated the relationships among positivity, perceived personal and environmental resources, and emotional dysfunction in adolescent girls. We hypothesized that perceived resources would mediate the relationship between positivity and emotional dysfunction. Participants (N = 510) attending an all-girls public school completed a survey assessing emotional dysfunction (depressive symptoms and perceived stress), positivity (positive/negative emotions), and personal/environmental resources (resilience, hope, percent adaptive coping, community connectedness, social support, and school connectedness). Perceived resources were combined into one latent variable, and structural equation modeling tested the mediating effect of perceived resources on the relationship between positivity and emotional dysfunction. The model accounted for 63% of the variance in emotional dysfunction. Positivity exerted a significant direct effect on emotional dysfunction (β = -.14), indicating that the relationship between positivity and emotional dysfunction is primarily, but not entirely, mediated by perceived personal and environmental resources. Schools should consider strategies to enhance experiences of positive emotions and/or decrease experiences of negative emotions, in conjunction with encouraging student awareness and development of personal and environmental resources.

  13. Patterns of feelings in face to face negotiation: a Sino-Dutch pilot study

    NARCIS (Netherlands)

    Ulijn, J.M.; Rutkowski, A.F.; Kumar, Rajesh; Zhu, Y.

    2005-01-01

    We conducted a pilot study to compare the emotions experienced by Dutch and Chinese students during a face-to-face negotiation role play. Emotions play an important role in negotiations because they influence the behaviour and judgments of negotiators. The Data Printer case developed by Greenhalgh…

  14. The human body odor compound androstadienone leads to anger-dependent effects in an emotional Stroop but not dot-probe task using human faces.

    Science.gov (United States)

    Hornung, Jonas; Kogler, Lydia; Wolpert, Stephan; Freiherr, Jessica; Derntl, Birgit

    2017-01-01

    The androgen derivative androstadienone is a substance found in human sweat and thus is a putative human chemosignal. Androstadienone has been studied with respect to effects on mood states, attractiveness ratings, physiological and neural activation. With the current experiment, we aimed to explore in which way androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether effects depend on specific emotions, the participants' sex and individual sensitivity to smell androstadienone. To do so, we investigated 56 healthy individuals (of whom 29 were females taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone, once under placebo exposure in pseudorandomized order). With an emotional dot-probe task we measured visuo-spatial cueing while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex-, task- and emotion-specific manner, as a reduction in interference processes in the emotional Stroop task was only apparent for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times for congruent compared to incongruent trials. At the same time, women were also slightly affected by smelling androstadienone, as they classified angry faces more often correctly under androstadienone. For the emotional dot-probe task no modulation by androstadienone was observed. Furthermore, in both attention paradigms individual sensitivity to androstadienone was correlated with neither reaction times nor error rates in men and women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while processes of visuo-spatial cueing remain unaffected.

  15. Project PAVE (Personality And Vision Experimentation): Role of personal and interpersonal resilience in the perception of emotional facial expression.

    Directory of Open Access Journals (Sweden)

    Michal Tanzer

    2014-08-01

    Full Text Available The aim of the proposed theoretical model is to illuminate personal and interpersonal resilience by drawing from the field of emotional face perception. We suggest that perception/recognition of emotional facial expressions serves as a central link between subjective, self-related processes and the social context. Emotional face perception constitutes a salient social cue underlying interpersonal communication and behavior. Because problems in communication and interpersonal behavior underlie most, if not all, forms of psychopathology, it follows that perception/recognition of emotional facial expressions impacts psychopathology. The ability to accurately interpret another person's facial expression is crucial in subsequently deciding on an appropriate course of action. However, perception in general, and of emotional facial expressions in particular, is highly influenced by individuals' personality and self-concept. Herein we briefly outline well-established theories of personal and interpersonal resilience and link them to the neuro-cognitive basis of face perception. We then describe the findings of our ongoing program of research linking two well-established resilience factors, general self-efficacy (GSE) and perceived social support (PSS), with face perception. We conclude by pointing out avenues for future research focusing on possible genetic markers and patterns of brain connectivity associated with the proposed model. Implications of our integrative model for psychotherapy are discussed.

  16. Emotional Status, Perceived Control of Pain, and Pain Coping Strategies in Episodic and Chronic Cluster Headache

    Directory of Open Access Journals (Sweden)

    Dominique Valade

    2012-08-01

    Full Text Available Cluster headache (CH) is a chronic syndrome characterized by excruciatingly painful attacks occurring with circadian and circannual periodicity. The objectives of the present study were, in CH patients, to determine by principal component analysis the factor structure of two instruments commonly used in clinics to evaluate pain locus of control (Cancer Locus of Control Scale, CLCS) and coping strategies (Coping Strategies Questionnaire, CSQ), to examine the relationship between internal pain controllability and emotional distress, and to compare psychosocial distress and coping strategies between two subsets of patients with episodic or chronic CH. Results indicate, for the CLCS, a 3-factor structure (internal, medical, and religious controllability) noticeably different in CH patients from the structure reported in patients with other painful pathologies and, for the CSQ, a 5-factor structure that did not markedly diverge from the classical structure. Perceived internal controllability of pain was strongly correlated with study measures of depression (HAD depression/anhedonia subscale, Beck Depression Inventory). Comparison between the episodic and chronic CH subsets on emotional status, pain locus of control, perceived social support, and coping strategies did not reveal significant differences, apart from the 'Reinterpreting pain sensations' strategy, which was used more often by episodic CH patients. Observed tendencies for increased anxiety and perceived social support in patients with episodic CH, and for increased depression and more frequent use of the 'Ignoring pain sensations' strategy in patients with chronic CH, warrant confirmation in larger groups of patients.
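    The factor structures above were obtained by principal component analysis of questionnaire items. The sketch below shows the general shape of such an analysis with scikit-learn; the item data, the number of components, and the absence of a rotation step are all simplifying assumptions, not details from the study.

```python
# Illustrative PCA of questionnaire items, in the spirit of the factor-structure
# analyses described above. Item names and the 3-component choice are assumptions.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def pca_loadings(items: pd.DataFrame, n_components: int = 3) -> pd.DataFrame:
    """items: one row per patient, one column per questionnaire item (e.g. CLCS items)."""
    z = StandardScaler().fit_transform(items)           # standardize each item
    pca = PCA(n_components=n_components).fit(z)
    # Loadings = components scaled by the square root of their explained variance.
    loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
    return pd.DataFrame(loadings, index=items.columns,
                        columns=[f"factor_{i + 1}" for i in range(n_components)])
```

    Published questionnaire analyses typically add a rotation step (e.g. varimax) before interpreting loadings; that step is omitted here for brevity.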

  17. Facial expression recognition and emotional regulation in narcolepsy with cataplexy.

    Science.gov (United States)

    Bayard, Sophie; Croisier Langenier, Muriel; Dauvilliers, Yves

    2013-04-01

    Cataplexy is pathognomonic of narcolepsy with cataplexy, and is defined by a transient loss of muscle tone triggered by strong emotions. Recent research suggests abnormal amygdala function in narcolepsy with cataplexy. Emotion processing and emotional regulation strategies are complex functions involving cortical and limbic structures, such as the amygdala. As the amygdala has been shown to play a role in facial emotion recognition, we tested the hypothesis that patients with narcolepsy with cataplexy would have impaired recognition of facial emotional expressions compared with patients affected with central hypersomnia without cataplexy and healthy controls. We also aimed to determine whether cataplexy modulates emotional regulation strategies. Emotional intensity, arousal and valence ratings on Ekman faces displaying happiness, surprise, fear, anger, disgust, sadness and neutral expressions of 21 drug-free patients with narcolepsy with cataplexy were compared with those of 23 drug-free sex-, age- and intellectual level-matched adult patients with hypersomnia without cataplexy and 21 healthy controls. All participants underwent polysomnography recording and multiple sleep latency tests, and completed depression, anxiety and emotional regulation questionnaires. Performance of patients with narcolepsy with cataplexy did not differ from that of patients with hypersomnia without cataplexy or healthy controls on either intensity rating of each emotion on its prototypical label or mean ratings for valence and arousal. Moreover, patients with narcolepsy with cataplexy did not use different emotional regulation strategies. The level of depressive and anxious symptoms in narcolepsy with cataplexy did not differ from the other groups. Our results demonstrate that patients with narcolepsy with cataplexy accurately perceive and discriminate facial emotions, and regulate emotions normally. The absence of alteration of perceived affective valence remains of major clinical interest in narcolepsy with cataplexy.

  18. [Relationship between perceived emotional intelligence and professional quality of life with the achievement of occupational objectives in the Costa del Sol Primary Health Care District].

    Science.gov (United States)

    Macías Fernández, Antonio José; Gutiérrez-Castañeda, Carlos; Carmona González, Francisco Jesús; Crespillo Vílchez, Daniel

    2016-05-01

    To examine the relationship between "Quality of Professional Life" and "Perceived Emotional Intelligence", and the relationship of both with the level of achievement of occupational objectives in the Costa del Sol Primary Health Care District. Multicentre descriptive cross-sectional observational study. The Costa del Sol Primary Health Care District in the province of Málaga. Sample of employees of all categories in fixed and contracted employment in the management units of the Costa del Sol District (N=303); 247 (81.5%) responded. The data collected comprised the percentage of achievement of objectives in 2010 and the socio-demographic data of the participants, gathered using ad hoc self-report questionnaires. The TMMS-24 questionnaire was used to measure "Perceived Emotional Intelligence", with the dimensions of emotional perception, comprehension, and regulation; the CVP-35 measured quality of professional life (management support, work demands, and intrinsic motivation). Significant correlations were observed between Quality of Professional Life and Emotional Intelligence in the Regulation (p<.01) and Comprehension (p<.05) categories. There were also significant correlations between profession and type of contract in the achievement of objectives (p<.005), and between quality of professional life and type of contract (p<.05). The perceived quality of professional life is related to the perception and regulation dimensions of Emotional Intelligence. Knowledge of emotion management methods should be promoted by management organisations for all employees. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.

  19. A longitudinal examination of perceived discrimination and depressive symptoms in ethnic minority youth: The roles of attributional style, positive ethnic/racial affect, and emotional reactivity.

    Science.gov (United States)

    Stein, Gabriela L; Supple, Andrew J; Huq, Nadia; Dunbar, Angel S; Prinstein, Mitchell J

    2016-02-01

    Although perceived ethnic/racial discrimination is well established as a risk factor for depressive symptoms in ethnic minority youth, few studies have examined this relationship longitudinally. This study examined whether a negative attributional style, positive ethnic/racial affect, and emotional reactivity moderated the longitudinal relationship between perceived peer or adult discrimination and depressive symptoms in a sample of African American and Latino high school students (n = 155). African American and Latino youth who experienced increases in perceived peer discrimination also reported greater depressive symptoms over time, but positive ethnic/racial affect buffered the longitudinal association. Emotional reactivity also served as a significant moderator, but only of the baseline association between perceived peer discrimination and depressive symptoms. Thus, perceived ethnic/racial discrimination appears to play a significant role in the development of depressive symptoms for ethnic minority youth, especially those who start high school with lower levels of positive ethnic/racial affect. PsycINFO Database Record (c) 2016 APA, all rights reserved.
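    The moderation analyses described here amount to testing an interaction term in a regression of later depressive symptoms on discrimination and the candidate moderator. A minimal sketch of that logic follows; the variable names are hypothetical placeholders, and the study's actual models included additional terms and time points.

```python
# Sketch of a moderation test: depressive symptoms regressed on perceived peer
# discrimination, a candidate moderator, and their interaction, adjusting for
# baseline symptoms. Variable names are assumptions, not the study's measures.
import pandas as pd
import statsmodels.formula.api as smf

def test_moderation(df: pd.DataFrame):
    """df columns (assumed): depress_t2, depress_t1, discrimination, ethnic_affect."""
    model = smf.ols(
        "depress_t2 ~ discrimination * ethnic_affect + depress_t1",  # '*' adds the interaction
        data=df,
    ).fit()
    return model.summary()
```

    A significant, negative discrimination × ethnic_affect coefficient would correspond to the buffering pattern reported in the abstract.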

  20. Does a single session of electroconvulsive therapy alter the neural response to emotional faces in depression? A randomised sham-controlled functional magnetic resonance imaging study

    DEFF Research Database (Denmark)

    Miskowiak, Kamilla W; Kessing, Lars V; Ott, Caroline V

    2017-01-01

    …neurocognitive bias in major depressive disorder. Patients with major depressive disorder were randomised to one active (n=15) or sham (n=12) electroconvulsive therapy session. The following day they underwent whole-brain functional magnetic resonance imaging at 3T while viewing emotional faces and performed facial … expression recognition and dot-probe tasks. A single electroconvulsive therapy session had no effect on amygdala response to emotional faces. Whole-brain analysis revealed no effects of electroconvulsive therapy versus sham therapy after family-wise error correction at the cluster level, using a cluster … to faces after a single electroconvulsive therapy session, the observed trend changes after a single electroconvulsive therapy session point to an early shift in emotional processing that may contribute to antidepressant effects of electroconvulsive therapy….