WorldWideScience

Sample records for repeated emotional faces

  1. Steroids facing emotions

    NARCIS (Netherlands)

    Putman, P.L.J.

    2006-01-01

    The studies reported in this thesis have been performed to gain a better understanding of motivational mediators of selective attention and memory for emotionally relevant stimuli, and of the roles that some steroid hormones play in the regulation of human motivation and emotion. The stimuli used…

  2. Emotion-independent face recognition

    Science.gov (United States)

    De Silva, Liyanage C.; Esther, Kho G. P.

    2000-12-01

    Current face recognition techniques tend to work well when recognizing faces under small variations in lighting, facial expression and pose, but deteriorate under more extreme conditions. In this paper, a face recognition system to recognize faces of known individuals, despite variations in facial expression due to different emotions, is developed. The eigenface approach is used for feature extraction. Classification methods include Euclidean distance, back propagation neural network and generalized regression neural network. These methods yield 100% recognition accuracy when the training database is representative, containing one image representing the peak expression for each emotion of each person apart from the neutral expression. The feature vectors used for comparison in the Euclidean distance method and for training the neural network must be all the feature vectors of the training set. These results are obtained for a face database consisting of only four persons.
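The eigenface feature extraction and Euclidean-distance classification described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the array shapes, function names, and component count are assumptions.

```python
import numpy as np

def eigenface_features(train_imgs, n_components=4):
    """Compute an eigenface basis from a stack of grayscale training images."""
    X = train_imgs.reshape(len(train_imgs), -1).astype(float)
    mean = X.mean(axis=0)
    # SVD of the mean-centered data yields the principal components (eigenfaces)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def project(img, mean, basis):
    """Project one image into eigenface space."""
    return basis @ (img.ravel().astype(float) - mean)

def classify(img, mean, basis, gallery, labels):
    """Nearest neighbour in eigenface space by Euclidean distance."""
    f = project(img, mean, basis)
    d = np.linalg.norm(gallery - f, axis=1)
    return labels[int(np.argmin(d))]
```

In use, the gallery is the projection of every training image (as the abstract notes, all training feature vectors are retained for comparison), and a probe face is assigned the label of its nearest gallery vector.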

  3. Influence of Emotional Facial Expressions on 3-5-Year-Olds' Face Recognition

    Science.gov (United States)

    Freitag, Claudia; Schwarzer, Gudrun

    2011-01-01

    Three experiments examined 3- and 5-year-olds' recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression…

  5. Processing of emotional faces in social phobia

    Directory of Open Access Journals (Sweden)

    Nicole Kristjansen Rosenberg

    2011-02-01

    Previous research has found that individuals with social phobia differ from controls in their processing of emotional faces. For instance, people with social phobia show increased attention to briefly presented threatening faces. However, when exposure times are increased, the direction of this attentional bias is less clear. Studies investigating eye movements have found both increased and decreased attention to threatening faces in socially anxious participants. The current study investigated eye movements to emotional faces in eight patients with social phobia and 34 controls. Three tasks with different exposure durations were used, allowing an investigation of the time course of attention. At the early time interval, patients showed a complex pattern of both vigilance towards and avoidance of threatening faces. At the longest time interval, patients avoided the eyes of sad, disgusted, and neutral faces more than controls, whereas there were no group differences for angry faces.

  6. Aging and attentional biases for emotional faces.

    Science.gov (United States)

    Mather, Mara; Carstensen, Laura L

    2003-09-01

    We examined age differences in attention to and memory for faces expressing sadness, anger, and happiness. Participants saw a pair of faces, one emotional and one neutral, and then a dot probe that appeared in the location of one of the faces. In two experiments, older adults responded faster to the dot if it was presented on the same side as a neutral face than if it was presented on the same side as a negative face. Younger adults did not exhibit this attentional bias. Interactions of age and valence were also found for memory for the faces, with older adults remembering positive better than negative faces. These findings reveal that in their initial attention, older adults avoid negative information. This attentional bias is consistent with older adults' generally better emotional well-being and their tendency to remember negative less well than positive information.
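The dot-probe logic described above is conventionally scored as a bias index: faster responses to probes replacing the emotional face indicate vigilance, slower responses indicate avoidance. A minimal sketch (function and variable names are illustrative, not from the paper):

```python
from statistics import mean

def attentional_bias(rt_probe_at_emotional, rt_probe_at_neutral):
    """Dot-probe bias score: mean RT when the probe replaces the neutral
    face minus mean RT when it replaces the emotional face.
    Positive values indicate attention toward the emotional face;
    negative values indicate avoidance, the pattern reported here
    for older adults with negative faces."""
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_emotional)
```

For example, reaction times of 530 ms at the emotional location versus 505 ms at the neutral location give a bias of -25 ms, i.e. avoidance of the emotional face.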

  7. Morphed emotional faces: Emotion detection and misinterpretation in social anxiety

    NARCIS (Netherlands)

    Heuer, K.; Lange, W.G.; Isaac, L.; Rinck, M.; Becker, E.S.

    2010-01-01

    The current study investigated detection and interpretation of emotional facial expressions in high socially anxious (HSA) individuals compared to non-anxious controls (NAC). A version of the morphed faces task was implemented to assess emotion onset perception, decoding accuracy and interpretation,

  8. Metacognition of emotional face recognition.

    Science.gov (United States)

    Kelly, Karen J; Metcalfe, Janet

    2011-08-01

    While humans are adept at recognizing emotional states conveyed by facial expressions, the current literature suggests that they lack accurate metacognitions about their performance in this domain. This finding comes from global trait-based questionnaires that assess the extent to which an individual perceives him or herself as empathic, as compared to other people. Those who rate themselves as empathically accurate are no better than others at recognizing emotions. Metacognition of emotion recognition can also be assessed using relative measures that evaluate how well a person thinks s/he has understood the emotion in a particular facial display as compared to other displays. While this is the most common method of metacognitive assessment of people's judgments of learning or their feelings of knowing, this kind of metacognition--"relative meta-accuracy"--has not been studied within the domain of emotion. As well as asking for global metacognitive judgments, we asked people to provide relative, trial-by-trial prospective and retrospective judgments concerning whether they would be right or wrong in recognizing the expressions conveyed in particular facial displays. Our question was: Do people know when they will be correct in knowing what expression is conveyed, and do they know when they do not know? Although we, like others, found that global meta-accuracy was unpredictive of performance, relative meta-accuracy, given by the correlation between participants' trial-by-trial metacognitive judgments and performance on each item, was highly accurate both on the Mind in the Eyes task (Experiment 1) and on the Ekman Emotional Expression Multimorph task (Experiment 2). © 2011 APA, all rights reserved.

  9. Are emotions contagious? Evoked emotions while viewing emotionally expressive faces: quality, quantity, time course and gender differences.

    Science.gov (United States)

    Wild, B; Erb, M; Bartels, M

    2001-06-01

    In human interactions, frequently one individual becomes 'infected' with emotions displayed by his or her partner. We tested the predictions of Hatfield et al. (1992; Primitive emotional contagion, Review of Personality and Social Psychology 14, 151-177) that the automatic, mostly unconscious component of this process, called 'primitive emotional contagion', is repeatable and fast, that stronger facial expressions of the sender evoke stronger emotions in the viewer, and that women are more susceptible to emotional contagion than men. We presented photos from the Pictures of Facial Affect (Ekman and Friesen, 1976; Consulting Psychologists Press, Palo Alto) on a PC, varying the affective content (happy and sad), the expressive strength, and the duration of presentation. After each photo, subjects rated the strength of experienced happiness, sadness, anger, disgust, surprise, fear and pleasure. Feelings of happiness or sadness were significantly, specifically and repeatedly evoked in the viewer - even with presentations lasting only 500 ms. Stronger expressions evoked more emotion. The gender of the viewer had weak effects. We hypothesize that this fast and repeatable reaction is likely to have a 'prewired' neural basis. We propose that the induction of emotional processes within a subject by the perception of emotionally expressive faces is a powerful instrument in the detection of emotional states in others and as the basis for one's own reactions. Detailed knowledge of emotional reactions to faces is also valuable as a basis for psychiatric studies of disorders in affect and/or communication and in studies using functional imaging (fMRI or PET) where faces are increasingly used as stimuli.

  10. Serotonergic modulation of face-emotion recognition.

    Science.gov (United States)

    Del-Ben, C M; Ferreira, C A Q; Alves-Neto, W C; Graeff, F G

    2008-04-01

    Facial expressions of basic emotions have been widely used to investigate the neural substrates of emotion processing, but little is known about the exact meaning of subjective changes provoked by perceiving facial expressions. Our assumption was that fearful faces would be related to the processing of potential threats, whereas angry faces would be related to the processing of proximal threats. Experimental studies have suggested that serotonin modulates the brain processes underlying defensive responses to environmental threats, facilitating risk assessment behavior elicited by potential threats and inhibiting fight or flight responses to proximal threats. In order to test these predictions about the relationship between fearful and angry faces and defensive behaviors, we carried out a review of the literature about the effects of pharmacological probes that affect 5-HT-mediated neurotransmission on the perception of emotional faces. The hypothesis that angry faces would be processed as a proximal threat and that, as a consequence, their recognition would be impaired by an increase in 5-HT function was not supported by the results reviewed. In contrast, most of the studies that evaluated the behavioral effects of serotonin challenges showed that increased 5-HT neurotransmission facilitates the recognition of fearful faces, whereas its decrease impairs the same performance. These results agree with the hypothesis that fearful faces are processed as potential threats and that 5-HT enhances this brain processing.

  11. Serotonergic modulation of face-emotion recognition

    Directory of Open Access Journals (Sweden)

    C.M. Del-Ben

    2008-04-01

    Facial expressions of basic emotions have been widely used to investigate the neural substrates of emotion processing, but little is known about the exact meaning of subjective changes provoked by perceiving facial expressions. Our assumption was that fearful faces would be related to the processing of potential threats, whereas angry faces would be related to the processing of proximal threats. Experimental studies have suggested that serotonin modulates the brain processes underlying defensive responses to environmental threats, facilitating risk assessment behavior elicited by potential threats and inhibiting fight or flight responses to proximal threats. In order to test these predictions about the relationship between fearful and angry faces and defensive behaviors, we carried out a review of the literature about the effects of pharmacological probes that affect 5-HT-mediated neurotransmission on the perception of emotional faces. The hypothesis that angry faces would be processed as a proximal threat and that, as a consequence, their recognition would be impaired by an increase in 5-HT function was not supported by the results reviewed. In contrast, most of the studies that evaluated the behavioral effects of serotonin challenges showed that increased 5-HT neurotransmission facilitates the recognition of fearful faces, whereas its decrease impairs the same performance. These results agree with the hypothesis that fearful faces are processed as potential threats and that 5-HT enhances this brain processing.

  12. Digitizing the moving face: asymmetries of emotion and gender

    Directory of Open Access Journals (Sweden)

    Ashish Desai

    2009-04-01

    In a previous study with dextral males, Richardson and Bowers (1999) digitized real-time video signals and found movement asymmetries over the left lower face for emotional, but not non-emotional, expressions. These findings correspond to observations, based on subjective ratings of static pictures, that the left side of the face is more intensely expressive than the right (Sackeim, 1978). From a neuropsychological perspective, one possible interpretation of these findings is that emotional priming of the right hemisphere of the brain results in more muscular activity over the contralateral left than the ipsilateral right side of the lower face. The purpose of the present study was to use computer-imaging methodology to determine whether there were gender differences in movement asymmetries across the face. We hypothesized that females would show less evidence of facial movement asymmetries during the expression of emotion. This hypothesis was based on findings of gender differences in the degree to which specific cognitive functions may be lateralized in the brain (i.e., females less lateralized than males). Forty-eight normal dextral college students (25 females, 23 males) were videotaped while they displayed voluntary emotional expressions. A quantitative measure of movement change (called entropy) was computed by subtracting the values of corresponding pixel intensities between adjacent frames and summing their differences. The upper and lower hemiface regions were examined separately due to differences in the cortical innervation of facial muscles in the upper (bilateral) versus lower (contralateral) face. Repeated measures ANOVAs were used to analyze the amount of overall facial movement and facial asymmetries. Certain emotions were associated with significantly greater overall facial movement than others (fear > (angry = sad) > neutral). Both males and females showed this same pattern, with no gender differences in the total amount of facial movement…
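The entropy measure described in this abstract can be sketched directly. The use of absolute differences is an assumption on our part, since the record only says that corresponding pixel intensities of adjacent frames are subtracted and the differences summed:

```python
import numpy as np

def movement_entropy(frames):
    """Frame-to-frame movement measure: subtract corresponding pixel
    intensities between adjacent video frames and sum the (absolute)
    differences across the whole clip."""
    frames = np.asarray(frames, dtype=float)       # shape: (n_frames, h, w)
    diffs = np.abs(np.diff(frames, axis=0))        # adjacent-frame differences
    return float(diffs.sum())
```

To examine hemiface asymmetries as in the study, the same function would be applied separately to row/column slices of the frames (e.g. left vs. right lower face).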

  13. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    Science.gov (United States)

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

    The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter has been assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar.

  14. Autism and emotional face-viewing.

    Science.gov (United States)

    Åsberg Johnels, Jakob; Hovey, Daniel; Zürcher, Nicole; Hippolyte, Loyse; Lemonnier, Eric; Gillberg, Christopher; Hadjikhani, Nouchine

    2016-11-28

    Atypical patterns of face-scanning in individuals with autism spectrum disorder (ASD) may contribute to difficulties in social interactions, but there is little agreement regarding what exactly characterizes face-viewing in ASD. In addition, little research has examined how face-viewing is modulated by the emotional expression of the stimuli, in individuals with or without ASD. We used eye-tracking to explore viewing patterns during perception of dynamic emotional facial expressions in relatively large groups of individuals with (n = 57) and without ASD (n = 58) and examined diagnostic- and age-related effects, after subgrouping children and adolescents (≤18 years), on the one hand, and adults (>18 years), on the other. Results showed that children/adolescents with ASD fixated the mouth of happy and angry faces less than their typically developing (TD) peers, and conversely looked more to the eyes of happy faces. Moreover, while all groups fixated the mouth in happy faces more than in other expressions, children/adolescents with ASD did relatively less so. Correlation analysis showed a similar lack of relative orientation toward the mouth of smiling faces in TD children/adolescents with high autistic traits, as measured by the Autism-Spectrum Quotient (AQ). Among adults, participants with ASD attended less to the eyes only for neutral faces. Our study shows that the emotional content of a face influences gaze behavior, and that this effect is not fully developed in children/adolescents with ASD. Interestingly, this lack of differentiation observed in the younger ASD group was also seen in younger TD individuals with higher AQ scores. Autism Res 2016. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  15. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    Science.gov (United States)

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing us to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed us to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.
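The per-tile contribution analysis described above might be computed along these lines; this is a simplified sketch under the assumption that each trial records which of the 48 tiles were uncovered at the moment of the (correct) response:

```python
import numpy as np

def importance_map(visible_masks, correct):
    """Per-tile contribution to successful recognition: the proportion of
    correct trials on which each tile was uncovered when the participant
    stopped the sequence. Tiles uncovered on most correct trials carry
    high diagnostic value for the expression."""
    masks = np.asarray(visible_masks, dtype=float)   # trials x n_tiles (0/1)
    correct = np.asarray(correct, dtype=bool)
    return masks[correct].mean(axis=0)
```

The resulting vector can be reshaped to the tile grid and overlaid on the face image to visualize which regions (eyes, mouth, etc.) drove recognition of each emotion.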

  16. Modulation of the composite face effect by unintended emotion cues.

    Science.gov (United States)

    Gray, Katie L H; Murphy, Jennifer; Marsh, Jade E; Cook, Richard

    2017-04-01

    When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers 'fuse' the two halves together, perceptually. The illusory distortion induced by task-irrelevant ('distractor') halves hinders participants' judgements about task-relevant ('target') halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, observers frequently perceive emotion in ostensibly neutral faces, contrary to the intentions of experimenters. This study sought to determine whether this 'perceived emotion' influences the composite-face effect. In our first experiment, we confirmed that the composite effect grows stronger as the strength of distractor emotion increased. Critically, effects of distractor emotion were induced by weak emotion intensities, and were incidental insofar as emotion cues hindered image matching, not emotion labelling per se. In Experiment 2, we found a correlation between the presence of perceived emotion in a set of ostensibly neutral distractor regions sourced from commonly used face databases, and the strength of illusory distortion they induced. In Experiment 3, participants completed a sequential matching composite task in which half of the distractor regions were rated high and low for perceived emotion, respectively. Significantly stronger composite effects were induced by the high-emotion distractor halves. These convergent results suggest that perceived emotion increases the strength of the composite-face effect induced by supposedly emotionless faces. These findings have important implications for the study of holistic face processing in typical and atypical populations.

  17. Recognition of Face and Emotional Facial Expressions in Autism

    Directory of Open Access Journals (Sweden)

    Muhammed Tayyib Kadak

    2013-03-01

    Autism is a genetically transmitted neurodevelopmental disorder characterized by severe and persistent deficits in many areas of interpersonal relations, such as communication, social interaction and emotional responsiveness. Patients with autism show deficits in face recognition, eye contact and recognition of emotional expressions. Both face recognition and the recognition of emotional facial expressions depend on face processing. Structural and functional impairment of the fusiform gyrus, amygdala, superior temporal sulcus and other brain regions leads to deficits in the recognition of faces and facial emotion. Studies therefore suggest that face-processing deficits underlie the problems with social interaction and emotion seen in autism. Studies have revealed that children with autism have problems recognizing facial expressions and rely on the mouth region more than the eye region. It has also been shown that autistic patients interpret ambiguous expressions as negative emotions. Deficits at various stages of face processing, such as gaze detection, face identity processing and recognition of emotional expressions, have been identified in autism. Social interaction impairments in autism spectrum disorders originate from face-processing deficits during infancy, childhood and adolescence. Face recognition and the recognition of emotional facial expressions may be shaped both by automatic orienting towards faces after birth and by "learning" processes across development, such as identity and emotion processing. This article reviews the neurobiological basis of face processing and the recognition of emotional facial expressions during normal development and in autism.

  18. Emotional Recognition in Autism Spectrum Conditions from Voices and Faces

    Science.gov (United States)

    Stewart, Mary E.; McAdam, Clair; Ota, Mitsuhiko; Peppe, Sue; Cleland, Joanne

    2013-01-01

    The present study reports on a new vocal emotion recognition task and assesses whether people with autism spectrum conditions (ASC) perform differently from typically developed individuals on tests of emotional identification from both the face and the voice. The new test of vocal emotion contained trials in which the vocal emotion of the sentence…

  19. What’s in a Face? How Face Gender and Current Affect Influence Perceived Emotion

    Science.gov (United States)

    Harris, Daniel A.; Hayes-Skelton, Sarah A.; Ciaramitaro, Vivian M.

    2016-01-01

    Faces drive our social interactions. A vast literature suggests an interaction between gender and emotional face perception, with studies using different methodologies demonstrating that the gender of a face can affect how emotions are processed. However, how different is our perception of affective male and female faces? Furthermore, how does our current affective state when viewing faces influence our perceptual biases? We presented participants with a series of faces morphed along an emotional continuum from happy to angry. Participants judged each face morph as either happy or angry. We determined each participant’s unique emotional ‘neutral’ point, defined as the face morph judged to be perceived equally happy and angry, separately for male and female faces. We also assessed how current state affect influenced these perceptual neutral points. Our results indicate that, for both male and female participants, the emotional neutral point for male faces is perceptually biased to be happier than for female faces. This bias suggests that more happiness is required to perceive a male face as emotionally neutral, i.e., we are biased to perceive a male face as more negative. Interestingly, we also find that perceptual biases in perceiving female faces are correlated with current mood, such that positive state affect correlates with perceiving female faces as happier, while we find no significant correlation between negative state affect and the perception of facial emotion. Furthermore, we find reaction time biases, with slower responses for angry male faces compared to angry female faces. PMID:27733839
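The emotional 'neutral point' described above is the morph level at which 'happy' and 'angry' responses are equally likely. A minimal sketch using linear interpolation over observed response proportions; the study's actual fitting procedure may differ (e.g. a cumulative-Gaussian psychometric fit):

```python
import numpy as np

def neutral_point(morph_levels, p_angry):
    """Estimate the morph level judged 'angry' 50% of the time (the
    perceptual neutral point). `morph_levels` runs from fully happy to
    fully angry; `p_angry` gives the monotonically increasing proportion
    of 'angry' responses at each level."""
    return float(np.interp(0.5, p_angry, morph_levels))
```

Computing this point separately for male and female face morphs would reveal the bias reported above: a neutral point shifted toward the happy end for male faces means more happiness is needed before a male face looks neutral.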

  20. Interpersonal self-support and attentional disengagement from emotional faces.

    Science.gov (United States)

    Xia, Ling-Xiang; Shi, Xu-Liang; Zhang, Ran-Ran; Hollon, Steven D

    2015-01-08

    Prior studies have shown that interpersonal self-support is related to emotional symptoms. The present study explored the relationship between interpersonal self-support and attentional disengagement from emotional faces. A spatial cueing task was administered to 21 high and 24 low interpersonal self-support Chinese undergraduate students to assess difficulty in shifting attention away from emotional faces. Sidak-corrected multiple pairwise tests revealed that the low interpersonal self-support group had greater response latencies for negative faces than for neutral or positive faces in the invalid-cue condition, F(2, 41) = 5.68, and that the low interpersonal self-support group responded more slowly than the high interpersonal self-support group to negative faces, F(1, 42) = 7.63. These results indicate that low interpersonal self-support is related to difficulty disengaging from negative emotional information and suggest that interpersonal self-support may be related to emotional dispositions, especially negative emotional dispositions.

  1. Dogs can discriminate emotional expressions of human faces.

    Science.gov (United States)

    Müller, Corsin A; Schmitt, Kira; Barber, Anjuli L A; Huber, Ludwig

    2015-03-01

    The question of whether animals have emotions and respond to the emotional expressions of others has become a focus of research in the last decade [1-9]. However, to date, no study has convincingly shown that animals discriminate between emotional expressions of heterospecifics, excluding the possibility that they respond to simple cues. Here, we show that dogs use the emotion of a heterospecific as a discriminative cue. After learning to discriminate between happy and angry human faces in 15 picture pairs, whereby for one group only the upper halves of the faces were shown and for the other group only the lower halves of the faces were shown, dogs were tested with four types of probe trials: (1) the same half of the faces as in the training but of novel faces, (2) the other half of the faces used in training, (3) the other half of novel faces, and (4) the left half of the faces used in training. We found that dogs for which the happy faces were rewarded learned the discrimination more quickly than dogs for which the angry faces were rewarded. This would be predicted if the dogs recognized an angry face as an aversive stimulus. Furthermore, the dogs performed significantly above chance level in all four probe conditions and thus transferred the training contingency to novel stimuli that shared with the training set only the emotional expression as a distinguishing feature. We conclude that the dogs used their memories of real emotional human faces to accomplish the discrimination task.

  2. Time course of implicit processing and explicit processing of emotional faces and emotional words.

    Science.gov (United States)

    Frühholz, Sascha; Jellinghaus, Anne; Herrmann, Manfred

    2011-05-01

    Facial expressions are important emotional stimuli during social interactions. Symbolic emotional cues, such as affective words, also convey information regarding emotions that is relevant for social communication. Various studies have demonstrated fast decoding of emotions from words, as was shown for faces, whereas others report a rather delayed decoding of information about emotions from words. Here, we introduced an implicit (color naming) and explicit task (emotion judgment) with facial expressions and words, both containing information about emotions, to directly compare the time course of emotion processing using event-related potentials (ERP). The data show that only negative faces affected task performance, resulting in increased error rates compared to neutral faces. Presentation of emotional faces resulted in a modulation of the N170, the EPN and the LPP components and these modulations were found during both the explicit and implicit tasks. Emotional words only affected the EPN during the explicit task, but a task-independent effect on the LPP was revealed. Finally, emotional faces modulated source activity in the extrastriate cortex underlying the generation of the N170, EPN and LPP components. Emotional words led to a modulation of source activity corresponding to the EPN and LPP, but they also affected the N170 source on the right hemisphere. These data show that facial expressions affect earlier stages of emotion processing compared to emotional words, but the emotional value of words may have been detected at early stages of emotional processing in the visual cortex, as was indicated by the extrastriate source activity.

  3. 5-HTTLPR differentially predicts brain network responses to emotional faces

    DEFF Research Database (Denmark)

    Fisher, Patrick M; Grady, Cheryl L; Madsen, Martin K

    2015-01-01

    The effects of the 5-HTTLPR polymorphism on neural responses to emotionally salient faces have been studied extensively, focusing on amygdala reactivity and amygdala-prefrontal interactions. Despite compelling evidence that emotional face paradigms engage a distributed network of brain regions in...

  4. Interpretation of emotionally ambiguous faces in older adults.

    Science.gov (United States)

    Bucks, Romola S; Garner, Matthew; Tarrant, Louise; Bradley, Brendan P; Mogg, Karin

    2008-11-01

    Research suggests that there is an age-related decline in the processing of negative emotional information, which may contribute to the reported decline in emotional problems in older people. We used a signal detection approach to investigate the effect of normal aging on the interpretation of ambiguous emotional facial expressions. High-functioning older and younger adults indicated which emotion they perceived when presented with morphed faces containing a 60% to 40% blend of two emotions (mixtures of happy, sad, or angry faces). They also completed measures of mood, perceptual ability, and cognitive functioning. Older and younger adults did not differ significantly in their ability to discriminate between positive and negative emotions. Response-bias measures indicated that older adults were significantly less likely than younger adults to report the presence of anger in angry-happy face blends. Results are discussed in relation to other research into age-related effects on emotion processing.
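
The signal-detection approach mentioned above separates discrimination ability (d′) from response bias (criterion c). A minimal sketch of those two standard measures, using hypothetical hit and false-alarm rates rather than the study's actual data:

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Compute sensitivity d' and response bias c from hit / false-alarm rates."""
    z = NormalDist().inv_cdf                       # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)             # discrimination ability
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # positive c = conservative bias
    return d_prime, criterion

# Hypothetical example: 80% hits, 20% false alarms
d, c = dprime_and_criterion(0.80, 0.20)
print(round(d, 3), round(c, 3))  # 1.683 0.0
```

A lower likelihood of reporting anger, as found for the older group, would surface as a more positive (conservative) criterion c rather than a lower d′.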

  5. Method for Face-Emotion Retrieval Using A Cartoon Emotional Expression Approach

    Science.gov (United States)

    Kostov, Vlaho; Yanagisawa, Hideyoshi; Johansson, Martin; Fukuda, Shuichi

    A simple method for extracting emotion from a human face, as a form of non-verbal communication, was developed to cope with and optimize mobile communication in a globalized and diversified society. A cartoon-face-based model was developed and used to evaluate the emotional content of real faces. After a pilot survey, basic rules were defined and student subjects were asked to express emotion using the cartoon face. Their face samples were then analyzed using principal component analysis and the Mahalanobis distance method. Feature parameters considered to be related to emotions were extracted, and new cartoon faces based on these parameters were generated. The subjects evaluated the emotion of these cartoon faces again, and we confirmed that these parameters were suitable. To confirm how these parameters could be applied to real faces, we asked subjects to express the same emotions, which were then captured electronically. Simple image processing techniques were also developed to extract these features from real faces, and we then compared them with the cartoon face parameters. The cartoon face demonstrates that emotions can be expressed with very small amounts of information, and that real and cartoon faces correspond to each other. It is also shown that emotion could be extracted from still and dynamic real face images using these cartoon-based features.
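
The analysis pipeline named in this record, principal component analysis of facial feature vectors followed by Mahalanobis-distance comparison, can be sketched roughly as follows. The data, dimensionality, and function names are invented for illustration; the paper's actual features are not reproduced here:

```python
import numpy as np

def pca_project(X: np.ndarray, k: int) -> np.ndarray:
    """Project rows of X (samples x features) onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                        # center the data
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
    top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]
    return Xc @ top

def mahalanobis(x: np.ndarray, mean: np.ndarray, cov_inv: np.ndarray) -> float:
    """Mahalanobis distance of a sample from a class with given mean / inverse covariance."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

# Toy illustration: ten 4-D 'feature vectors'
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))
scores = pca_project(X, 2)                         # 10 samples x 2 components
print(scores.shape)                                # (10, 2)
print(mahalanobis(np.array([1.0, 2.0]), np.array([1.0, 2.0]), np.eye(2)))  # 0.0
```

A new face sample would then be assigned to the emotion class whose mean (in the reduced PCA space) lies at the smallest Mahalanobis distance.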

  6. Neural correlates of recognition memory for emotional faces and scenes.

    Science.gov (United States)

    Keightley, Michelle L; Chiew, Kimberly S; Anderson, John A E; Grady, Cheryl L

    2011-01-01

    We examined the influence of emotional valence and type of item to be remembered on brain activity during recognition, using faces and scenes. We used multivariate analyses of event-related fMRI data to identify whole-brain patterns, or networks of activity. Participants demonstrated better recognition for scenes vs faces and for negative vs neutral and positive items. Activity was increased in extrastriate cortex and inferior frontal gyri for emotional scenes, relative to neutral scenes and all face types. Increased activity in these regions also was seen for negative faces relative to positive faces. Correct recognition of negative faces and scenes (hits vs correct rejections) was associated with increased activity in amygdala, hippocampus, extrastriate, frontal and parietal cortices. Activity specific to correctly recognized emotional faces, but not scenes, was found in sensorimotor areas and rostral prefrontal cortex. These results suggest that emotional valence and type of visual stimulus both modulate brain activity at recognition, and influence multiple networks mediating visual, memory and emotion processing. The contextual information in emotional scenes may facilitate memory via additional visual processing, whereas memory for emotional faces may rely more on cognitive control mediated by rostrolateral prefrontal regions.

  7. Emotion-attention interactions in recognition memory for distractor faces.

    Science.gov (United States)

    Srinivasan, Narayanan; Gupta, Rashmi

    2010-04-01

    Effective filtering of distractor information has been shown to depend on perceptual load. Given the salience of emotional information and the presence of emotion-attention interactions, we wanted to explore recognition memory for emotional distractors, especially as a function of focused versus distributed attention, by manipulating load and the spatial spread of attention. We performed two experiments to study emotion-attention interactions by measuring recognition memory performance for neutral and emotional distractor faces. Participants performed a color discrimination task (low load) or a letter identification task (high load) with a letter-string display in Experiment 1, and a high-load letter identification task with letters presented in a circular array in Experiment 2. The stimuli were presented against a distractor face background. The recognition memory results show that happy faces were recognized better than sad faces under distributed (less spatially focused) attention. When attention was more spatially focused, sad faces were recognized better than happy faces. The study provides evidence for emotion-attention interactions in which specific emotional information (sad or happy) is associated with focused or distributed attention, respectively. Distractor processing of emotional information also has implications for theories of attention.

  8. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.

    Science.gov (United States)

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 event-related potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 were more affected by the emotional value of the music presented during the emotional go/no-go task, and that this bias was also apparent in responses to the non-target emotional face. This suggests that emotional information coming from multiple sensory channels activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.

  11. Association with emotional information alters subsequent processing of neutral faces.

    Science.gov (United States)

    Riggs, Lily; Fujioka, Takako; Chan, Jessica; McQuiggan, Douglas A; Anderson, Adam K; Ryan, Jennifer D

    2014-01-01

    The processing of emotional as compared to neutral information is associated with different patterns in eye movement and neural activity. However, the 'emotionality' of a stimulus can be conveyed not only by its physical properties, but also by the information that is presented with it. There is very limited work examining how emotional information may influence the immediate perceptual processing of otherwise neutral information. We examined how presenting an emotion label for a neutral face may influence subsequent processing by using eye movement monitoring (EMM) and magnetoencephalography (MEG) simultaneously. Participants viewed a series of faces with neutral expressions. Each face was followed by a unique negative or neutral sentence to describe that person, and then the same face was presented in isolation again. Viewing of faces paired with a negative sentence was associated with increased early viewing of the eye region and increased neural activity between 600 and 1200 ms in emotion processing regions such as the cingulate, medial prefrontal cortex, and amygdala, as well as posterior regions such as the precuneus and occipital cortex. Viewing of faces paired with a neutral sentence was associated with increased activity in the parahippocampal gyrus during the same time window. By monitoring behavior and neural activity within the same paradigm, these findings demonstrate that emotional information alters subsequent visual scanning and the neural systems that are presumably invoked to maintain a representation of the neutral information along with its emotional details.

  13. Visual Afterimages of Emotional Faces in High Functioning Autism

    Science.gov (United States)

    Rutherford, M. D.; Troubridge, Erin K.; Walsh, Jennifer

    2012-01-01

    Fixating an emotional facial expression can create afterimages, such that subsequent faces are seen as having the opposite expression of that fixated. Visual afterimages have been used to map the relationships among emotion categories, and this method was used here to compare ASD and matched control participants. Participants adapted to a facial…

  14. Age-related differences in attentional bias for emotional faces.

    Science.gov (United States)

    Tomaszczyk, Jennifer C; Fernandes, Myra A

    2014-01-01

    Past research suggests an aging-related positivity effect in orienting to faces. However, these studies have eschewed direct comparison of orienting when positive and negative faces are presented simultaneously, thereby potentially underestimating the degree to which emotional valence influences such effects. In the current study younger and older adults viewed face pairs for 1000 ms, and upon face-pair offset indicated the location of a dot that appeared in the former location of one of the faces, to assess attentional orienting. When shown negative-neutral pairs, both age groups were biased to attend to negative faces, but when shown positive-negative pairs only younger adults showed a bias toward negative; older adults showed a lack of orienting toward either emotional face. Results suggest younger adults have a negativity bias in attention orienting regardless of the valence of nearby stimuli, whereas older adults show an absence of this bias when positive information is present.

  15. Multimodal processing of emotional information in 9-month-old infants I: emotional faces and voices.

    Science.gov (United States)

    Otte, R A; Donkers, F C L; Braeken, M A K A; Van den Bergh, B R H

    2015-04-01

    Making sense of emotions manifesting in human voice is an important social skill which is influenced by emotions in other modalities, such as that of the corresponding face. Although processing emotional information from voices and faces simultaneously has been studied in adults, little is known about the neural mechanisms underlying the development of this ability in infancy. Here we investigated multimodal processing of fearful and happy face/voice pairs using event-related potential (ERP) measures in a group of 84 9-month-olds. Infants were presented with emotional vocalisations (fearful/happy) preceded by the same or a different facial expression (fearful/happy). The ERP data revealed that the processing of emotional information appearing in human voice was modulated by the emotional expression appearing on the corresponding face: Infants responded with larger auditory ERPs after fearful compared to happy facial primes. This finding suggests that infants dedicate more processing capacities to potentially threatening than to non-threatening stimuli.

  16. Facial emotion recognition, face scan paths, and face perception in children with neurofibromatosis type 1.

    Science.gov (United States)

    Lewis, Amelia K; Porter, Melanie A; Williams, Tracey A; Bzishvili, Samantha; North, Kathryn N; Payne, Jonathan M

    2017-05-01

    This study aimed to investigate face scan paths and face perception abilities in children with Neurofibromatosis Type 1 (NF1) and how these might relate to emotion recognition abilities in this population. The authors investigated facial emotion recognition, face scan paths, and face perception in 29 children with NF1 compared to 29 chronological age-matched typically developing controls. Correlations between facial emotion recognition, face scan paths, and face perception in children with NF1 were examined. Children with NF1 displayed significantly poorer recognition of fearful expressions compared to controls, as well as a nonsignificant trend toward poorer recognition of anger. Although there was no significant difference between groups in time spent viewing individual core facial features (eyes, nose, mouth, and nonfeature regions), children with NF1 spent significantly less time than controls viewing the face as a whole. Children with NF1 also displayed significantly poorer face perception abilities than typically developing controls. Facial emotion recognition deficits were not significantly associated with aberrant face scan paths or face perception abilities in the NF1 group. These results suggest that impairments in the perception, identification, and interpretation of information from faces are important aspects of the social-cognitive phenotype of NF1.

  17. Visual attention to emotional face in schizophrenia: an eye tracking study.

    Directory of Open Access Journals (Sweden)

    Mania Asgharpour

    2015-03-01

    Deficits in the processing of facial emotions have been reported extensively in patients with schizophrenia. To explore whether restricted attention is the cause of impaired emotion processing in these patients, we examined visual attention by tracking eye movements in response to emotional and neutral face stimuli in a group of patients with schizophrenia and healthy individuals. We also examined the correlation between visual attention allocation and symptom severity in our patient group. Thirty adult patients with schizophrenia and 30 matched healthy controls participated in this study. Visual attention data were recorded while participants passively viewed emotional-neutral face pairs for 500 ms. The relationship between visual attention and symptom severity was assessed with the Positive and Negative Syndrome Scale (PANSS) in the schizophrenia group. Repeated-measures ANOVAs were used to compare the groups. Comparing the number of fixations made during face-pair presentation, we found that patients with schizophrenia made fewer fixations on faces, regardless of the expression of the face. Analysis of the number of fixations on negative-neutral pairs also revealed that the patients made fewer fixations on both neutral and negative faces. Analysis of the number of fixations on positive-neutral pairs only showed more fixations on positive relative to neutral expressions in both groups. We found no correlations between visual attention patterns to faces and symptom severity in patients with schizophrenia. The results of this study suggest that the facial recognition deficit in schizophrenia is related to decreased attention to face stimuli. The finding of no difference in visual attention for positive-neutral face pairs between the groups is in line with studies that have shown an increased ability to perceive positive emotion in these patients.

  18. Emotion recognition: the role of featural and configural face information.

    Science.gov (United States)

    Bombari, Dario; Schmid, Petra C; Schmid Mast, Marianne; Birri, Sandra; Mast, Fred W; Lobmaier, Janek S

    2013-01-01

    Several studies have investigated the role of featural and configural information when processing facial identity. Much less is known about their contribution to emotion recognition. In this study, we addressed this issue by inducing either a featural or a configural processing strategy (Experiment 1) and by investigating the attentional strategies in response to emotional expressions (Experiment 2). In Experiment 1, participants identified emotional expressions in faces that were presented in three different versions (intact, blurred, and scrambled) and in two orientations (upright and inverted). Blurred faces contain mainly configural information, and scrambled faces contain mainly featural information. Inversion is known to selectively hinder configural processing. Analyses of the discriminability measure (A') and response times (RTs) revealed that configural processing plays a more prominent role in expression recognition than featural processing, but their relative contribution varies depending on the emotion. In Experiment 2, we qualified these differences between emotions by investigating the relative importance of specific features by means of eye movements. Participants had to match intact expressions with the emotional cues that preceded the stimulus. The analysis of eye movements confirmed that the recognition of different emotions relies on different types of information. While the mouth is important for the detection of happiness and fear, the eyes are more relevant for anger, fear, and sadness.

  19. In Your Face: Startle to Emotional Facial Expressions Depends on Face Direction

    Science.gov (United States)

    Michalsen, Henriette; Øvervoll, Morten

    2017-01-01

    Although faces are often included in the broad category of emotional visual stimuli, the affective impact of different facial expressions is not well documented. The present experiment investigated startle electromyographic responses to pictures of neutral, happy, angry, and fearful facial expressions, with a frontal face direction (directed) and at a 45° angle to the left (averted). Results showed that emotional facial expressions interact with face direction to produce startle potentiation: Greater responses were found for angry expressions, compared with fear and neutrality, with directed faces. When faces were averted, fear and neutrality produced larger responses compared with anger and happiness. These results are in line with the notion that startle is potentiated to stimuli signaling threat. That is, a forward directed angry face may signal a threat toward the observer, and a fearful face directed to the side may signal a possible threat in the environment.

  20. A negative compatibility effect in priming of emotional faces.

    Science.gov (United States)

    Bennett, Jennifer D; Lleras, Alejandro; Oriet, Chris; Enns, James T

    2007-10-01

    The negative compatibility effect (NCE) is the surprising result that low-visibility prime arrows facilitate responses to opposite-direction target arrows. Here we compare the priming obtained with simple arrows to the priming of emotions when categorizing human faces, which represents a more naturalistic set of stimuli and for which there are no preexisting response biases. When inverted faces with neutral expressions were presented alongside emotional prime and target faces, only strong positive priming occurred. However, when the neutral faces were made to resemble the target faces in geometry (upright orientation), time (flashing briefly), and space (appearing in the same location), positive priming gradually weakened and became negative priming. Implications for theories of the NCE are discussed.

  1. Passing faces: sequence-dependent variations in the perceptual processing of emotional faces.

    Science.gov (United States)

    Karl, Christian; Hewig, Johannes; Osinsky, Roman

    2016-10-01

    There is broad evidence that contextual factors influence the processing of emotional facial expressions. Yet temporal-dynamic aspects, inter alia how face processing is influenced by the specific order of neutral and emotional facial expressions, have been largely neglected. To shed light on this topic, we recorded electroencephalogram from 168 healthy participants while they performed a gender-discrimination task with angry and neutral faces. Our event-related potential (ERP) analyses revealed a strong emotional modulation of the N170 component, indicating that the basic visual encoding and emotional analysis of a facial stimulus happen, at least partially, in parallel. While the N170 and the late positive potential (LPP; 400-600 ms) were only modestly affected by the sequence of preceding faces, we observed a strong influence of face sequences on the early posterior negativity (EPN; 200-300 ms). Finally, the differing response patterns of the EPN and LPP indicate that these two ERPs represent distinct processes during face analysis: while the former seems to represent the integration of contextual information in the perception of a current face, the latter appears to represent the net emotional interpretation of a current face.
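
The ERP components in this record (N170, EPN at 200-300 ms, LPP at 400-600 ms) are conventionally quantified as mean amplitudes over a fixed post-stimulus window. A generic sketch of that step; the window bounds come from the abstract, while the sampling rate and waveform are hypothetical:

```python
def mean_amplitude(erp, srate_hz, t0_ms, t1_ms):
    """Mean amplitude of a single-channel ERP; samples start at stimulus onset."""
    i0 = int(round(t0_ms / 1000 * srate_hz))   # first sample in the window
    i1 = int(round(t1_ms / 1000 * srate_hz))   # one past the last sample
    window = erp[i0:i1]
    return sum(window) / len(window)

# Hypothetical 250 Hz recording: the EPN window 200-300 ms maps to samples 50..74
erp = [0.0] * 50 + [-2.0] * 25 + [0.0] * 75    # flat trace with a -2 uV deflection
print(mean_amplitude(erp, 250, 200, 300))      # -2.0
```

Condition effects such as the sequence-dependent EPN modulation described above would then be tested on these per-trial (or per-condition) mean amplitudes.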

  2. Social and emotional attachment in the neural representation of faces.

    Science.gov (United States)

    Gobbini, M Ida; Leibenluft, Ellen; Santiago, Neil; Haxby, James V

    2004-08-01

    To dissociate the role of visual familiarity from the role of social and emotional factors in recognizing familiar individuals, we measured neural activity using functional magnetic resonance imaging (fMRI) while subjects viewed (1) faces of personally familiar individuals (i.e. friends and family), (2) faces of famous individuals, and (3) faces of strangers. Personally familiar faces evoked a stronger response than did famous familiar faces and unfamiliar faces in areas that have been associated with 'theory of mind', and a weaker response in the amygdala. These response modulations may reflect the spontaneous activation of social knowledge about the personality and attitudes of close friends and relatives and the less guarded attitude one has around these people. These results suggest that familiarity causes changes in neural response that extend beyond a visual memory for a face.

  4. Emotional signals from faces, bodies and scenes influence observers' face expressions, fixations and pupil-size.

    Science.gov (United States)

    Kret, Mariska E; Roelofs, Karin; Stekelenburg, Jeroen J; de Gelder, Beatrice

    2013-01-01

    We receive emotional signals from different sources, including the face, the whole body, and the natural scene. Previous research has shown the importance of context provided by the whole body and the scene on the recognition of facial expressions. This study measured physiological responses to face-body-scene combinations. Participants freely viewed emotionally congruent and incongruent face-body and body-scene pairs whilst eye fixations, pupil size, and electromyography (EMG) responses were recorded. Participants attended more to angry and fearful vs. happy or neutral cues, independent of the source and relatively independent of whether the face-body and body-scene combinations were emotionally congruent or not. Moreover, angry faces combined with angry bodies, and angry bodies viewed in aggressive social scenes, elicited the greatest pupil dilation. Participants' facial expressions matched the valence of the stimuli, but when face-body compounds were shown, the observed facial expression influenced EMG responses more than the posture did. Together, our results show that the perception of emotional signals from faces, bodies and scenes depends on the natural context, but when threatening cues are presented, these threats attract attention, induce arousal, and evoke congruent facial reactions.

  5. Categorical Perception of emotional faces is not affected by aging

    Directory of Open Access Journals (Sweden)

    Mandy Rossignol

    2009-11-01

    Effects of normal aging on the categorical perception (CP) of facial emotional expressions were investigated. One hundred healthy participants (20 to 70 years old; five age groups) had to identify morphed expressions ranging from neutrality to happiness, sadness, and fear. We analysed percentages and latencies of correct recognition for non-morphed emotional expressions, percentages and latencies of emotion recognition for morphed faces, the locus of the boundaries along the different continua, and the number of intrusions. The results showed that unmorphed happy and fearful faces were processed better than unmorphed sad and neutral faces. For morphed faces, CP was confirmed, as latencies increased as a function of the distance between the displayed morph and the original unmorphed photograph. The locus of the categorical boundaries was not affected by age. Aging did not alter the accuracy of recognition for original pictures, nor the emotion recognition of morphed faces or the rate of intrusions. However, response latencies increased with age, for both unmorphed and morphed pictures. In conclusion, CP of facial expressions appears to be spared in aging.
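
The "locus of the boundary" along a morph continuum is commonly estimated as the morph level at which identification of the target emotion crosses 50%. A minimal linear-interpolation sketch; the identification proportions below are invented, not taken from the study:

```python
def category_boundary(levels, p_target):
    """Morph level where the proportion of 'target emotion' responses crosses 0.5,
    estimated by linear interpolation between adjacent morph steps."""
    points = list(zip(levels, p_target))
    for (x0, p0), (x1, p1) in zip(points, points[1:]):
        if p0 < 0.5 <= p1:
            return x0 + (0.5 - p0) / (p1 - p0) * (x1 - x0)
    raise ValueError("identification curve never crosses 50%")

# Invented data: proportion of 'happy' responses along a neutral-to-happy continuum
levels = [0, 25, 50, 75, 100]              # % morph toward happy
p_happy = [0.05, 0.20, 0.50, 0.80, 0.95]
print(category_boundary(levels, p_happy))  # 50.0
```

An age-invariant boundary, as reported above, would mean this crossover point does not shift systematically between the five age groups.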

  6. Cross-modal perception (face and voice) in emotions: ERPs and behavioural measures

    Directory of Open Access Journals (Sweden)

    Michela Balconi

    2007-04-01

    Emotion decoding constitutes a case of multimodal processing of cues from multiple channels. Previous behavioural and neuropsychological studies indicated that, when we have to decode emotions on the basis of multiple perceptual cues, cross-modal integration takes place. The present study investigates the simultaneous processing of emotional tone of voice and emotional facial expression by event-related potentials (ERPs), across a wide range of emotions (happiness, sadness, fear, anger, surprise, and disgust). Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual stimuli (emotional facial expressions) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Subjects (N=30) were required to process the stimuli and to indicate their comprehension (via a stim pad). ERP variations and behavioural data (response times, RTs) were submitted to repeated-measures analysis of variance (ANOVA). We considered two time intervals (150-250 and 250-350 ms post-stimulus) in order to explore the ERP variations. The ANOVA showed two different ERP effects with different cognitive functions: a negative deflection (N2), more anteriorly distributed (Fz), and a positive deflection (P2), more posteriorly distributed. N2 may be considered a marker of emotional content (sensitive to the type of emotion), whereas P2 may represent a marker of cross-modal integration, as it varied as a function of the congruous/incongruous condition, showing a higher peak for congruous than for incongruous stimuli. Finally, an RT reduction was found in the congruous condition for some emotions (i.e., sadness), with an inverted effect for other emotions (i.e., fear, anger, and surprise).

  7. Risk for Bipolar Disorder is Associated with Face-Processing Deficits across Emotions

    Science.gov (United States)

    Brotman, Melissa A.; Skup, Martha; Rich, Brendan A.; Blair, Karina S.; Pine, Daniel S.; Blair, James R.; Leibenluft, Ellen

    2008-01-01

    The relationship between risk for bipolar disorder (BD) and face-emotion labeling deficits among youths is examined. Findings show that youths at risk for BD did not show specific face-emotion recognition deficits. The need to provide more intense emotional information for face-emotion labeling of patients and at-risk youths is also discussed.

  9. Emotional cues during simultaneous face and voice processing: electrophysiological insights.

    Directory of Open Access Journals (Sweden)

    Taosheng Liu

    Full Text Available Both facial expression and tone of voice represent key signals of emotional communication but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, within the context of a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 normal healthy subjects; N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170, and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 msec (P200 peak latency) post stimulus onset, in spite of implicit affective processing task demands, and that such effect is mainly distributed in the frontal-central region.

  10. Emotional cues during simultaneous face and voice processing: electrophysiological insights.

    Science.gov (United States)

    Liu, Taosheng; Pinheiro, Ana; Zhao, Zhongxin; Nestor, Paul G; McCarley, Robert W; Niznikiewicz, Margaret A

    2012-01-01

    Both facial expression and tone of voice represent key signals of emotional communication but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, within the context of a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 normal healthy subjects; N100, P200, N250, P300 components were observed at electrodes in the frontal-central region, while P100, N170, P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in frontal-central (P200, P300, and N250) but not the parietal occipital region (P100, N170 and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 msec (P200 peak latency) post stimulus onset, in spite of implicit affective processing task demands, and that such effect is mainly distributed in the frontal-central region.

  11. Child's recognition of emotions in robot's face and body

    NARCIS (Netherlands)

    Cohen, I.; Looije, R.; Neerincx, M.A.

    2011-01-01

    Social robots can comfort and support children who have to cope with chronic diseases. In previous studies, a "facial robot", the iCat, proved to show well-recognized emotional expressions that are important in social interactions. The question is whether a mobile robot without a face, the Nao, can express…

  12. Measuring and testing awareness of emotional face expressions

    DEFF Research Database (Denmark)

    Sandberg, Kristian; Bibby, Bo Martin; Overgaard, Morten

    2013-01-01

    …with emotional content (fearful vs. neutral faces). Although we find the study interesting, we disagree with the conclusion that CR is superior to PAS because of two methodological issues. First, the conclusion is not based on a formal test. We performed this test and found no evidence that CR predicted accuracy…

  13. The Development of Emotional Face and Eye Gaze Processing

    Science.gov (United States)

    Hoehl, Stefanie; Striano, Tricia

    2010-01-01

    Recent research has demonstrated that infants' attention towards novel objects is affected by an adult's emotional expression and eye gaze toward the object. The current event-related potential (ERP) study investigated how infants at 3, 6, and 9 months of age process fearful compared to neutral faces looking toward objects or averting gaze away…

  14. Emotionally anesthetized: media violence induces neural changes during emotional face processing.

    Science.gov (United States)

    Stockdale, Laura A; Morrison, Robert G; Kmiecik, Matthew J; Garbarino, James; Silton, Rebecca L

    2015-10-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others' emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby require fewer cognitive resources to inhibit behavior.

  15. Processing of emotional faces in congenital amusia: An emotional music priming event-related potential study.

    Science.gov (United States)

    Zhishuai, Jin; Hong, Liu; Daxing, Wu; Pin, Zhang; Xuejing, Lu

    2017-01-01

    Congenital amusia is characterized by lifelong impairments in music perception and processing. It is unclear whether pitch detection deficits impact amusic individuals' perception of musical emotion. In the current work, 19 amusics and 21 healthy controls were subjected to electroencephalography (EEG) while being exposed to music excerpts and emotional faces. We assessed each individual's ability to discriminate positive- and negative-valenced emotional faces and analyzed electrophysiological indices, in the form of event-related potentials (ERPs) recorded at 32 sites, following exposure to emotionally positive or negative music excerpts. We observed smaller N2 amplitudes in response to facial expressions in the amusia group than in the control group, suggesting that amusics were less affected by the musical stimuli. The late-positive component (LPC) in amusics was similar to that in controls. Our results suggest that the neurocognitive deficit characteristic of congenital amusia is fundamentally an impairment in musical information processing rather than an impairment in emotional processing.

  16. Emotion in the neutral face: a mechanism for impression formation?

    Science.gov (United States)

    Adams, Reginald B; Nelson, Anthony J; Soto, José A; Hess, Ursula; Kleck, Robert E

    2012-01-01

    The current work examined contributions of emotion-resembling facial cues to impression formation. There exist common facial cues that make people look emotional, male or female, and from which we derive personality inferences. We first conducted a Pilot Study to assess these effects. We found that neutral female faces were rated as more submissive, affiliative, naïve, honest, cooperative, babyish, fearful, and happy, and as less angry, than neutral male faces. In our Primary Study, we then "warped" these same neutral faces over their corresponding anger and fear displays so the resultant facial appearance cues now structurally resembled emotion while retaining a neutral visage (e.g., no wrinkles, furrows, creases, etc.). The gender effects found in the Pilot Study were replicated in the Primary Study, suggesting clear stereotype-driven impressions. Critically, ratings of the neutral-over-fear warps versus neutral-over-anger warps also revealed a profile similar to the gender-based ratings, revealing perceptually driven impressions directly attributable to emotion overgeneralisation.

  17. Emotion processing in chimeric faces: hemispheric asymmetries in expression and recognition of emotions.

    Science.gov (United States)

    Indersmitten, Tim; Gur, Ruben C

    2003-05-01

    Since the discovery of facial asymmetries in emotional expressions of humans and other primates, hypotheses have related the greater left-hemiface intensity to right-hemispheric dominance in emotion processing. However, the difficulty of creating true frontal views of facial expressions in two-dimensional photographs has confounded efforts to better understand the phenomenon. We have recently described a method for obtaining three-dimensional photographs of posed and evoked emotional expressions and used these stimuli to investigate both intensity of expression and accuracy of recognizing emotion in chimeric faces constructed from only left- or right-side composites. The participant population included 38 (19 male, 19 female) African-American, Caucasian, and Asian adults. They were presented with chimeric composites generated from faces of eight actors and eight actresses showing four emotions: happiness, sadness, anger, and fear, each in posed and evoked conditions. We replicated the finding that emotions are expressed more intensely in the left hemiface for all emotions and conditions, with the exception of evoked anger, which was expressed more intensely in the right hemiface. In contrast, the results indicated that emotional expressions are recognized more efficiently in the right hemiface, indicating that the right hemiface expresses emotions more accurately. The double dissociation between the laterality of expression intensity and that of recognition efficiency supports the notion that the two kinds of processes may have distinct neural substrates. Evoked anger is uniquely expressed more intensely and accurately on the side of the face that projects to the viewer's right hemisphere, dominant in emotion recognition.

  18. Down Syndrome and Automatic Processing of Familiar and Unfamiliar Emotional Faces

    Science.gov (United States)

    Morales, Guadalupe E.; Lopez, Ernesto O.

    2010-01-01

    Participants with Down syndrome (DS) were required to participate in a face recognition experiment to recognize familiar (DS faces) and unfamiliar emotional faces (non DS faces), by using an affective priming paradigm. Pairs of emotional facial stimuli were presented (one face after another) with a short Stimulus Onset Asynchrony of 300…

  19. Emotion perception accuracy and bias in face-to-face versus cyberbullying.

    Science.gov (United States)

    Ciucci, Enrica; Baroncelli, Andrea; Nowicki, Stephen

    2014-01-01

    The authors investigated the association of traditional and cyber forms of bullying and victimization with emotion perception accuracy and emotion perception bias. Four basic emotions were considered (i.e., happiness, sadness, anger, and fear); 526 middle school students (280 females; M age = 12.58 years, SD = 1.16 years) were recruited, and emotionality was controlled. Results indicated no significant findings for girls. Boys with higher levels of traditional bullying did not show any deficit in perception accuracy of emotions, but they were prone to identify happiness and fear in faces when a different emotion was expressed; in addition, male cyberbullying was related to greater accuracy in recognizing fear. In terms of the victims, cyber victims had a global problem in recognizing emotions and a specific problem in processing anger and fear. It was concluded that emotion perception accuracy and bias were associated with bullying and victimization for boys not only in traditional settings but also in the electronic ones. Implications of these findings for possible intervention are discussed.

  20. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    Science.gov (United States)

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top+happy bottom), incongruent composite configurations (e.g., angry top+happy bottom), and isolated features (e.g., happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part but ignored information in the other non-target part. Results indicate that the recognition of happy and anger expressions is neither strictly holistic nor analytic. Both routes were involved, but with a different role for analytic and holistic information depending on the emotion type, and different weights of local features between happy and anger expressions. Dissociable neural pathways were engaged depending on emotional face configurations. In particular, regions within the face processing network differed in their sensitivity to holistic expression information, which predominantly activated fusiform and inferior occipital areas and the amygdala when internal features were congruent (i.e., template matching), whereas more local analysis of independent features preferentially engaged STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and pulvinar when seen in isolated parts. Collectively, these findings suggest that facial emotion recognition recruits separate, but interactive dorsal and ventral routes within the face processing networks, whose engagement may be shaped by…

  1. Emotional signals from faces, bodies and scenes influence observers' face expressions, fixations and pupil-size

    NARCIS (Netherlands)

    Kret, M.E.; Roelofs, K.; Stekelenburg, J.J.; de Gelder, B.

    2013-01-01

    We receive emotional signals from different sources, including the face, the whole body, and the natural scene. Previous research has shown the importance of context provided by the whole body and the scene on the recognition of facial expressions. This study measured physiological responses to…

  2. Emotional expressions of old faces are perceived as more positive and less negative than young faces in young adults

    OpenAIRE

    2015-01-01

    Interpreting the emotions of others through their facial expressions can provide important social information, yet the way in which we judge an emotion is subject to psychosocial factors. We hypothesized that the age of a face would bias how the emotional expressions are judged, with older faces generally more likely to be viewed as having more positive and less negative expressions than younger faces. Using two-alternative forced-choice perceptual decision tasks, participants sorted young an...

  4. Grounding context in face processing: color, emotion, and gender.

    Science.gov (United States)

    Gil, Sandrine; Le Bigot, Ludovic

    2015-01-01

    In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (vs. green, mixed red/green, and achromatic) background - known to be valenced - on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task of facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was completed by collecting subjective ratings for each colored background in the form of five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers, the effect was modulated by both the nature of the ambiguous emotion and the decoder's gender. Overall, our findings offer evidence that color and gender have a common valence-based dimension.

  5. Grounding Context in Face Processing: Color, Emotion and Gender

    Directory of Open Access Journals (Sweden)

    Sandrine Gil

    2015-03-01

    Full Text Available In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (versus green, mixed red/green, and achromatic) background - known to be valenced - on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task of facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was completed by collecting subjective ratings for each colored background in the form of five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers, the effect was modulated by both the nature of the ambiguous emotion and the decoder's gender. Overall, our findings offer evidence that color and gender have a common valence-based dimension.

  6. Social and emotional relevance in face processing: Happy faces of future interaction partners enhance the LPP

    Directory of Open Access Journals (Sweden)

    Florian Bublatzky

    2014-07-01

    Full Text Available Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERPs) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. Social relevance was manipulated by presenting pictures of two specific face actors as future interaction partners (meet condition), whereas two other face actors remained non-relevant. As a further control condition, all stimuli were presented without specific task instructions (passive viewing condition). A within-subject design (Facial Expression x Relevance x Task) was implemented, where randomly ordered face stimuli of four actors (2 women) from the KDEF were presented for 1 s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of instructed social relevance. Whereas the meet condition was accompanied by unspecific effects regardless of relevance (P1, EPN), viewing potential interaction partners was associated with increased LPP amplitudes. The LPP was specifically enhanced for happy facial expressions of the future interaction partners. This underscores that social relevance can impact face processing already at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories.

  7. Reading emotions from faces in two indigenous societies.

    Science.gov (United States)

    Crivelli, Carlos; Jarillo, Sergio; Russell, James A; Fernández-Dols, José-Miguel

    2016-07-01

    That all humans recognize certain specific emotions from their facial expression - the Universality Thesis - is a pillar of research, theory, and application in the psychology of emotion. Its most rigorous test occurs in indigenous societies with limited contact with external cultural influences, but such tests are scarce. Here we report 2 such tests. Study 1 was of children and adolescents (N = 68; aged 6-16 years) of the Trobriand Islands (Papua New Guinea, South Pacific) with a Western control group from Spain (N = 113, of similar ages). Study 2 was of children and adolescents (N = 36; same age range) of Matemo Island (Mozambique, Africa). In both studies, participants were shown an array of prototypical facial expressions and asked to point to the person feeling a specific emotion: happiness, fear, anger, disgust, or sadness. The Spanish control group matched faces to emotions as predicted by the Universality Thesis: matching was seen on 83% to 100% of trials. For the indigenous societies, in both studies, the Universality Thesis was moderately supported for happiness: smiles were matched to happiness on 58% and 56% of trials, respectively. For other emotions, however, results were even more modest: 7% to 46% in the Trobriand Islands and 22% to 53% in Matemo Island. These results were robust across age, gender, static versus dynamic display of the facial expressions, and between- versus within-subjects design.

  8. Emotional Expressions of Old Faces Are Perceived as More Positive and Less Negative than Young Faces in Young Adults

    Directory of Open Access Journals (Sweden)

    Norah C Hass

    2015-08-01

    Full Text Available Interpreting the emotions of others through their facial expressions can provide important social information, yet the way in which we judge an emotion is subject to psychosocial factors. We hypothesized that the age of a face would bias how the emotional expressions are judged, with older faces generally more likely to be viewed as having more positive and less negative expressions than younger faces. Using two-alternative forced-choice perceptual decision tasks, participants sorted young and old faces whose emotional expressions were gradually morphed into one of two categories - neutral vs. happy and neutral vs. angry. The results indicated that old faces were more frequently perceived as having a happy expression at the lower emotional intensity levels, and less frequently perceived as having an angry expression at the higher emotional intensity levels than younger faces in young adults. Critically, the perceptual decision threshold at which old faces were judged as happy was lower than for young faces, and higher for angry old faces compared to young faces. These findings suggest that the age of the face influences how its emotional expression is interpreted in social interactions.

  9. Emotional expressions of old faces are perceived as more positive and less negative than young faces in young adults.

    Science.gov (United States)

    Hass, Norah C; Schneider, Erik J S; Lim, Seung-Lark

    2015-01-01

    Interpreting the emotions of others through their facial expressions can provide important social information, yet the way in which we judge an emotion is subject to psychosocial factors. We hypothesized that the age of a face would bias how the emotional expressions are judged, with older faces generally more likely to be viewed as having more positive and less negative expressions than younger faces. Using two-alternative forced-choice perceptual decision tasks, participants sorted young and old faces whose emotional expressions were gradually morphed into one of two categories-"neutral vs. happy" and "neutral vs. angry." The results indicated that old faces were more frequently perceived as having a happy expression at the lower emotional intensity levels, and less frequently perceived as having an angry expression at the higher emotional intensity levels than younger faces in young adults. Critically, the perceptual decision threshold at which old faces were judged as happy was lower than for young faces, and higher for angry old faces compared to young faces. These findings suggest that the age of the face influences how its emotional expression is interpreted in social interactions.

  10. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    Science.gov (United States)

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks - such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed in a promotion focus compared to a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy in a promotion focus (i.e., striving to make hits). A prevention focus had an impact neither on perceptual processing nor on facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  11. Passive and motivated perception of emotional faces: qualitative and quantitative changes in the face processing network.

    Directory of Open Access Journals (Sweden)

    Laurie R Skelly

    Full Text Available Emotionally expressive faces are processed by a distributed network of interacting sub-cortical and cortical brain regions. The components of this network have been identified and described in large part by the stimulus properties to which they are sensitive, but as face processing research matures interest has broadened to also probe dynamic interactions between these regions and top-down influences such as task demand and context. While some research has tested the robustness of affective face processing by restricting available attentional resources, it is not known whether face network processing can be augmented by increased motivation to attend to affective face stimuli. Short videos of people expressing emotions were presented to healthy participants during functional magnetic resonance imaging. Motivation to attend to the videos was manipulated by providing an incentive for improved recall performance. During the motivated condition, there was greater coherence among nodes of the face processing network, more widespread correlation between signal intensity and performance, and selective signal increases in a task-relevant subset of face processing regions, including the posterior superior temporal sulcus and right amygdala. In addition, an unexpected task-related laterality effect was seen in the amygdala. These findings provide strong evidence that motivation augments co-activity among nodes of the face processing network and the impact of neural activity on performance. These within-subject effects highlight the necessity to consider motivation when interpreting neural function in special populations, and to further explore the effect of task demands on face processing in healthy brains.

  12. Mutual influences of pain and emotional face processing

    Directory of Open Access Journals (Sweden)

    Matthias J Wieser

    2014-10-01

    Full Text Available The perception of unpleasant stimuli enhances whereas the perception of pleasant stimuli decreases pain perception. In contrast, the effects of pain on the processing of emotional stimuli are much less known. Especially given the recent interest in facial expressions of pain as a special category of emotional stimuli, a main topic in this research line is the mutual influence of pain and facial expression processing. Therefore, in this mini-review we selectively summarize research on the effects of emotional stimuli on pain, but more extensively turn to the opposite direction namely how pain influences concurrent processing of affective stimuli such as facial expressions. Based on the motivational priming theory one may hypothesize that the perception of pain enhances the processing of unpleasant stimuli and decreases the processing of pleasant stimuli. This review reveals that the literature is only partly consistent with this assumption: Pain reduces the processing of pleasant pictures and happy facial expressions, but does not - or only partly - affect processing of unpleasant pictures. However, it was demonstrated that pain selectively enhances the processing of facial expressions if these are pain-related (i.e., facial expressions of pain). Extending a mere affective modulation theory, the latter results suggest pain-specific effects which may be explained by the perception-action model of empathy. Together, these results underscore the important mutual influence of pain and emotional face processing.

  13. Mutual influences of pain and emotional face processing.

    Science.gov (United States)

    Wieser, Matthias J; Gerdes, Antje B M; Reicherts, Philipp; Pauli, Paul

    2014-01-01

    The perception of unpleasant stimuli enhances whereas the perception of pleasant stimuli decreases pain perception. In contrast, the effects of pain on the processing of emotional stimuli are much less known. Especially given the recent interest in facial expressions of pain as a special category of emotional stimuli, a main topic in this research line is the mutual influence of pain and facial expression processing. Therefore, in this mini-review we selectively summarize research on the effects of emotional stimuli on pain, but more extensively turn to the opposite direction namely how pain influences concurrent processing of affective stimuli such as facial expressions. Based on the motivational priming theory one may hypothesize that the perception of pain enhances the processing of unpleasant stimuli and decreases the processing of pleasant stimuli. This review reveals that the literature is only partly consistent with this assumption: pain reduces the processing of pleasant pictures and happy facial expressions, but does not - or only partly - affect processing of unpleasant pictures. However, it was demonstrated that pain selectively enhances the processing of facial expressions if these are pain-related (i.e., facial expressions of pain). Extending a mere affective modulation theory, the latter results suggest pain-specific effects which may be explained by the perception-action model of empathy. Together, these results underscore the important mutual influence of pain and emotional face processing.

  14. Emotion identification method using RGB information of human face

    Science.gov (United States)

    Kita, Shinya; Mita, Akira

    2015-03-01

    Recently, the number of single households has increased drastically due to the growth of the aging society and the diversification of lifestyles. Therefore, the evolution of building spaces is in demand. The Biofied Building we propose can help to address this situation. It facilitates interaction between the building and residents' conscious and unconscious information using robots. The unconscious information includes emotion, condition, and behavior. One important piece of information is thermal comfort, which we assume can be estimated from the human face. There is much research on face color analysis, but few studies have been conducted in real situations. In other words, the existing methods were not tested under disturbances such as room lamps. In this study, Kinect was used with face-tracking. Room lamps and task lamps were used to verify that our method is applicable to real situations. In this research, two rooms at 22 and 28 degrees C were prepared. We showed that the transition of thermal comfort caused by changing temperature can be observed from the human face. Thus, distinguishing between the 22 and 28 degrees C conditions from face color was proved to be possible.
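The classification step the abstract describes (telling the 22 °C condition from the 28 °C condition by face color) can be illustrated with a minimal sketch. Everything below is hypothetical: the pixel values, the red/green ratio feature, and the threshold are invented for illustration and are not taken from the study.

```python
# Hypothetical sketch: classifying a thermal condition from mean face color.
# The feature (red/green ratio) and threshold are illustrative placeholders,
# not the method of the cited study.

def mean_rgb(pixels):
    """Average (R, G, B) over a list of face-region pixel tuples."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

def classify_condition(pixels, redness_threshold=1.05):
    """Guess 'warm' vs 'cool' room from the red/green ratio of facial skin.
    A real system would calibrate the threshold per subject and per
    lighting condition (e.g., room lamps vs task lamps)."""
    r, g, b = mean_rgb(pixels)
    return "warm" if r / g > redness_threshold else "cool"

warm_face = [(180, 140, 120), (185, 142, 118)]   # flushed-skin samples
cool_face = [(150, 150, 130), (148, 152, 128)]   # paler-skin samples
print(classify_condition(warm_face))  # warm
print(classify_condition(cool_face))  # cool
```

In practice the face region would first be located with a tracker (the abstract mentions Kinect face-tracking) before averaging skin pixels.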

  15. Young and Older Emotional Faces: Are there Age-Group Differences in Expression Identification and Memory?

    OpenAIRE

    2009-01-01

    Studies finding that older compared to young adults are less able to identify facial expressions and have worse memory for negative than positive faces have used only young faces. Studies finding that both age groups are more accurate at recognizing faces of their own than other ages have used mostly neutral faces. Thus, age-differences in processing faces may not extend to older faces, and preferential memory for own-age faces may not extend to emotional faces. To investigate these possibili...

  16. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    Directory of Open Access Journals (Sweden)

    Peter E Clayson

    Full Text Available The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm, where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective valence of the eyes and mouth was either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.
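The two behavioral measures this abstract reports, the congruency effect and conflict adaptation (the Gratton effect), have a standard arithmetic definition that a short sketch makes concrete. The trial list and RT values below are invented for illustration; only the formulas reflect the standard definitions.

```python
# Illustrative computation of a congruency effect and a conflict-adaptation
# (Gratton) effect from a trial sequence. Data are invented.

def congruency_effect(trials):
    """Mean RT on incongruent trials minus mean RT on congruent trials."""
    inc = [t["rt"] for t in trials if t["cond"] == "I"]
    con = [t["rt"] for t in trials if t["cond"] == "C"]
    return sum(inc) / len(inc) - sum(con) / len(con)

def conflict_adaptation(trials):
    """Congruency effect after congruent trials minus the congruency effect
    after incongruent trials; positive values indicate adaptation."""
    after = {"C": [], "I": []}
    for prev, cur in zip(trials, trials[1:]):
        after[prev["cond"]].append(cur)
    return congruency_effect(after["C"]) - congruency_effect(after["I"])

trials = [{"cond": c, "rt": rt} for c, rt in
          [("C", 480), ("C", 500), ("I", 600), ("I", 560),
           ("C", 470), ("I", 590), ("I", 540), ("C", 490)]]
print(congruency_effect(trials))   # 87.5 ms
print(conflict_adaptation(trials)) # 25.0 ms
```

The abstract's finding that adaptation appeared for error rates but only partially for RTs would correspond to running this kind of computation separately on accuracy and RT data.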

  17. Face-Memory and Emotion: Associations with Major Depression in Children and Adolescents

    Science.gov (United States)

    Pine, Daniel S.; Lissek, Shmuel; Klein, Rachel G.; Mannuzza, Salvatore; Moulton, John L., III; Guardino, Mary; Woldehawariat, Girma

    2004-01-01

    Background: Studies in adults with major depressive disorder (MDD) document abnormalities in both memory and face-emotion processing. The current study used a novel face-memory task to test the hypothesis that adolescent MDD is associated with a deficit in memory for face-emotions. The study also examines the relationship between parental MDD and…

  18. Faces and emotions: brain electric field sources during covert emotional processing.

    Science.gov (United States)

    Pizzagalli, D; Koenig, T; Regard, M; Lehmann, D

    1998-04-01

    Covert brain activity related to task-free, spontaneous (i.e. unrequested), emotional evaluation of human face images was analysed in 27-channel averaged event-related potential (ERP) map series recorded from 18 healthy subjects while observing random sequences of face images without further instructions. After recording, subjects self-rated each face image on a scale from "liked" to "disliked". These ratings were used to dichotomize the face images into the affective evaluation categories of "liked" and "disliked" for each subject and the subjects into the affective attitudes of "philanthropists" and "misanthropists" (depending on their mean rating across images). Event-related map series were averaged for "liked" and "disliked" face images and for "philanthropists" and "misanthropists". The spatial configuration (landscape) of the electric field maps was assessed numerically by the electric gravity center, a conservative estimate of the mean location of all intracerebral, active, electric sources. Differences in electric gravity center location indicate activity of different neuronal populations. The electric gravity center locations of all event-related maps were averaged over the entire stimulus-on time (450 ms). The mean electric gravity center for disliked faces was located (significant across subjects) more to the right and somewhat more posterior than for liked faces. Similar differences were found between the mean electric gravity centers of misanthropists (more right and posterior) and philanthropists. Our neurophysiological findings are in line with neuropsychological findings, revealing visual emotional processing to depend on affective evaluation category and affective attitude, and extending the conclusions to a paradigm without directed task.
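The "electric gravity center" described above is, in essence, a centroid of electrode positions weighted by the recorded potentials. The sketch below uses squared-potential weighting on an invented four-electrode montage; it is a generic reconstruction of the idea, not the exact formula of the cited study.

```python
# Illustrative "electric gravity center": electrode positions weighted by
# squared potential. Montage coordinates and values are invented.

def gravity_center(electrodes):
    """electrodes: list of ((x, y, z), potential_in_uV) tuples.
    Returns the potential-weighted mean location."""
    weights = [v * v for _, v in electrodes]
    total = sum(weights)
    return tuple(
        sum(w * pos[axis] for (pos, _), w in zip(electrodes, weights)) / total
        for axis in range(3)
    )

# A strong right-posterior source pulls the center rightward and backward,
# as the abstract reports for disliked faces.
montage = [((-1.0,  1.0, 0.0), 2.0),   # left anterior
           (( 1.0,  1.0, 0.0), 2.0),   # right anterior
           ((-1.0, -1.0, 0.0), 2.0),   # left posterior
           (( 1.0, -1.0, 0.0), 4.0)]   # right posterior
cx, cy, cz = gravity_center(montage)
print(round(cx, 2), round(cy, 2))  # 0.43 -0.43
```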

  19. Faces and bodies: perception and mimicry of emotionally congruent and incongruent facial and bodily expressions

    OpenAIRE

    2013-01-01

    Traditional emotion theories stress the importance of the face in the expression of emotions but bodily expressions are becoming increasingly important. Here we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emo...

  20. Facial emotion recognition in myotonic dystrophy type 1 correlates with CTG repeat expansion

    Directory of Open Access Journals (Sweden)

    Stefan Winblad

    2009-04-01

    Full Text Available We investigated the ability of patients with myotonic dystrophy type 1 to recognise basic facial emotions. We also explored the relationship between facial emotion recognition, neuropsychological data, personality, and CTG repeat expansion data in the DM-1 group. In total, 50 patients with DM-1 (28 women and 22 men) participated, along with 41 healthy controls. Recognition of facial emotional expressions was assessed using photographs of basic emotions. A set of tests measured cognition and personality dimensions, and CTG repeat size was quantified in blood lymphocytes. Patients with DM-1 showed impaired recognition of facial emotions compared with controls. A significant negative correlation was found between total score of emotion recognition in a forced choice task and CTG repeat size. Furthermore, specific cognitive functions (vocabulary, visuospatial construction ability, and speed) and personality dimensions (reward dependence and cooperativeness) correlated with scores on the forced choice emotion recognition task. These findings revealed a CTG repeat dependent facial emotion recognition deficit in the DM-1 group, which was associated with specific neuropsychological functions. Furthermore, a correlation was found between facial emotional recognition ability and personality dimensions associated with sociability. This adds a new clinically relevant dimension in the cognitive deficits associated with DM-1.

  1. Adult age-differences in subjective impression of emotional faces are reflected in emotion-related attention and memory tasks

    Directory of Open Access Journals (Sweden)

    Joakim Svard

    2014-05-01

    Full Text Available Although younger and older adults appear to attend to and remember emotional faces differently, less is known about age-related differences in the subjective emotional impression (arousal, potency, and valence) of emotional faces and how these differences, in turn, are reflected in age differences in various emotional tasks. In the current study, we used the same facial emotional stimuli (angry and happy faces) in four tasks: emotional rating, attention, categorical perception, and visual short-term memory (VSTM). The aim of this study was to investigate effects of age on the subjective emotional impression of angry and happy faces and to examine whether any age differences were mirrored in measures of emotional behavior (attention, categorical perception, and memory). In addition, regression analyses were used to further study impression-behavior associations. Forty younger adults (range 20-30 years) and thirty-nine older adults (range 65-75 years) participated in the experiment. The emotional rating task showed that older adults perceived less arousal, potency, and valence than younger adults and that the difference was more pronounced for angry than happy faces. Similarly, the results of the attention and memory tasks demonstrated interaction effects between emotion and age, and age differences on these measures were larger for angry than for happy faces. Regression analyses confirmed that in both age groups, higher potency ratings predicted both visual search and visual short-term memory efficiency. Future studies should consider the possibility that age differences in the subjective emotional impression of facial emotional stimuli may explain age differences in attention to and memory of such stimuli.

  2. Emotion Perception or Social Cognitive Complexity: What Drives Face Processing Deficits in Autism Spectrum Disorder?

    Science.gov (United States)

    Walsh, Jennifer A; Creighton, Sarah E; Rutherford, M D

    2016-02-01

    Some, but not all, relevant studies have revealed face processing deficits among those with autism spectrum disorder (ASD). In particular, deficits are revealed in face processing tasks that involve emotion perception. The current study examined whether either deficits in processing emotional expression or deficits in processing social cognitive complexity drive face processing deficits in ASD. We tested adults with and without ASD on a battery of face processing tasks that varied with respect to emotional expression processing and social cognitive complexity. Results revealed significant group differences on tasks involving emotional expression processing, but typical performance on a non-emotional but socially complex task. These results support an emotion processing rather than a social complexity explanation for face processing deficits in ASD.

  3. Emotional scenes elicit more pronounced self-reported emotional experience and greater EPN and LPP modulation when compared to emotional faces.

    Science.gov (United States)

    Thom, Nathaniel; Knight, Justin; Dishman, Rod; Sabatinelli, Dean; Johnson, Douglas C; Clementz, Brett

    2014-06-01

    Emotional faces and scenes carry a wealth of overlapping and distinct perceptual information. Despite widespread use in the investigation of emotional perception, expressive face and evocative scene stimuli are rarely assessed in the same experiment. Here, we evaluated self-reports of arousal and pleasantness, as well as early and late event-related potentials (e.g., N170, early posterior negativity [EPN], late positive potential [LPP]) as subjects viewed neutral and emotional faces and scenes, including contents representing anger, fear, and joy. Results demonstrate that emotional scenes were rated as more evocative than emotional faces, as only scenes produced elevated self-reports of arousal. In addition, viewing scenes resulted in more extreme ratings of pleasantness (and unpleasantness) than did faces. EEG results indicate that both expressive faces and emotional scenes evoke enhanced negativity in the N170 component, while the EPN and LPP components show significantly enhanced modulation only by scene, relative to face stimuli. These data suggest that viewing emotional scenes results in a more pronounced emotional experience that is associated with reliable modulation of visual event-related potentials that are implicated in emotional circuits in the brain.

  4. Face recognition in emotional scenes: observers remember the eye shape but forget the nose.

    Science.gov (United States)

    Ryan, Kaitlin F; Schwartz, Noah Z

    2013-01-01

    Face recognition is believed to be a highly specialized process that allows individuals to recognize faces faster and more accurately than ordinary objects. However, when faces are viewed in highly emotional contexts, the process becomes slower and less accurate. This suggests a change in recognition strategy compared to recognition in non-arousing contexts. Here we explore this finding by using a novel paradigm to determine which face dimensions are most important for recognizing faces that were initially encoded in highly emotional contexts. Participants were asked to recognize faces from a 3-alternative display after viewing a similar face that was embedded in either a neutral, positive, or negative emotional scene. Results showed that individuals rely on eye shape when recognizing faces that were encoded while embedded in either positive or negative emotional contexts, and ignore nose shape when recognizing faces that were encoded while embedded in negative emotional scenes. The findings suggest that, after encoding that face during heightened emotional arousal, individuals are more likely to commit errors when identifying a face on the basis of nose shape, and less likely to commit errors when identifying a face on the basis of eye shape.

  5. An ERP Study of Emotional Face Processing in the Adult and Infant Brain

    Science.gov (United States)

    Leppanen, Jukka M.; Moulson, Margaret C.; Vogel-Farley, Vanessa K.; Nelson, Charles A.

    2007-01-01

    To examine the ontogeny of emotional face processing, event-related potentials (ERPs) were recorded from adults and 7-month-old infants while viewing pictures of fearful, happy, and neutral faces. Face-sensitive ERPs at occipital-temporal scalp regions differentiated between fearful and neutral/happy faces in both adults (N170 was larger for fear)…

  6. Repeated Witnessing of Conspecifics in Pain : Effects on Emotional Contagion

    NARCIS (Netherlands)

    Carrillo, Maria; Migliorati, Filippo; Bruls, Rune; Han, Yingying; Heinemans, Mirjam; Pruis, Ilanah; Gazzola, V.; Keysers, C.

    2015-01-01

    Witnessing of conspecifics in pain has been shown to elicit socially triggered freezing in rodents. It is unknown how robust this response is to repeated exposure to a cage-mate experiencing painful stimulation. To address this question, shock-experienced Observer rats repeatedly witnessed familiar

  7. Neural Activation to Emotional Faces in Adolescents with Autism Spectrum Disorders

    Science.gov (United States)

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R.; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S.

    2011-01-01

    Background: Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and…

  8. The NIMH Child Emotional Faces Picture Set (NIMH-ChEFS): a new set of children's facial emotion stimuli.

    Science.gov (United States)

    Egger, Helen Link; Pine, Daniel S; Nelson, Eric; Leibenluft, Ellen; Ernst, Monique; Towbin, Kenneth E; Angold, Adrian

    2011-09-01

    With the emergence of new technologies, there has been an explosion of basic and clinical research on the affective and cognitive neuroscience of face processing and emotion perception. Adult emotional face stimuli are commonly used in these studies. For developmental research, there is a need for a validated set of child emotional faces. This paper describes the development of the National Institute of Mental Health Child Emotional Faces Picture Set (NIMH-ChEFS), a relatively large stimulus set with high quality, color images of the emotional faces of children. The set includes 482 photographs of fearful, angry, happy, sad and neutral child faces with two gaze conditions: direct and averted gaze. In this paper we describe the development of the NIMH-ChEFS and data on the set's validity based on ratings by 20 healthy adult raters. Agreement between the a priori emotion designation and the raters' labels was high and comparable with values reported for commonly used adult picture sets. Intensity, representativeness, and composite "goodness" ratings are also presented to guide researchers in their choice of specific stimuli for their studies. These data should give researchers confidence in the NIMH-ChEFS's validity for use in affective and social neuroscience research.
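The validity check described above (agreement between the a priori emotion designation and raters' labels) reduces to a simple proportion. The sketch below shows that computation on invented data; the labels and the 80% figure are illustrative and not the NIMH-ChEFS results.

```python
# Hedged sketch: percent agreement between a priori emotion labels and the
# modal rater label per stimulus. All data are invented.

def percent_agreement(intended, rated):
    """Share of stimuli whose modal rater label matches the intended emotion."""
    hits = sum(1 for i, r in zip(intended, rated) if i == r)
    return 100.0 * hits / len(intended)

intended     = ["fearful", "angry", "happy", "sad",     "neutral"]
modal_rating = ["fearful", "angry", "happy", "neutral", "neutral"]
print(percent_agreement(intended, modal_rating))  # 80.0
```

Validation studies of picture sets typically report such agreement alongside intensity and representativeness ratings, as this abstract does.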

  9. Body expressions influence recognition of emotions in the face and voice.

    Science.gov (United States)

    Van den Stock, Jan; Righart, Ruthger; de Gelder, Beatrice

    2007-08-01

    The most familiar emotional signals consist of faces, voices, and whole-body expressions, but so far research on emotions expressed by the whole body is sparse. The authors investigated recognition of whole-body expressions of emotion in three experiments. In the first experiment, participants performed a body expression-matching task. Results indicate good recognition of all emotions, with fear being the hardest to recognize. In the second experiment, two alternative forced choice categorizations of the facial expression of a compound face-body stimulus were strongly influenced by the bodily expression. This effect was a function of the ambiguity of the facial expression. In the third experiment, recognition of emotional tone of voice was similarly influenced by task irrelevant emotional body expressions. Taken together, the findings illustrate the importance of emotional whole-body expressions in communication either when viewed on their own or, as is often the case in realistic circumstances, in combination with facial expressions and emotional voices.

  10. Faces and bodies: perception and mimicry of emotionally congruent and incongruent facial and bodily expressions

    Directory of Open Access Journals (Sweden)

    Mariska Kret

    2013-02-01

    Full Text Available Traditional emotion theories stress the importance of the face in the expression of emotions but bodily expressions are becoming increasingly important. Here we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emotionally congruent and incongruent face-body compounds. Participants’ fixations were measured and their pupil size recorded with eye-tracking equipment, and their facial reactions measured with electromyography (EMG). The behavioral results support our prediction that the recognition of a facial expression is improved in the context of a matching posture and, importantly, also vice versa. From their facial expressions, it appeared that observers reacted with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus activity) to happy facial expressions. As we predicted and found, angry and fearful cues from the face or the body attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion as facial expressions and that congruency between the emotional signals from the face and body ameliorates recognition of the emotion.

  11. Testing the effects of expression, intensity and age on emotional face processing in ASD.

    Science.gov (United States)

    Luyster, Rhiannon J; Bick, Johanna; Westerlund, Alissa; Nelson, Charles A

    2017-06-21

    Individuals with autism spectrum disorder (ASD) commonly show global deficits in the processing of facial emotion, including impairments in emotion recognition and slowed processing of emotional faces. Growing evidence has suggested that these challenges may increase with age, perhaps due to minimal improvement with age in individuals with ASD. In the present study, we explored the role of age, emotion type and emotion intensity in face processing for individuals with and without ASD. Twelve-year-olds and 18-22-year-olds with and without ASD participated. No significant diagnostic group differences were observed on behavioral measures of emotion processing for younger versus older individuals with and without ASD. However, there were significant group differences in neural responses to emotional faces. Relative to TD, at 12 years of age and during adulthood, individuals with ASD showed slower N170 to emotional faces. While the TD groups' P1 latency was significantly shorter in adults when compared to 12 year olds, there was no significant age-related difference in P1 latency among individuals with ASD. Findings point to potential differences in the maturation of cortical networks that support visual processing (whether of faces or stimuli more broadly) among individuals with and without ASD between late childhood and adulthood. Finally, associations between ERP amplitudes and behavioral responses on emotion processing tasks suggest possible neural markers for emotional and behavioral deficits among individuals with ASD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. The social face of emotion recognition: Evaluations versus stereotypes

    NARCIS (Netherlands)

    Bijlstra, G.; Holland, R.W.; Wigboldus, D.H.J.

    2010-01-01

    The goal of the present paper was to demonstrate the influence of general evaluations and stereotype associations on emotion recognition. Earlier research has shown that evaluative connotations between social category members and emotional expression predict whether recognition of positive or

  13. Neural responses to emotional faces in women recovered from anorexia nervosa.

    Science.gov (United States)

    Cowdrey, Felicity A; Harmer, Catherine J; Park, Rebecca J; McCabe, Ciara

    2012-03-31

    Impairments in emotional processing have been associated with anorexia nervosa. However, it is unknown whether neural and behavioural differences in the processing of emotional stimuli persist following recovery. The aim of this study was to investigate the neural processing of emotional faces in individuals recovered from anorexia nervosa compared with healthy controls. Thirty-two participants (16 recovered anorexia nervosa, 16 healthy controls) underwent a functional magnetic resonance imaging (fMRI) scan. Participants viewed fearful and happy emotional faces and indicated the gender of the face presented. Whole brain analysis revealed no significant differences between the groups to the contrasts of fear versus happy and vice versa. Region of interest analysis demonstrated no significant differences in the neural response to happy or fearful stimuli between the groups in the amygdala or fusiform gyrus. These results suggest that processing of emotional faces may not be aberrant after recovery from anorexia nervosa. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  14. [Non-conscious perception of emotional faces affects the visual objects recognition].

    Science.gov (United States)

    Gerasimenko, N Iu; Slavutskaia, A V; Kalinin, S A; Mikhaĭlova, E S

    2013-01-01

    In 34 healthy subjects we analyzed accuracy and reaction time (RT) during the recognition of complex visual images: pictures of animals and non-living objects. The target stimuli were preceded by brief presentation of masking non-target ones, which represented drawings of emotional (angry, fearful, happy) or neutral faces. We revealed that, in contrast to accuracy, RT depended on the emotional expression of the preceding faces. RT was significantly shorter if the target objects were paired with angry and fearful faces as compared with happy and neutral ones. These effects depended on the category of the target stimulus and were more prominent for objects than for animals. Further, the effects of emotional faces were determined by emotional and communication personality traits (defined by Cattell's Questionnaire) and were more clearly defined in more sensitive, anxious and pessimistic introverts. The data are important for understanding the mechanisms by which non-conscious processing of emotional information determines human visual behavior.

  15. Emotional expressions preferentially elicit implicit evaluations of faces also varying in race or age.

    Science.gov (United States)

    Craig, Belinda M; Lipp, Ottmar V; Mallan, Kimberley M

    2014-10-01

    Both facial cues of group membership (race, age, and sex) and emotional expressions can elicit implicit evaluations to guide subsequent social behavior. There is, however, little research addressing whether group membership cues or emotional expressions are more influential in the formation of implicit evaluations of faces when both cues are simultaneously present. The current study aimed to determine this. Emotional expressions but not race or age cues elicited implicit evaluations in a series of affective priming tasks with emotional Caucasian and African faces (Experiments 1 and 2) and young and old faces (Experiment 3). Spontaneous evaluations of group membership cues of race and age only occurred when those cues were task relevant, suggesting the preferential influence of emotional expressions in the formation of implicit evaluations of others when cues of race or age are not salient. Implications for implicit prejudice, face perception, and person construal are discussed.

  16. No Differences in Emotion Recognition Strategies in Children with Autism Spectrum Disorder: Evidence from Hybrid Faces

    Directory of Open Access Journals (Sweden)

    Kris Evers

    2014-01-01

    Full Text Available Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  17. The face-specific N170 component is modulated by emotional facial expression

    Directory of Open Access Journals (Sweden)

    Nim Tottenham

    2007-01-01

    Full Text Available Abstract Background According to the traditional two-stage model of face processing, the face-specific N170 event-related potential (ERP) is linked to structural encoding of face stimuli, whereas later ERP components are thought to reflect processing of facial affect. This view has recently been challenged by reports of N170 modulations by emotional facial expression. This study examines the time-course and topography of the influence of emotional expression on the N170 response to faces. Methods Dense-array ERPs were recorded in response to a set (n = 16) of fear and neutral faces. Stimuli were normalized on dimensions of shape, size and luminance contrast distribution. To minimize task effects related to facial or emotional processing, facial stimuli were irrelevant to a primary task of learning associative pairings between a subsequently presented visual character and a spoken word. Results The N170 to faces showed a strong modulation by emotional facial expression. A split-half analysis demonstrates that this effect was significant both early and late in the experiment and was therefore not associated with only the initial exposures to these stimuli, demonstrating a form of robustness against habituation. The effect of emotional modulation of the N170 to faces did not show a significant interaction with the gender of the face stimulus, or with hemisphere of recording sites. Subtracting the fear versus neutral topography provided a topography that was itself highly similar to the face N170. Conclusion The face N170 response can be influenced by emotional expressions contained within facial stimuli. The topography of this effect is consistent with the notion that fear stimuli exaggerate the N170 response itself. This finding stands in contrast to previous models suggesting that N170 processes linked to structural analysis of faces precede analysis of emotional expression, and instead may reflect early top-down modulation from neural systems involved in

  18. Automatic Processing of Emotional Faces in High-Functioning Pervasive Developmental Disorders: An Affective Priming Study

    Science.gov (United States)

    Kamio, Yoko; Wolf, Julie; Fein, Deborah

    2006-01-01

    This study examined automatic processing of emotional faces in individuals with high-functioning Pervasive Developmental Disorders (HFPDD) using an affective priming paradigm. Sixteen participants (HFPDD and matched controls) were presented with happy faces, fearful faces or objects in both subliminal and supraliminal exposure conditions, followed…

  19. Conscious and Non-conscious Representations of Emotional Faces in Asperger's Syndrome.

    Science.gov (United States)

    Chien, Vincent S C; Tsai, Arthur C; Yang, Han Hsuan; Tseng, Yi-Li; Savostyanov, Alexander N; Liou, Michelle

    2016-07-31

    Several neuroimaging studies have suggested that the low spatial frequency content of an emotional face mainly activates the amygdala, pulvinar, and superior colliculus, especially with fearful faces (1-3). These regions constitute the limbic structure for non-conscious perception of emotions and modulate cortical activity either directly or indirectly (2). In contrast, the conscious representation of emotions is more pronounced in the anterior cingulate, prefrontal cortex, and somatosensory cortex for directing voluntary attention to details in faces (3,4). Asperger's syndrome (AS) (5,6) represents an atypical mental disturbance that affects sensory, affective and communicative abilities, without interfering with normal linguistic skills and intellectual ability. Several studies have found that functional deficits in the neural circuitry important for facial emotion recognition can partly explain social communication failure in patients with AS (7-9). In order to clarify the interplay between conscious and non-conscious representations of emotional faces in AS, an EEG experimental protocol is designed with two tasks involving emotionality evaluation of either photographic or line-drawing faces. A pilot study is introduced for selecting face stimuli that minimize the differences in reaction times and scores assigned to facial emotions between the pretested patients with AS and IQ/gender-matched healthy controls. Information from the pretested patients was used to develop the scoring system used for the emotionality evaluation. Research into facial emotions and visual stimuli with different spatial frequency contents has reached discrepant findings depending on the demographic characteristics of participants and task demands (2). The experimental protocol is intended to clarify deficits in patients with AS in processing emotional faces when compared with healthy controls, by controlling for factors unrelated to recognition of facial emotions, such as task difficulty, IQ and…

  20. Social and emotional relevance in face processing: happy faces of future interaction partners enhance the late positive potential.

    Science.gov (United States)

    Bublatzky, Florian; Gerdes, Antje B M; White, Andrew J; Riemer, Martin; Alpers, Georg W

    2014-01-01

    Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERP) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. To implement a social anticipation task, relevance was manipulated by presenting faces of two specific actors as future interaction partners (socially relevant), whereas two other face actors remained non-relevant. In a further control task all stimuli were presented without specific relevance instructions (passive viewing). Face stimuli of four actors (2 women, from the KDEF) were randomly presented for 1s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of experimental tasks. Whereas task effects were observed for P1 and EPN regardless of instructed relevance, LPP amplitudes were modulated by emotional facial expression and relevance manipulation. The LPP was specifically enhanced for happy facial expressions of the anticipated future interaction partners. This underscores that social relevance can impact face processing already at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories.

  1. Repeated Witnessing of Conspecifics in Pain: Effects on Emotional Contagion.

    Directory of Open Access Journals (Sweden)

    Maria Carrillo

    Full Text Available Witnessing of conspecifics in pain has been shown to elicit socially triggered freezing in rodents. It is unknown how robust this response is to repeated exposure to a cage-mate experiencing painful stimulation. To address this question, shock-experienced Observer rats repeatedly witnessed familiar Demonstrators receive painful footshocks (six sessions). The results confirm that Observers freeze during the first testing session. The occurrence of this behaviour, however, gradually diminished as the experimental sessions progressed, reaching minimal freezing levels by the end of the experiments. In contrast, the appearance and continuous increase in the frequency of yawning, a behavior that was inhibited by metyrapone (i.e., a glucocorticoid synthesis blocker), might represent an alternative coping strategy, suggesting that the Observers' reduced freezing does not necessarily indicate a disappearance of the affective response to the Demonstrators' distress.

  2. Social anhedonia is associated with neural abnormalities during face emotion processing.

    Science.gov (United States)

    Germine, Laura T; Garrido, Lucia; Bruce, Lori; Hooker, Christine

    2011-10-01

    Human beings are social organisms with an intrinsic desire to seek and participate in social interactions. Social anhedonia is a personality trait characterized by a reduced desire for social affiliation and reduced pleasure derived from interpersonal interactions. Abnormally high levels of social anhedonia prospectively predict the development of schizophrenia and contribute to poorer outcomes for schizophrenia patients. Despite the strong association between social anhedonia and schizophrenia, the neural mechanisms that underlie individual differences in social anhedonia have not been studied and are thus poorly understood. Deficits in face emotion recognition are related to poorer social outcomes in schizophrenia, and it has been suggested that face emotion recognition deficits may be a behavioral marker for schizophrenia liability. In the current study, we used functional magnetic resonance imaging (fMRI) to see whether there are differences in the brain networks underlying basic face emotion processing in a community sample of individuals low vs. high in social anhedonia. We isolated the neural mechanisms related to face emotion processing by comparing face emotion discrimination with four other baseline conditions (identity discrimination of emotional faces, identity discrimination of neutral faces, object discrimination, and pattern discrimination). Results showed a group (high/low social anhedonia) × condition (emotion discrimination/control condition) interaction in the anterior portion of the rostral medial prefrontal cortex, right superior temporal gyrus, and left somatosensory cortex. As predicted, high (relative to low) social anhedonia participants showed less neural activity in face emotion processing regions during emotion discrimination as compared to each control condition. The findings suggest that social anhedonia is associated with abnormalities in networks responsible for basic processes associated with social cognition, and provide a…

  3. Aging and the perception of emotion: processing vocal expressions alone and with faces.

    Science.gov (United States)

    Ryan, Melissa; Murray, Janice; Ruffman, Ted

    2010-01-01

    This study investigated whether the difficulties older adults experience when recognizing specific emotions from facial expressions also occur with vocal expressions of emotion presented in isolation or in combination with facial expressions. When matching vocal expressions of six emotions to emotion labels, older adults showed worse performance on sadness and anger. When matching vocal expressions to facial expressions, older adults showed worse performance on sadness, anger, happiness, and fear. Older adults' poorer performance when matching faces to voices was independent of declines in fluid ability. Results are interpreted with reference to the neuropsychology of emotion recognition and the aging brain.

  4. In search of the emotional face: anger versus happiness superiority in visual search.

    Science.gov (United States)

    Savage, Ruth A; Lipp, Ottmar V; Craig, Belinda M; Becker, Stefanie I; Horstmann, Gernot

    2013-08-01

    Previous research has provided inconsistent results regarding visual search for emotional faces, yielding evidence for either anger superiority (i.e., more efficient search for angry faces) or happiness superiority effects (i.e., more efficient search for happy faces), suggesting that these results reflect emotion-unrelated low-level perceptual features rather than emotional expression itself. The present study investigated possible factors mediating anger/happiness superiority effects; specifically, search strategy (fixed vs. variable target search; Experiment 1), stimulus choice (Nimstim database vs. Ekman & Friesen database; Experiments 1 and 2), and emotional intensity (Experiments 3 and 3a). Angry faces were found faster than happy faces regardless of search strategy using faces from the Nimstim database (Experiment 1). By contrast, a happiness superiority effect was evident in Experiment 2 when using faces from the Ekman and Friesen database. Experiment 3 employed angry, happy, and exuberant expressions (Nimstim database) and yielded anger and happiness superiority effects, respectively, highlighting the importance of the choice of stimulus materials. Ratings of the stimulus materials collected in Experiment 3a indicate that differences in perceived emotional intensity, pleasantness, or arousal do not account for the differences in search efficiency. Across three studies, the current investigation indicates that prior reports of anger or happiness superiority effects in visual search are likely to reflect low-level visual features associated with the stimulus materials used, rather than emotion.

  5. Older Adults' Trait Impressions of Faces Are Sensitive to Subtle Resemblance to Emotions.

    Science.gov (United States)

    Franklin, Robert G; Zebrowitz, Leslie A

    2013-09-01

    Younger adults (YA) attribute emotion-related traits to people whose neutral facial structure resembles an emotion (emotion overgeneralization). The fact that older adults (OA) show deficits in accurately labeling basic emotions suggests that they may be relatively insensitive to variations in the emotion resemblance of neutral expression faces that underlie emotion overgeneralization effects. On the other hand, the fact that OA, like YA, show a 'pop-out' effect for anger, more quickly locating an angry than a happy face in a neutral array, suggests that both age groups may be equally sensitive to emotion resemblance. We used computer modeling to assess the degree to which neutral faces objectively resembled emotions and assessed whether that resemblance predicted trait impressions. We found that both OA and YA showed anger and surprise overgeneralization in ratings of danger and naiveté, respectively, with no significant differences in the strength of the effects for the two age groups. These findings suggest that well-documented OA deficits on emotion recognition tasks may be more due to processing demands than to an insensitivity to the social affordances of emotion expressions.

  6. Effects of acute psychosocial stress on neural activity to emotional and neutral faces in a face recognition memory paradigm.

    Science.gov (United States)

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2014-12-01

    Previous studies have shown that acute psychosocial stress impairs declarative recognition memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala, which modulates memory processes in the hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoking male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner, which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants, but did not affect recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus, which was due to a stress-induced increase of neural activity to fearful faces and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with stress-induced privileged processing of emotional stimuli.

  7. A review of brain oscillations in perception of faces and emotional pictures.

    Science.gov (United States)

    Güntekin, Bahar; Başar, Erol

    2014-05-01

    The differentiation of faces, facial expressions and affective pictures involves processes of higher mental activity that have considerable applications in the psychology of moods and emotions. At present, the search for functional correlates of brain oscillations is an important trend in neuroscience. Furthermore, analyses of oscillatory responses provide key knowledge on the physiology of brain dynamics. Studies analysing oscillatory dynamics in face perception and emotional pictures have increased in recent years; however, the literature lacks a review of the current state of the art. This study provides a comprehensive review of the delta, theta, alpha, beta and gamma oscillatory responses on presentation of faces, facial expressions and affective pictures (International Affective Picture System, IAPS). The reviewed literature revealed that the brain is more sensitive to emotional stimuli than neutral stimuli. A common and reliable finding from all reviewed studies was the increased brain responsiveness towards negative emotional pictures (face expression or IAPS). Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Memory for faces and voices varies as a function of sex and expressed emotion.

    Science.gov (United States)

    S Cortes, Diana; Laukka, Petri; Lindahl, Christina; Fischer, Håkan

    2017-01-01

    We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items from female expressers, and an own-sex bias, whereby female participants displayed a memory advantage for female faces and face-voice combinations. Results further suggest that the own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, the results show that memory for faces and voices may be influenced by the expressions they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.
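
    The corrected accuracy measure described above (hits minus false alarms) is a standard recognition-memory score; the sketch below illustrates it with made-up counts. The function name and the example numbers are illustrative, not taken from the study.

    ```python
    def corrected_accuracy(hits, misses, false_alarms, correct_rejections):
        """Corrected recognition accuracy: hit rate minus false-alarm rate."""
        hit_rate = hits / (hits + misses)  # old items correctly called "old"
        fa_rate = false_alarms / (false_alarms + correct_rejections)  # new items wrongly called "old"
        return hit_rate - fa_rate

    # Hypothetical example: 40 hits out of 50 old items,
    # 10 false alarms out of 50 new items.
    print(round(corrected_accuracy(40, 10, 10, 40), 3))  # 0.6
    ```

    Subtracting the false-alarm rate guards against response bias: a participant who calls every item "old" scores 1.0 on hits but 0.0 on this corrected measure.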

  9. The Effect of Self-Referential Expectation on Emotional Face Processing.

    Directory of Open Access Journals (Sweden)

    Mel McKendrick

    Full Text Available The role of self-relevance has been somewhat neglected in static face processing paradigms but may be important in understanding how emotional faces impact on attention, cognition and affect. The aim of the current study was to investigate the effect of self-relevant primes on processing emotional composite faces. Sentence primes created an expectation of the emotion of the face before sad, happy, neutral or composite face photos were viewed. Eye movements were recorded and subsequent responses measured the cognitive and affective impact of the emotion expressed. Results indicated that primes did not guide attention, but impacted on judgments of valence intensity and self-esteem ratings. Negative self-relevant primes led to the most negative self-esteem ratings, although the effect of the prime was qualified by salient facial features. Self-relevant expectations about the emotion of a face and subsequent attention to a face that is congruent with these expectations strengthened the affective impact of viewing the face.

  10. Emotional contexts modulate intentional memory suppression of neutral faces: Insights from ERPs.

    Science.gov (United States)

    Pierguidi, Lapo; Righi, Stefania; Gronchi, Giorgio; Marzi, Tessa; Caharel, Stephanie; Giovannelli, Fabio; Viggiano, Maria Pia

    2016-08-01

    The main goal of the present work was to gain new insight into the temporal dynamics underlying voluntary memory control for neutral faces associated with neutral, positive and negative contexts. A directed forgetting (DF) procedure was used during EEG recording to answer the question of whether it is possible to forget a face that has been encoded within a particular emotional context. A face-scene phase, in which a neutral face was shown in a neutral or emotional scene (positive, negative), was followed by the voluntary memory cue (cue phase) indicating whether the face was to be remembered or to be forgotten (TBR or TBF). Memory for faces was then assessed with an old/new recognition task. Behaviorally, we found that it is harder to suppress faces-in-positive-scenes compared to faces-in-negative-scenes and faces-in-neutral-scenes. The temporal information obtained from the ERPs showed that: 1) during the face-scene phase, the Late Positive Potential (LPP), which indexes motivated emotional attention, was larger for faces-in-negative-scenes compared to faces-in-neutral-scenes; 2) remarkably, during the cue phase, ERPs were significantly modulated by the emotional contexts. Faces-in-neutral-scenes showed an ERP pattern typically associated with the DF effect, whereas faces-in-positive-scenes elicited the reverse ERP pattern. Faces-in-negative-scenes did not show differences in DF-related neural activities, but a larger N1 amplitude for TBF vs. TBR faces may index early attentional deployment. These results support the hypothesis that the pleasantness or unpleasantness of the contexts (through attentional broadening and narrowing mechanisms, respectively) may modulate the effectiveness of intentional memory suppression for neutral information. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Emotional expressions evoke a differential response in the fusiform face area

    Directory of Open Access Journals (Sweden)

    Bronson Blake Harry

    2013-10-01

    Full Text Available It is widely assumed that the fusiform face area (FFA), a brain region specialised for face perception, is not involved in processing emotional expressions. This assumption is based on the proposition that the FFA is involved in face identification and only processes features that are invariant across changes due to head movements, speaking and expressing emotions. The present study tested this proposition by examining whether the response in the human FFA varies across emotional expressions, using functional magnetic resonance imaging and brain-decoding analysis techniques (n = 11). A one-versus-all classification analysis showed that most emotional expressions that participants perceived could be reliably predicted from the neural pattern of activity in the left and the right FFA, suggesting that the perception of different emotional expressions recruits partially non-overlapping neural mechanisms. In addition, emotional expressions could also be decoded from the pattern of activity in the early visual cortex (EVC), indicating that retinotopic cortex also shows a differential response to emotional expressions. These results cast doubt on the idea that the FFA is involved in expression-invariant face processing, and instead indicate that emotional expressions evoke partially de-correlated signals throughout occipital and posterior temporal cortex.

  12. The development of emotion perception in face and voice during infancy.

    Science.gov (United States)

    Grossmann, Tobias

    2010-01-01

    Interacting with others by reading their emotional expressions is an essential social skill in humans. How this ability develops during infancy and what brain processes underpin infants' perception of emotion in different modalities are the questions dealt with in this paper. Literature review. The first part provides a systematic review of behavioral findings on infants' developing emotion-reading abilities. The second part presents a set of new electrophysiological studies that provide insights into the brain processes underlying infants' developing abilities. Throughout, evidence from unimodal (face or voice) and multimodal (face and voice) processing of emotion is considered. The implications of the reviewed findings for our understanding of developmental models of emotion processing are discussed. The reviewed infant data suggest that (a) early in development, emotion enhances the sensory processing of faces and voices, (b) infants' ability to allocate increased attentional resources to negative emotional information develops earlier in the vocal domain than in the facial domain, and (c) at least by the age of 7 months, infants reliably match and recognize emotional information across face and voice.

  13. Nonverbal interpersonal sensitivity and persistence of depression : Perception of emotions in schematic faces

    NARCIS (Netherlands)

    Bouhuys, AL; Geerts, E; Mersch, PPA; Jenner, JA

    1996-01-01

    Deficits in the decoding of facial emotional expressions may play a role in the persistence of depression. In a prospective longitudinal study, 33 depressed outpatients (30 major depression, 2 dysthymia, and 1 cyclothymic disorder) judged schematic faces with respect to the emotions they expressed…

  14. What the Face and Body Reveal: In-Group Emotion Effects and Stereotyping of Emotion in African American and European American Children

    Science.gov (United States)

    Tuminello, Elizabeth R.; Davidson, Denise

    2011-01-01

    This study examined whether 3- to 7-year-old African American and European American children's assessment of emotion in face-only, face + body, and body-only photographic stimuli was affected by in-group emotion recognition effects and racial or gender stereotyping of emotion. Evidence for racial in-group effects was found, with European American…

  15. Investigating emotional top down modulation of ambiguous faces by single pulse TMS on early visual cortices

    Directory of Open Access Journals (Sweden)

    Zachary Adam Yaple

    2016-06-01

    Full Text Available Top-down processing is a mechanism in which memory, context and expectation are used to perceive stimuli. In this study we investigated how emotional content, induced by music mood, influences the perception of happy and sad emoticons. Using single-pulse TMS we stimulated the right occipital face area (rOFA), primary visual cortex (V1) and the vertex while subjects performed a face-detection task and listened to happy and sad music. At baseline, incongruent audio-visual pairings decreased performance, demonstrating that the perception of ambiguous faces depends on emotional context. However, face-identification performance decreased during rOFA stimulation regardless of emotional content. No effects were found for vertex (Cz) or V1 stimulation. These results suggest that the rOFA is important for processing faces regardless of emotion, whereas V1 stimulation had no effect. Our findings suggest that early visual cortex activity may not integrate emotional auditory information with visual information during top-down emotional modulation of faces.

  16. Young and older emotional faces: are there age group differences in expression identification and memory?

    Science.gov (United States)

    Ebner, Natalie C; Johnson, Marcia K

    2009-06-01

    Studies have found that older compared with young adults are less able to identify facial expressions and have worse memory for negative than for positive faces, but those studies have used only young faces. Studies finding that both age groups are more accurate at recognizing faces of their own than other ages have used mostly neutral faces. Thus, age differences in processing faces may not extend to older faces, and preferential memory for own age faces may not extend to emotional faces. To investigate these possibilities, young and older participants viewed young and older faces presented either with happy, angry, or neutral expressions; participants identified the expressions displayed and then completed a surprise face recognition task. Older compared with young participants were less able to identify expressions of angry young and older faces and (based on participants' categorizations) remembered angry faces less well than happy faces. There was no evidence of an own age bias in memory, but self-reported frequency of contact with young and older adults and awareness of own emotions played a role in expression identification of and memory for young and older faces.

  17. Music-Elicited Emotion Identification Using Optical Flow Analysis of Human Face

    Science.gov (United States)

    Kniaz, V. V.; Smirnova, Z. N.

    2015-05-01

    Human emotion identification from image sequences is in high demand nowadays. Possible applications range from the automatic smile-shutter function of consumer-grade digital cameras to Biofied Building technologies, which enable communication between a building space and its residents. The highly perceptual nature of human emotions makes their classification and identification complex. The main question arises from the subjective quality of the emotional classification of events that elicit human emotions. A variety of methods for the formal classification of emotions were developed in musical psychology. This work focuses on the identification of human emotions evoked by musical pieces using human face tracking and optical flow analysis. A facial feature tracking algorithm used for facial feature speed and position estimation is presented. Facial features were extracted from each image sequence using human face tracking with local binary pattern (LBP) features. Accurate relative speeds of facial features were estimated using optical flow analysis. The obtained relative positions and speeds were used as the output facial emotion vector. The algorithm was tested using original software and recorded image sequences. The proposed technique provides robust identification of human emotions elicited by musical pieces. The estimated models could be used for human emotion identification from image sequences in such fields as emotion-based musical backgrounds or mood-dependent radio.
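
    The final step the abstract describes, assembling tracked landmark positions and speeds into an output facial emotion vector, can be sketched as follows. This is a minimal illustration assuming landmark positions per frame are already available (e.g., from LBP-based tracking and optical flow); the function name, array layout and frame rate are assumptions, not details from the paper.

    ```python
    import numpy as np

    def emotion_feature_vector(tracked, fps=25.0):
        """Build a per-sequence feature vector from tracked facial landmarks.

        tracked: array of shape (n_frames, n_features, 2) holding (x, y)
                 positions of each facial landmark in each frame.
        Returns the concatenation of mean landmark positions (2 values per
        landmark) and mean absolute speeds in px/s (1 value per landmark).
        """
        tracked = np.asarray(tracked, dtype=float)
        # Finite-difference velocities between consecutive frames, scaled to px/s.
        velocities = np.diff(tracked, axis=0) * fps       # (n_frames-1, n_features, 2)
        speeds = np.linalg.norm(velocities, axis=2)       # (n_frames-1, n_features)
        mean_positions = tracked.mean(axis=0).ravel()     # 2 * n_features values
        mean_speeds = speeds.mean(axis=0)                 # n_features values
        return np.concatenate([mean_positions, mean_speeds])

    # Two frames, two landmarks: the second landmark moves 4 px right in one frame.
    frames = [[[0, 0], [10, 10]],
              [[0, 0], [14, 10]]]
    vec = emotion_feature_vector(frames, fps=25.0)
    print(vec.shape)  # (6,)
    ```

    A vector like this could then be fed to any classifier mapping facial motion to emotion labels; the actual tracking and classification stages of the paper are not reproduced here.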

  18. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    Science.gov (United States)

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  20. Shared perceptual basis of emotional expressions and trustworthiness impressions from faces.

    Science.gov (United States)

    Oosterhof, Nikolaas N; Todorov, Alexander

    2009-02-01

    Using a dynamic-stimuli paradigm, in which faces expressed either happiness or anger, the authors tested the hypothesis that perceptions of trustworthiness are related to these expressions. Although the same emotional intensity was added to both trustworthy and untrustworthy faces, trustworthy faces that expressed happiness were perceived as happier than untrustworthy faces, and untrustworthy faces that expressed anger were perceived as angrier than trustworthy faces. The authors also manipulated changes in face trustworthiness simultaneously with the change in expression. Whereas transitions in face trustworthiness in the direction of the expressed emotion (e.g., high-to-low trustworthiness and anger) increased the perceived intensity of the emotion, transitions in the opposite direction decreased this intensity. For example, changes from high to low trustworthiness increased the intensity of perceived anger but decreased the intensity of perceived happiness. These findings support the hypothesis that changes along the trustworthiness dimension correspond to subtle changes resembling expressions that signal whether the person displaying the emotion should be avoided or approached. (c) 2009 APA, all rights reserved

  1. Face-body integration of intense emotional expressions of victory and defeat

    Science.gov (United States)

    Wang, Lili; Xia, Lisheng; Zhang, Dandan

    2017-01-01

    Human facial expressions can be recognized rapidly and effortlessly. However, for intense emotions from real life, positive and negative facial expressions are difficult to discriminate, and the judgment of facial expressions is biased towards simultaneously perceived body expressions. This study employed event-related potentials (ERPs) to investigate the neural dynamics involved in the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds, with either matched or mismatched emotional expressions in faces and bodies. Behavioral results showed that congruent emotional information of face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent compared to congruent stimuli. A main effect of body valence on the P1 was also observed, with enhanced amplitudes for stimuli with losing compared to winning bodies. The main effect of body expression was also observed on the N170 and N2, with winning bodies producing larger N170/N2 amplitudes. At a later stage, a significant interaction of congruence by body valence was found on the P3 component: winning bodies elicited larger P3 amplitudes than losing bodies when face and body conveyed congruent emotional signals. Beyond knowledge based on prototypical facial and body expressions, the results of this study help us understand the complexity of emotion evaluation and categorization outside the laboratory. PMID:28245245

  2. Electrophysiological correlates of emotional face processing after mild traumatic brain injury in preschool children.

    Science.gov (United States)

    D'Hondt, Fabien; Lassonde, Maryse; Thebault-Dagher, Fanny; Bernier, Annie; Gravel, Jocelyn; Vannasing, Phetsamone; Beauchamp, Miriam H

    2017-02-01

    Evidence suggests that social skills are affected by childhood mild traumatic brain injury (mTBI), but the neural and affective substrates of these difficulties are still underexplored. In particular, nothing is known about consequences on the perception of emotional facial expressions, despite its critical role in social interactions and the importance of the preschool period in the development of this ability. This study thus aimed to investigate the electrophysiological correlates of emotional facial expressions processing after early mTBI. To this end, 18 preschool children (mean age 53 ± 8 months) who sustained mTBI and 15 matched healthy controls (mean age 55 ± 11 months) were presented with pictures of faces expressing anger, happiness, or no emotion (neutral) while event-related potentials (ERP) were recorded. The main results revealed that P1 amplitude was higher for happy faces than for angry faces, and that N170 latency was shorter for emotional faces than for neutral faces in the control group only. These findings suggest that preschool children who sustain mTBI do not present the early emotional effects that are observed in healthy preschool children at visuospatial and visual expertise stages. This study provides new evidence regarding the consequences of childhood mTBI on socioemotional processing, by showing alterations of emotional facial expressions processing, an ability known to underlie social competence and appropriate social interactions.

  3. State-dependent alterations in inhibitory control and emotional face identification in seasonal affective disorder

    DEFF Research Database (Denmark)

    Hjordt, Liv V; Stenbæk, Dea S; Madsen, Kathrine Skak

    2017-01-01

    BACKGROUND: Depressed individuals often exhibit impaired inhibition to negative input and identification of positive stimuli, but it is unclear whether this is a state or trait feature. We here exploited a naturalistic model, namely individuals with seasonal affective disorder (SAD), to study...... this feature longitudinally. AIM: The goal of this study was to examine seasonal changes in inhibitory control and identification of emotional faces in individuals with SAD. METHOD: Twenty-nine individuals diagnosed with winter-SAD and 30 demographically matched controls with no seasonality symptoms completed...... an emotional Go/NoGo task, requiring inhibition of prepotent responses to emotional facial expressions and an emotional face identification task twice, in winter and summer. RESULTS: In winter, individuals with SAD showed impaired ability to inhibit responses to angry (p = .0006) and sad faces (p = .011...

  4. Cognitive emotion regulation in children: Reappraisal of emotional faces modulates neural source activity in a frontoparietal network

    Directory of Open Access Journals (Sweden)

    Ida Wessing

    2015-06-01

    Emotion regulation has an important role in child development and psychopathology. Reappraisal, as a cognitive regulation technique, can be used effectively by children. Moreover, an ERP component known to reflect emotional processing, the late positive potential (LPP), can be modulated by children using reappraisal, and this modulation is also related to children's emotional adjustment. The present study seeks to elucidate the neural generators of such LPP effects. To this end, children aged 8–14 years reappraised emotional faces, while neural activity in an LPP time window was estimated using magnetoencephalography-based source localization. Additionally, neural activity was correlated with two indexes of emotional adjustment and age. Reappraisal reduced activity in the left dorsolateral prefrontal cortex during down-regulation and enhanced activity in the right parietal cortex during up-regulation. Activity in the visual cortex decreased with increasing age, more adaptive emotion regulation and less anxiety. Results demonstrate that reappraisal changed activity within a frontoparietal network in children. Decreasing activity in the visual cortex with increasing age is suggested to reflect neural maturation. A similar decrease with adaptive emotion regulation and less anxiety implies that better emotional adjustment may be associated with an advance in neural maturation.

  5. Infants' Temperament and Mothers', and Fathers' Depression Predict Infants' Attention to Objects Paired with Emotional Faces.

    Science.gov (United States)

    Aktar, Evin; Mandell, Dorothy J; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E J; Bögels, Susan M

    2016-07-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others' emotional reactions to those stimuli, so-called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction effects on infants' attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations of unfamiliar objects alone, before and after being paired with emotional (happy, sad, fearful vs. neutral) faces gazing towards (vs. away from) objects. Additionally, the associations of infants' temperament, and parents' negative affect/depression/anxiety with infants' pupil responses were explored. Both mothers and fathers of participating infants completed questionnaires about their negative affect, depression and anxiety symptoms and their infants' negative temperament. Infants allocated more attention (larger pupils) to negative vs. neutral faces when the faces were presented alone, while they allocated less attention to objects paired with emotional vs. neutral faces independent of head/gaze direction. Sad (but not fearful) temperament predicted more attention to emotional faces. Infants' sad temperament moderated the associations of mothers' depression (but not anxiety) with infants' attention to objects. Maternal depression predicted more attention to objects paired with emotional expressions in infants low in sad temperament, while it predicted less attention in infants high in sad temperament. Fathers' depression (but not anxiety) predicted more attention to objects paired with emotional expressions independent of infants' temperament. We conclude that infants' own temperamental dispositions for sadness, and their exposure to mothers' and fathers' depressed moods may influence infants' attention to emotion-object associations in social learning contexts.

  6. Emotional facial expressions differentially influence predictions and performance for face recognition.

    Science.gov (United States)

    Nomi, Jason S; Rhodes, Matthew G; Cleary, Anne M

    2013-01-01

    This study examined how participants' predictions of future memory performance are influenced by emotional facial expressions. Participants made judgements of learning (JOLs) predicting the likelihood that they would correctly identify a face displaying a happy, angry, or neutral emotional expression in a future two-alternative forced-choice recognition test of identity (i.e., recognition that a person's face was seen before). JOLs were higher for studied faces with happy and angry emotional expressions than for neutral faces. However, neutral test faces with studied neutral expressions had significantly higher identity recognition rates than neutral test faces studied with happy or angry expressions. Thus, these data are the first to demonstrate that people believe happy and angry emotional expressions will lead to better identity recognition in the future relative to neutral expressions. This occurred despite the fact that neutral expressions elicited better identity recognition than happy and angry expressions. These findings contribute to the growing literature examining the interaction of cognition and emotion.

  7. A new method for face detection in colour images for emotional bio-robots

    Institute of Scientific and Technical Information of China (English)

    HAPESHI; Kevin

    2010-01-01

    Emotional bio-robots have become a hot research topic over the last two decades. Although there has been some progress in the research, design and development of various emotional bio-robots, few of them can be used in practical applications. The study of emotional bio-robots demands multi-disciplinary co-operation. It involves computer science, artificial intelligence, 3D computation, engineering system modelling, analysis and simulation, bionics engineering, automatic control, image processing and pattern recognition, etc. Among these, face detection belongs to image processing and pattern recognition. An emotional robot must have the ability to recognize various objects; in particular, it is very important for a bio-robot to be able to recognize human faces in an image. In this paper, a face detection method is proposed for identifying any human faces in colour images using a human skin model and an eye detection method. First, the method detects skin regions in the input colour image after normalizing its luminance. Then, all face candidates are identified using an eye detection method. Compared with existing algorithms, this method relies only on the colour and geometrical data of the human face rather than on training datasets. Experimental results show that this method is effective and fast, and that it could be applied to the development of an emotional bio-robot with further improvements in its speed and accuracy.
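
    The first two stages described above (luminance normalization, then skin-region detection by colour) can be sketched in a few lines. The abstract does not give the method's exact thresholds or colour space, so the YCrCb skin ranges below are classic values from the skin-detection literature, used here as assumptions, as are the helper names; the geometric eye-detection stage is omitted.

```python
import numpy as np

def normalize_luminance(img):
    """Rescale an RGB image (H, W, 3, floats in 0-255) so that its mean
    luminance sits at mid-grey, reducing overall lighting variation."""
    y = 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]
    scale = 128.0 / max(float(y.mean()), 1e-6)
    return np.clip(img * scale, 0.0, 255.0)

def skin_mask(img):
    """Boolean mask of skin-coloured pixels via fixed Cr/Cb thresholds
    (assumed classic ranges: 133 < Cr < 173, 77 < Cb < 127)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance
    cr = (r - y) * 0.713 + 128.0            # red-difference chroma
    cb = (b - y) * 0.564 + 128.0            # blue-difference chroma
    return (cr > 133) & (cr < 173) & (cb > 77) & (cb < 127)
```

    Connected regions of the resulting mask would then become face candidates, to be confirmed or rejected by the eye-detection step.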

  8. An ERP Study of Emotional Face Processing in the Adult and Infant Brain

    OpenAIRE

    Leppänen, Jukka M.; Moulson, Margaret C.; Vogel-Farley, Vanessa K.; Nelson, Charles A.

    2007-01-01

    To examine the ontogeny of emotional face processing, event-related potentials (ERPs) were recorded from adults and 7-month-old infants while viewing pictures of fearful, happy, and neutral faces. Face-sensitive ERPs at occipital-temporal scalp regions differentiated between fearful and neutral/happy faces in both adults (N170 was larger for fear) and infants (P400 was larger for fear). Behavioral measures showed no overt attentional bias toward fearful faces in adults, but in infants, the du...

  9. Judging trustworthiness from faces: Emotion cues modulate trustworthiness judgments in young children

    OpenAIRE

    Caulfield, Frances; Ewing, Louise; Bank, Samantha; Rhodes, Gillian

    2016-01-01

    By adulthood, people judge trustworthiness from appearances rapidly and reliably. However, we know little about these judgments in children. This novel study investigates the developmental trajectory of explicit trust judgments from faces, and the contribution made by emotion cues across age groups. Five-, 7-, 10-year-olds, and adults rated the trustworthiness of trustworthy and untrustworthy faces with neutral expressions. The same participants also rated faces displaying overt happy and ang...

  10. The impact of emotional faces on social motivation in schizophrenia.

    Science.gov (United States)

    Radke, Sina; Pfersmann, Vera; Derntl, Birgit

    2015-10-01

    Impairments in emotion recognition and psychosocial functioning are a robust phenomenon in schizophrenia and may affect motivational behavior, particularly during socio-emotional interactions. To characterize potential deficits and their interplay, we assessed social motivation covering various facets, such as implicit and explicit approach-avoidance tendencies to facial expressions, in 27 patients with schizophrenia (SZP) and 27 matched healthy controls (HC). Moreover, emotion recognition abilities as well as self-reported behavioral activation and inhibition were evaluated. Compared to HC, SZP exhibited less pronounced approach-avoidance ratings to happy and angry expressions along with prolonged reactions during automatic approach-avoidance. Although deficits in emotion recognition were replicated, these were not associated with alterations in social motivation. Together with additional connections between psychopathology and several approach-avoidance processes, these results identify motivational impairments in SZP and suggest a complex relationship between different aspects of social motivation. In the context of specialized interventions aimed at improving social cognitive abilities in SZP, the link between such dynamic measures, motivational profiles and functional outcomes warrants further investigations, which can provide important leverage points for treatment. Crucially, our findings present first insights into the assessment and identification of target features of social motivation.

  11. Face value: Processing of emotional expressions in social anxiety

    NARCIS (Netherlands)

    Lange, Wolf-Gero

    2009-01-01

    People suffering from social anxiety disorder (SAD) are constantly worried about how they come across to other people. Their foremost fear is to be rejected and eventually abandoned. Cognitive theories suggest that SAD is characterized by a tendency to interpret (ambiguous) social cues such as (emot

  12. Eye spy: the predictive value of fixation patterns in detecting subtle and extreme emotions from faces.

    Science.gov (United States)

    Vaidya, Avinash R; Jin, Chenshuo; Fellows, Lesley K

    2014-11-01

    Successful social interaction requires recognizing subtle changes in the mental states of others. Deficits in emotion recognition are found in several neurological and psychiatric illnesses, and are often marked by disturbances in gaze patterns to faces, typically interpreted as a failure to fixate on emotionally informative facial features. However, there has been very little research on how fixations inform emotion recognition in healthy people. Here, we asked whether fixations predicted detection of subtle and extreme emotions in faces. We used a simple model to predict emotion detection scores from participants' fixation patterns. The best fit of this model heavily weighted fixations to the eyes in detecting subtle fear, disgust and surprise, with less weight, or zero weight, given to mouth and nose fixations. However, this model could not successfully predict detection of subtle happiness, or extreme emotional expressions, with the exception of fear. These findings argue that detection of most subtle emotions is best served by fixations to the eyes, with some contribution from nose and mouth fixations. In contrast, detection of extreme emotions and subtle happiness appeared to be less dependent on fixation patterns. The results offer a new perspective on some puzzling dissociations in the neuropsychological literature, and a novel analytic approach for the study of eye gaze in social or emotional settings.
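
    A model of this kind, predicting detection scores from a weighted combination of fixation proportions, can be sketched with ordinary least squares. The abstract does not give the model's exact form or data; the numbers and the eyes/nose/mouth feature split below are illustrative assumptions.

```python
import numpy as np

# Each row: one participant's proportion of fixations on [eyes, nose, mouth]
# (illustrative values, not the study's data).
fixations = np.array([
    [0.70, 0.20, 0.10],
    [0.55, 0.25, 0.20],
    [0.40, 0.30, 0.30],
    [0.80, 0.10, 0.10],
])
# Hypothetical emotion-detection scores for the same participants.
scores = np.array([0.82, 0.69, 0.55, 0.90])

# Fit per-feature weights:
#   score ~ w_eyes * p_eyes + w_nose * p_nose + w_mouth * p_mouth
weights, _, _, _ = np.linalg.lstsq(fixations, scores, rcond=None)
predicted = fixations @ weights
```

    For subtle fear, disgust, and surprise, the paper reports that the best fit weighted eye fixations heavily and gave little or no weight to mouth and nose fixations; in a sketch like this, that pattern would show up as the eyes weight dominating the fitted vector.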

  13. Interdependent mechanisms for processing gender and emotion:The special status of angry male faces

    Directory of Open Access Journals (Sweden)

    Daniel A Harris

    2016-07-01

    While some models of how various attributes of a face are processed have posited that face features, invariant physical cues such as gender or ethnicity as well as variant social cues such as emotion, may be processed independently (e.g., Bruce & Young, 1986), other models suggest a more distributed representation and interdependent processing (e.g., Haxby, Hoffman, & Gobbini, 2000). Here we use a contingent adaptation paradigm to investigate whether mechanisms for processing the gender and emotion of a face are interdependent and symmetric across the happy-angry emotional continuum, regardless of the gender of the face. We simultaneously adapted participants to angry female faces and happy male faces (Experiment 1) or to happy female faces and angry male faces (Experiment 2). In Experiment 1 we found evidence for contingent adaptation, with simultaneous aftereffects in opposite directions: male faces were biased towards angry while female faces were biased towards happy. Interestingly, in the complementary Experiment 2 we did not find evidence for contingent adaptation; both male and female faces were biased towards angry. Our results highlight that evidence for contingent adaptation, and the underlying interdependent face processing mechanisms that would allow for it, may only be evident for certain combinations of face features. Such limits may be especially important in the case of social cues, given how maladaptive it may be to stop responding to threatening information, with angry male faces considered to be the most threatening. The underlying neuronal mechanisms that could account for such asymmetric effects in contingent adaptation remain to be elucidated.

  14. Acute pharmacologically induced shifts in serotonin availability abolish emotion-selective responses to negative face emotions in distinct brain networks

    DEFF Research Database (Denmark)

    Grady, Cheryl Lynn; Siebner, Hartwig R; Hornboll, Bettina

    2013-01-01

    ...distributed brain responses identified two brain networks with modulations of activity related to face emotion and serotonin level. The first network included the left amygdala, bilateral striatum, and fusiform gyri. During the Control session this network responded only to fearful faces; increasing serotonin decreased this response to fear, whereas reducing serotonin enhanced the response of this network to angry faces. The second network involved bilateral amygdala and ventrolateral prefrontal cortex, and these regions also showed increased activity to fear during the Control session. Both drug challenges enhanced the neural response of this set of regions to angry faces, relative to Control, and CIT also enhanced activity for neutral faces. The net effect of these changes in both networks was to abolish the selective response to fearful expressions. These results suggest that a normal level of serotonin...

  15. Decreased sleep duration is associated with increased fMRI responses to emotional faces in children.

    Science.gov (United States)

    Reidy, Brooke L; Hamann, Stephan; Inman, Cory; Johnson, Katrina C; Brennan, Patricia A

    2016-04-01

    In adults and children, sleep loss is associated with affective dysregulation and increased responsivity to negative stimuli. Adult functional neuroimaging (fMRI) studies have demonstrated associations between restricted sleep and neural alterations in the amygdala and reward circuitry when viewing emotional picture and face stimuli. Despite this, few studies have examined the associations between short sleep duration and emotional responsivity in typically developing children, and no studies have investigated this relationship using fMRI. The current study examined the relationship between sleep duration and fMRI activation to emotional facial expressions in 15 male children (ages 7-11 years). During fMRI scanning, subjects viewed and made perceptual judgments regarding negative, neutral, and positive emotional faces. Maternally reported child sleep duration was negatively associated with (a) activation in the bilateral amygdala, left insula, and left temporal pole when viewing negative (i.e., fearful, disgust) vs. neutral faces, (b) right orbitofrontal and bilateral prefrontal activation when viewing disgust vs. neutral faces, and (c) bilateral orbitofrontal, right anterior cingulate, and left amygdala activation when viewing happy vs. neutral faces. Consistent with our prediction, we also noted that emotion-dependent functional connectivity between the bilateral amygdala and prefrontal cortex, cingulate, fusiform, and occipital cortex was positively associated with sleep duration. Paralleling similar studies in adults, these findings collectively suggest that decreased sleep duration in school-aged children may contribute to enhanced reactivity of brain regions involved in emotion and reward processing, as well as decreased emotion-dependent functional connectivity between the amygdala and brain regions associated with emotion regulation.

  16. Putting the face in context: Body expressions impact facial emotion processing in human infants

    Directory of Open Access Journals (Sweden)

    Purva Rajhans

    2016-06-01

    Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.

  17. P2-22: Aging Effects on the Visual Scanning of Emotional Faces

    Directory of Open Access Journals (Sweden)

    Suzane Vassallo

    2012-10-01

    This study investigated the effect of aging on accuracy of response, reaction time, and visual scanning strategy while emotional faces were viewed. Forty-three neurologically healthy participants were assigned to either a young, middle, or older-aged group. Overall, older adults were significantly less accurate in recognising facial expressions, especially those demonstrating negative emotions. Further, the young and middle-aged adults took significantly less time to recognise an emotional face than the older adults. When assessing eye movements, it was noted that the older group generated a significantly greater number of fixations to the faces and spent more time overall looking at the various facial features. Regardless of the emotional expression and participant age, all participants looked more frequently and for longer at the eye region; this was followed by the nose and then the mouth. The findings from this work support the existence of an age-related decline in emotion recognition. However, this study is the first to document reaction time differences in identifying facial affect across three different age groups. The eyes are the most important facial feature when identifying an emotional face, and this holds true across the age ranges investigated in this work. In essence, when looking at emotional faces, middle and older-aged adults demonstrate a similar scanning pattern to their younger counterparts; they just take longer to do so. Therefore, the normal age-related decline in recognising facial affect appears not to be due to impairment in the way the eyes move to look at faces.

  18. From Specificity to Sensitivity: Affective states modulate visual working memory for emotional expressive faces

    Directory of Open Access Journals (Sweden)

    Thomas eMaran

    2015-08-01

    Previous findings suggest that visual working memory preferentially remembers angry-looking faces. However, the meaning of facial actions is construed in relation to context. To date, there are no studies investigating the role of perceiver-based context when processing emotional cues in visual working memory. To explore the influence of affective context on visual working memory for faces, we conducted two experiments using a visual working memory task for emotionally expressive faces together with a mood induction procedure. Affective context was manipulated by unpleasant (Experiment 1) and pleasant (Experiment 2) IAPS pictures in order to induce an affect high in motivational intensity (defensive or appetitive, respectively) compared to a low-arousal control condition. Results indicated specifically increased sensitivity of visual working memory for angry-looking faces in the neutral condition. Enhanced visual working memory for angry faces was prevented by inducing affects of high motivational intensity. In both experiments, affective states led to a switch from specific enhancement of angry expressions in visual working memory to an equally sensitive representation of all emotional expressions. Our findings demonstrate that emotional expressions are of different behavioral relevance for the receiver depending on the affective context, supporting a functional organization of visual working memory along with flexible resource allocation. In visual working memory, stimulus processing adjusts to situational requirements and transitions from a specifically prioritizing default mode in predictable environments to a sensitive, hypervigilant mode in exposure to emotional events.

  19. Face and emotion expression processing and the serotonin transporter polymorphism 5-HTTLPR/rs25531.

    Science.gov (United States)

    Hildebrandt, A; Kiy, A; Reuter, M; Sommer, W; Wilhelm, O

    2016-06-01

    Face cognition, including face identity and facial expression processing, is a crucial component of socio-emotional abilities, characterizing humans as highest developed social beings. However, for these trait domains molecular genetic studies investigating gene-behavior associations based on well-founded phenotype definitions are still rare. We examined the relationship between 5-HTTLPR/rs25531 polymorphisms - related to serotonin-reuptake - and the ability to perceive and recognize faces and emotional expressions in human faces. For this aim we conducted structural equation modeling on data from 230 young adults, obtained by using a comprehensive, multivariate task battery with maximal effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphisms with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception. Carriers of two long (L) alleles outperformed carriers of one or two S alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating discriminant validity of the relationships. We discuss the implications and possible neural mechanisms underlying these novel findings. © 2016 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  20. Interactions among the effects of head orientation, emotional expression, and physical attractiveness on face preferences.

    Science.gov (United States)

    Main, Julie C; DeBruine, Lisa M; Little, Anthony C; Jones, Benedict C

    2010-01-01

    Previous studies have shown that preferences for direct versus averted gaze are modulated by emotional expressions and physical attractiveness. For example, preferences for direct gaze are stronger when judging happy or physically attractive faces than when judging disgusted or physically unattractive faces. Here we show that preferences for front versus three-quarter views of faces, in which gaze direction was always congruent with head orientation, are also modulated by emotional expressions and physical attractiveness; participants demonstrated preferences for front views of faces over three-quarter views of faces when judging the attractiveness of happy, physically attractive individuals, but not when judging the attractiveness of relatively unattractive individuals or those with disgusted expressions. Moreover, further analyses indicated that these interactions did not simply reflect differential perceptions of the intensity of the emotional expressions shown in each condition. Collectively, these findings present novel evidence that the effect of the direction of the attention of others on attractiveness judgments is modulated by cues to the physical attractiveness and emotional state of the depicted individual, potentially reflecting psychological adaptations for efficient allocation of social effort. These data also present the first behavioural evidence that the effect of the direction of the attention of others on attractiveness judgments reflects viewer-referenced, rather than face-referenced, coding and/or processing of gaze direction.

  1. Can we distinguish emotions from faces? Investigation of implicit and explicit processes of peak facial expressions

    Directory of Open Access Journals (Sweden)

    Yanmei Wang

    2016-08-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of athletes' transient, peak-intensity expressions at the winning or losing point of a competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, while eye movements were recorded. The results revealed that isolated bodies and face-body congruent images were better recognized than isolated faces and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes

  2. Face and Emotion Recognition in MCDD versus PDD-NOS

    Science.gov (United States)

    Herba, Catherine M.; de Bruin, Esther; Althaus, Monika; Verheij, Fop; Ferdinand, Robert F.

    2008-01-01

    Previous studies indicate that Multiple Complex Developmental Disorder (MCDD) children differ from PDD-NOS and autistic children on a symptom level and on psychophysiological functioning. Children with MCDD (n = 21) and PDD-NOS (n = 62) were compared on two facets of social-cognitive functioning: identification of neutral faces and facial…

  3. Reaction times and face discrimination with emotional content

    Directory of Open Access Journals (Sweden)

    ANA MARÍA MARTÍNEZ

    2002-07-01

    Sixty-two university students took part, divided into two groups with a mean age of 21.6 years for the women and 22 for the men, in a study of visual reaction times (VRTs) to stimuli with emotional content. The design considered stimulus position (start, middle, end), emotional content (neutral, friendly, threatening), and the combinations of the stimuli. The women showed longer RTs than the men in all experimental conditions. It was also observed, for both men and women, that RTs were longer when the stimulus to be discriminated was located in the middle position.

  4. Emotional memory and perception of emotional faces in patients suffering from depersonalization disorder.

    Science.gov (United States)

    Montagne, Barbara; Sierra, Mauricio; Medford, Nick; Hunter, Elaine; Baker, Dawn; Kessels, Roy P C; de Haan, Edward H F; David, Anthony S

    2007-08-01

    Previous work has shown that patients with depersonalization disorder (DPD) have reduced physiological responses to emotional stimuli, which may be related to subjective emotional numbing. This study investigated two aspects of affective processing in 13 patients with DPD according to the DSM-IV criteria and healthy controls: the perception of emotional facial expressions (anger, disgust, fear, happiness, sadness, and surprise) and memory for emotional stimuli. Results revealed a specific lack of sensitivity to facial expression of anger in patients, but normal enhancement of memory for peripheral aspects of arousing emotional material. The results are consistent with altered processing of threat-related stimuli but intact consolidation processes, at least when the stimuli involved are potently arousing.

  5. The Not Face: A grammaticalization of facial expressions of emotion

    Science.gov (United States)

    Benitez-Quiroz, C. Fabian; Wilbur, Ronnie B.; Martinez, Aleix M.

    2016-01-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3–8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers. PMID:26872248

  6. Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions.

    Science.gov (United States)

    Oberman, Lindsay M; Winkielman, Piotr; Ramachandran, Vilayanur S

    2007-01-01

    People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations: (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated the assessed muscles, whereas the chew manipulation activated muscles only intermittently. Further, expressing happiness generated the most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.

  7. Modified SIFT Descriptors for Face Recognition under Different Emotions

    Directory of Open Access Journals (Sweden)

    Nirvair Neeru

    2016-01-01

    Full Text Available The main goal of this work is to develop a fully automatic face recognition algorithm. The Scale Invariant Feature Transform (SIFT) has been used only sparingly in face recognition. In this paper, a Modified SIFT (MSIFT) approach is proposed to enhance the recognition performance of SIFT. The work proceeds in three steps. First, the image is smoothed using the DWT. Second, the computational complexity of SIFT's descriptor calculation is reduced by subtracting the average from each descriptor instead of normalizing it. Third, the algorithm is made automatic by using the Coefficient of Correlation (CoC) instead of the distance ratio (which requires user interaction). The main achievement of this method is a reduced database size, as it requires storing only neutral images instead of all the expressions of the same face. Experiments performed on the Japanese Female Facial Expression (JAFFE) database indicate that the proposed approach achieves better performance than SIFT-based methods and shows robustness against various facial expressions.
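The two modifications described in this record lend themselves to a short sketch: subtracting the descriptor mean in place of normalization, and matching by Coefficient of Correlation (CoC) instead of Lowe's interactive distance-ratio test. The Python below is an illustrative reconstruction, not the authors' implementation; the 0.8 matching threshold and the 128-D descriptor length are assumptions.

```python
import numpy as np

def msift_descriptor(raw_descriptor):
    """MSIFT step: subtract the mean from a SIFT descriptor (e.g. 128-D)
    instead of normalizing it, reducing computational cost."""
    d = np.asarray(raw_descriptor, dtype=float)
    return d - d.mean()

def coc_match(desc_a, desc_b, threshold=0.8):
    """Match two descriptors by Coefficient of Correlation (Pearson r)
    rather than the distance-ratio test. The 0.8 threshold is an
    assumed value, not taken from the paper."""
    a = np.asarray(desc_a, dtype=float)
    b = np.asarray(desc_b, dtype=float)
    r = float(np.corrcoef(a, b)[0, 1])
    return r, r >= threshold
```

In a full pipeline these functions would sit between the DWT smoothing of the input image and the lookup against the stored neutral-expression descriptors.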

  8. Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Glerup, L; Vestbo, C

    2014-01-01

    BACKGROUND: Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression. METHOD: Thirty......-risk controls. These effects occurred in the absence of differences between groups in mood, subjective state or coping. CONCLUSIONS: Different neural response and functional connectivity within fronto-limbic and occipito-parietal regions during emotional face processing and enhanced fear vigilance may be key...

  9. Priming the Secure Attachment Schema Affects the Emotional Face Processing Bias in Attachment Anxiety: An fMRI Research.

    Science.gov (United States)

    Tang, Qingting; Chen, Xu; Hu, Jia; Liu, Ying

    2017-01-01

    Our study explored how priming with a secure base schema affects the processing of emotional facial stimuli in individuals with attachment anxiety. We enrolled 42 undergraduate students between 18 and 27 years of age, and divided them into two groups: attachment anxiety and attachment secure. All participants were primed under two conditions, the secure priming using references to the partner, and neutral priming using neutral references. We performed repeated attachment security priming combined with a dual-task paradigm and functional magnetic resonance imaging. Participants' reaction times in terms of responding to the facial stimuli were also measured. Attachment security priming can facilitate an individual's processing of positive emotional faces; for instance, the presentation of the partner's name was associated with stronger activities in a wide range of brain regions and faster reaction times for positive facial expressions in the subjects. The current finding of higher activity in the left-hemisphere regions for secure priming rather than neutral priming is consistent with the prediction that attachment security priming triggers the spread of the activation of a positive emotional state. However, the difference in brain activity during processing of both, positive and negative emotional facial stimuli between the two priming conditions appeared in the attachment anxiety group alone. This study indicates that the effect of attachment secure priming on the processing of emotional facial stimuli could be mediated by chronic attachment anxiety. In addition, it highlights the association between higher-order processes of the attachment system (secure attachment schema priming) and early-stage information processing system (attention), given the increased attention toward the effects of secure base schema on the processing of emotion- and attachment-related information among the insecure population. Thus, the following study has applications in providing

  10. Colored halos around faces and emotion-evoked colors: a new form of synesthesia.

    Science.gov (United States)

    Ramachandran, Vilayanur S; Miller, Luke; Livingstone, Margaret S; Brang, David

    2012-01-01

    The claim that some individuals see colored halos or auras around faces has long been part of popular folklore. Here we report on a 23-year-old man (subject TK) diagnosed with Asperger's disorder, who began to consistently experience colors around individuals at the age of 10. TK's colors are based on the individual's identity and emotional connotation. We interpret these experiences as a form of synesthesia, and confirm their authenticity through a target detection paradigm. Additionally, we investigate TK's claim that emotions evoke highly specific colors, allowing him, despite his Asperger's, to introspect on emotions and recognize them in others.

  11. Judging trustworthiness from faces: Emotion cues modulate trustworthiness judgments in young children.

    Science.gov (United States)

    Caulfield, Frances; Ewing, Louise; Bank, Samantha; Rhodes, Gillian

    2016-08-01

    By adulthood, people judge trustworthiness from appearances rapidly and reliably. However, we know little about these judgments in children. This novel study investigates the developmental trajectory of explicit trust judgments from faces and the contribution made by emotion cues across age groups. Five-, 7-, and 10-year-olds, and adults rated the trustworthiness of trustworthy and untrustworthy faces with neutral expressions. The same participants also rated faces displaying overt happy and angry expressions, allowing us to investigate whether emotion cues modulate trustworthiness judgments similarly in children and adults. Results revealed that the ability to evaluate the trustworthiness of faces emerges in childhood but may not be adult-like until 10 years of age. Moreover, we show that emotion cues modulate trust judgments in young children as well as adults. Anger cues diminished the appearance of trustworthiness for participants from 5 years of age, and happy cues increased it, although this effect did not consistently emerge until later in childhood, that is, at 10 years of age. These associations also extended to more subtle emotion cues present in neutral faces. Our results indicate that young children are sensitive to facial trustworthiness and suggest that similar expression cues modulate these judgments in children and adults. © 2015 The British Psychological Society.

  12. Perceiving emotions in neutral faces: expression processing is biased by affective person knowledge.

    Science.gov (United States)

    Suess, Franziska; Rabovsky, Milena; Abdel Rahman, Rasha

    2015-04-01

    According to a widely held view, basic emotions such as happiness or anger are reflected in facial expressions that are invariant and uniquely defined by specific facial muscle movements. Accordingly, expression perception should not be vulnerable to influences outside the face. Here, we test this assumption by manipulating the emotional valence of biographical knowledge associated with individual persons. Faces of well-known and initially unfamiliar persons displaying neutral expressions were associated with socially relevant negative, positive or comparatively neutral biographical information. The expressions of faces associated with negative information were classified as more negative than faces associated with neutral information. Event-related brain potential modulations in the early posterior negativity, a component taken to reflect early sensory processing of affective stimuli such as emotional facial expressions, suggest that negative affective knowledge can bias the perception of faces with neutral expressions toward subjectively displaying negative emotions. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  13. Electrocortical reactivity to emotional images and faces in middle childhood to early adolescence.

    Science.gov (United States)

    Kujawa, Autumn; Klein, Daniel N; Hajcak, Greg

    2012-10-01

    The late positive potential (LPP) is an event-related potential (ERP) component that indexes sustained attention toward motivationally salient information. The LPP has been observed in children and adults; however, little is known about its development from childhood into adolescence. In addition, whereas LPP studies examine responses to images from the International Affective Picture System (IAPS; Lang et al., 2008) or to emotional faces, no previous studies have compared responses in youth across these stimulus types. To examine how emotion interacts with attention across development, the current study used an emotional-interrupt task to measure LPP and behavioral responses in 8- to 13-year-olds using unpleasant, pleasant, and neutral IAPS images, as well as sad, happy, and neutral faces. Compared to older youth, younger children exhibited enhanced LPPs over occipital sites. In addition, sad but not happy faces elicited a larger LPP than neutral faces; behavioral measures did not vary across facial expressions. Both unpleasant and pleasant IAPS images were associated with increased LPPs and behavioral interference compared to neutral images. Results suggest that there may be developmental differences in the scalp distribution of the LPP and that, compared to faces, IAPS images elicit more robust behavioral and electrocortical measures of attention to emotional stimuli.

  14. Emotional intelligence is associated with reduced insula responses to masked angry faces.

    Science.gov (United States)

    Alkozei, Anna; Killgore, William D S

    2015-07-08

    High levels of emotional intelligence (EI) have been associated with increased success in the workplace, greater quality of personal relationships, and enhanced wellbeing. Evidence suggests that EI is mediated extensively by the interplay of key emotion regions including the amygdala, insula, and ventromedial prefrontal cortex, among others. The insula, in particular, is important for processing interoceptive and somatic cues that are interpreted as emotional responses. We investigated the association between EI and functional brain responses within the aforementioned neurocircuitry in response to subliminal presentations of social threat. Fifty-four healthy adults completed the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) and underwent functional magnetic resonance imaging while viewing subliminal presentations of faces displaying anger, using a backward-masked facial affect paradigm to minimize conscious awareness of the expressed emotion. In response to masked angry faces, the total MSCEIT scores correlated negatively with a cluster of activation located within the left insula, but not with activation in any other region of interest. Considering the insula's role in the processing of interoceptive emotional cues, the results suggest that greater EI is associated with reduced emotional visceral reactivity and/or more accurate interoceptive prediction when confronted with stimuli indicative of social threat.

  15. Brain potentials reflect access to visual and emotional memories for faces.

    Science.gov (United States)

    Bobes, Maria A; Quiñonez, Ileana; Perez, Jhoanna; Leon, Inmaculada; Valdés-Sosa, Mitchell

    2007-05-01

    Familiar faces convey different types of information, unlocking memories related to social-emotional significance. Here, the availability over time of different types of memory was evaluated using the time-course of P3 event-related potentials. Two oddball paradigms were employed, both using unfamiliar faces as standards. The infrequent targets were, respectively, artificially-learned faces (devoid of social-emotional content) and faces of acquaintances. Although in both tasks targets were detected accurately, the corresponding time-course and scalp distribution of the P3 responses differed. Artificially-learned and acquaintance faces both elicited a P3b, maximal over centro-parietal sites, with a latency of 500 ms. Faces of acquaintances elicited an additional component, an early P3, maximal over frontal sites, with a latency of 350 ms. This suggests that visual familiarity can only trigger the overt recognition processes leading to the slower P3b, whereas emotional-social information can also elicit fast and automatic assessments (indexed by the frontal P3) crucial for successful social interactions.

  16. Are patients with schizophrenia impaired in processing non-emotional features of human faces?

    Directory of Open Access Journals (Sweden)

    Hayley eDarke

    2013-08-01

    Full Text Available It is known that individuals with schizophrenia exhibit signs of impaired face processing; however, the exact perceptual and cognitive mechanisms underlying these deficits are yet to be elucidated. One possible source of confusion in the current literature is the methodological and conceptual inconsistencies that can arise from the varied treatment of different aspects of face processing relating to emotional and non-emotional aspects of face perception. This review aims to disentangle the literature by focusing on the performance of patients with schizophrenia in a range of tasks that required processing of non-emotional features of face stimuli (e.g. identity or gender). We also consider the performance of patients on non-face stimuli that share common elements such as familiarity (e.g. cars) and social relevance (e.g. gait). We conclude by exploring whether the observed deficits are best considered as face-specific and note that further investigation is required to properly assess the potential contribution of more generalised attentional or perceptual impairments.

  17. Emotional face recognition deficit in amnestic patients with mild cognitive impairment: behavioral and electrophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yang L

    2015-08-01

    Full Text Available Linlin Yang, Xiaochuan Zhao, Lan Wang, Lulu Yu, Mei Song, Xueyi Wang Department of Mental Health, The First Hospital of Hebei Medical University, Hebei Medical University Institute of Mental Health, Shijiazhuang, People’s Republic of China Abstract: Amnestic mild cognitive impairment (MCI) has been conceptualized as a transitional stage between healthy aging and Alzheimer’s disease. Thus, understanding emotional face recognition deficits in patients with amnestic MCI could be useful in determining the progression of amnestic MCI. The purpose of this study was to investigate the features of emotional face processing in amnestic MCI by using event-related potentials (ERPs). Patients with amnestic MCI and healthy controls performed a face recognition task, giving old/new responses to previously studied and novel faces with different emotional messages as the stimulus material. Using the learning-recognition paradigm, the experiments were divided into two steps, ie, a learning phase and a test phase. ERPs were analyzed on electroencephalographic recordings. The behavioral data indicated high emotion classification accuracy for patients with amnestic MCI and for healthy controls. The mean percentage of correct classifications was 81.19% for patients with amnestic MCI and 96.46% for controls. Our ERP data suggest that patients with amnestic MCI were still able to undertake personalizing processing for negative faces, but not for neutral or positive faces, in the early frontal processing stage. In the early time window, no differences in the frontal old/new effect were found between patients with amnestic MCI and normal controls. However, in the late time window, the three types of stimuli did not elicit any old/new parietal effects in patients with amnestic MCI, suggesting their recollection was impaired. This impairment may be closely associated with amnestic MCI disease. We conclude from our data that face recognition processing and emotional memory is

  18. Emotional Faces in Context: Age Differences in Recognition Accuracy and Scanning Patterns

    Science.gov (United States)

    Noh, Soo Rim; Isaacowitz, Derek M.

    2014-01-01

    While age-related declines in facial expression recognition are well documented, previous research relied mostly on isolated faces devoid of context. We investigated the effects of context on age differences in recognition of facial emotions and in visual scanning patterns of emotional faces. While their eye movements were monitored, younger and older participants viewed facial expressions (i.e., anger, disgust) in contexts that were emotionally congruent, incongruent, or neutral to the facial expression to be identified. Both age groups had highest recognition rates of facial expressions in the congruent context, followed by the neutral context, and recognition rates in the incongruent context were worst. These context effects were more pronounced for older adults. Compared to younger adults, older adults exhibited a greater benefit from congruent contextual information, regardless of facial expression. Context also influenced the pattern of visual scanning characteristics of emotional faces in a similar manner across age groups. In addition, older adults initially attended more to context overall. Our data highlight the importance of considering the role of context in understanding emotion recognition in adulthood. PMID:23163713

  19. How Context Influences Our Perception of Emotional Faces: A Behavioral Study on the Kuleshov Effect

    Directory of Open Access Journals (Sweden)

    Marta Calbi

    2017-10-01

    Full Text Available Facial expressions are of major importance in understanding the mental and emotional states of others. So far, most studies on the perception and comprehension of emotions have used isolated facial expressions as stimuli; for example, photographs of actors displaying facial expressions corresponding to one of the so-called ‘basic emotions.’ However, our real experience during social interactions is different: facial expressions of emotion are mostly perceived in a wider context, constituted by body language, the surrounding environment, and our beliefs and expectations. Already in the early twentieth century, the Russian filmmaker Lev Kuleshov argued that such context, established by intermediate shots of strong emotional content, could significantly change our interpretation of facial expressions in film. Prior experiments have shown behavioral effects pointing in this direction, but have only used static images as stimuli. Our study used a more ecological design with participants watching film sequences of neutral faces, crosscut with scenes of strong emotional content (evoking happiness or fear), plus neutral stimuli as a baseline condition. The task was to rate the emotion displayed by a target person’s face in terms of valence, arousal, and category. Results clearly demonstrated the presence of a significant effect in terms of both valence and arousal in the fear condition only. Moreover, participants tended to categorize the target person’s neutral facial expression choosing the emotion category congruent with the preceding context. Our results highlight the context-sensitivity of emotions and the importance of studying them under ecologically valid conditions.

  20. Two years after epilepsy surgery in children : Recognition of emotions expressed by faces

    NARCIS (Netherlands)

    Braams, Olga; Meekes, Joost; van Nieuwenhuizen, Onno; Schappin, Renske; van Rijen, Peter C.; Veenstra, Wencke; Braun, Kees; Jennekens-Schinkel, Aag

    2015-01-01

    Objectives: The purpose of this study was to determine whether children with epilepsy surgery in their history are able to recognize emotions expressed by faces and whether this recognition is associated with demographic variables [age, sex, and verbal intelligence (VIQ)] and/or epilepsy variables

  1. Automated facial coding: validation of basic emotions and FACS AUs in FaceReader

    NARCIS (Netherlands)

    P. Lewinski; T.M. den Uyl; C. Butler

    2014-01-01

    In this study, we validated automated facial coding (AFC) software—FaceReader (Noldus, 2014)—on 2 publicly available and objective datasets of human expressions of basic emotions. We present the matching scores (accuracy) for recognition of facial expressions and the Facial Action Coding System (FAC

  2. Emotional Face Identification in Youths with Primary Bipolar Disorder or Primary Attention-Deficit/Hyperactivity Disorder

    Science.gov (United States)

    Seymour, Karen E.; Pescosolido, Matthew F.; Reidy, Brooke L.; Galvan, Thania; Kim, Kerri L.; Young, Matthew; Dickstein, Daniel P.

    2013-01-01

    Objective: Bipolar disorder (BD) and attention-deficit/hyperactivity disorder (ADHD) are often comorbid or confounded; therefore, we evaluated emotional face identification to better understand brain/behavior interactions in children and adolescents with either primary BD, primary ADHD, or typically developing controls (TDC). Method: Participants…

  3. Amygdala Hyperactivation During Face Emotion Processing in Unaffected Youth at Risk for Bipolar Disorder

    Science.gov (United States)

    Olsavsky, Aviva K.; Brotman, Melissa A.; Rutenberg, Julia G.; Muhrer, Eli J.; Deveney, Christen M.; Fromm, Stephen J.; Towbin, Kenneth; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Objective: Youth at familial risk for bipolar disorder (BD) show deficits in face emotion processing, but the neural correlates of these deficits have not been examined. This preliminary study tests the hypothesis that, relative to healthy comparison (HC) subjects, both BD subjects and youth at risk for BD (i.e., those with a first-degree BD…

  5. Modulation of facial reactions to avatar emotional faces by nonconscious competition priming.

    Science.gov (United States)

    Weyers, Peter; Mühlberger, Andreas; Kund, Anja; Hess, Ursula; Pauli, Paul

    2009-03-01

    To investigate whether subliminally priming for competition influences facial reactions to facial emotional displays, 49 participants were either subliminally competition primed or neutrally primed. Thereafter, they viewed computer-generated avatar faces with happy, neutral, and sad expressions while Corrugator supercilii and Zygomaticus major reactions were recorded. Results revealed facial mimicry to happy and sad faces in the neutrally primed group but not the competition primed group. Furthermore, subliminal competition priming enhanced Corrugator supercilii activity after an initial relaxation while viewing happy faces. An impression formation task revealed counter-empathic effects confirming successful competition priming. Overall, results indicate that nonconscious processes influence a presumably nonconscious behavior.

  6. The changing face of emotion: age-related patterns of amygdala activation to salient faces

    NARCIS (Netherlands)

    Todd, R.M.; Evans, J.W.; Morris, D.; Lewis, M.D.; Taylor, M.J.

    2011-01-01

    The present study investigated age-related differences in the amygdala and other nodes of face-processing networks in response to facial expression and familiarity. fMRI data were analyzed from 31 children (3.5–8.5 years) and 14 young adults (18–33 years) who viewed pictures of familiar (mothers) an

  7. Emotion Recognition in Faces and the Use of Visual Context in Young People with High-Functioning Autism Spectrum Disorders

    Science.gov (United States)

    Wright, Barry; Clarke, Natalie; Jordan, Jo; Young, Andrew W.; Clarke, Paula; Miles, Jeremy; Nation, Kate; Clarke, Leesa; Williams, Christine

    2008-01-01

    We compared young people with high-functioning autism spectrum disorders (ASDs) with age, sex and IQ matched controls on emotion recognition of faces and pictorial context. Each participant completed two tests of emotion recognition. The first used Ekman series faces. The second used facial expressions in visual context. A control task involved…

  9. The right place at the right time: priming facial expressions with emotional face components in developmental visual agnosia.

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-04-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components.

  10. Interactions between Identity and Emotional Expression in Face Processing across the Lifespan: Evidence from Redundancy Gains

    Directory of Open Access Journals (Sweden)

    Alla Yankouskaya

    2014-01-01

    Full Text Available We tested how aging affects the integration of visual information from faces. Three groups of participants aged 20–30, 40–50, and 60–70 performed a divided attention task in which they had to detect the presence of a target facial identity or a target facial expression. Three target stimuli were used: (1) with the target identity but not the target expression, (2) with the target expression but not the target identity, and (3) with both the target identity and target expression (the redundant target condition). On nontarget trials the faces contained neither the target identity nor the target expression. All groups were faster in responding to a face containing both the target identity and emotion than to faces containing either single target. Furthermore, the redundancy gains for combined targets exceeded the performance limits predicted by the independent processing of facial identity and emotion. These results held across the age range, suggesting an interactive processing of facial identity and emotion that is independent of the effects of cognitive aging. Older participants demonstrated reliably larger redundancy gains than the young group, which may reflect greater experience with faces. Alternative explanations are discussed.
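The claim that redundancy gains exceeded the performance limits predicted by independent processing is conventionally assessed with Miller's (1982) race-model inequality, which caps the redundant-target RT distribution at the sum of the two single-target distributions. The sketch below assumes that standard test; it is a reconstruction, not the authors' analysis code.

```python
import numpy as np

def ecdf(rts, t):
    """Empirical cumulative RT distribution evaluated at times t."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t, side="right") / rts.size

def race_model_violation(rt_identity, rt_emotion, rt_redundant, t_grid):
    """Miller's race-model inequality: under independent parallel
    processing, F_red(t) <= F_identity(t) + F_emotion(t) for every t.
    Returns the time points where the redundant-target CDF exceeds
    that bound (a 'violation', taken as evidence of coactivation)."""
    bound = np.minimum(ecdf(rt_identity, t_grid) + ecdf(rt_emotion, t_grid), 1.0)
    f_red = ecdf(rt_redundant, t_grid)
    return t_grid[f_red > bound]
```

A nonempty return over some early part of the RT range is the pattern described in the abstract: redundant-target responses too fast for two independent detectors racing in parallel.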

  11. The Effect of Repeated Ketamine Infusion Over Facial Emotion Recognition in Treatment-Resistant Depression: A Preliminary Report.

    Science.gov (United States)

    Shiroma, Paulo R; Albott, C Sophia; Johns, Brian; Thuras, Paul; Wels, Joseph; Lim, Kelvin O

    2015-01-01

    In contrast to the improvement in emotion recognition bias seen with traditional antidepressants, the authors report preliminary findings that changes in facial emotion recognition are not associated with the response of depressive symptoms to repeated ketamine infusions, or with relapse during follow-up, in treatment-resistant depression.

  12. Neurodynamic studies on emotional and inverted faces in an oddball paradigm.

    Science.gov (United States)

    Susac, Ana; Ilmoniemi, Risto J; Pihko, Elina; Supek, Selma

    2004-01-01

    The detection of a change in a face stimulus was studied in an oddball paradigm. Event-related potentials (ERPs) and MEG responses to face stimuli were recorded in four conditions: 1) happy standard, neutral deviant; 2) neutral standard, neutral deviant; 3) inverted happy standard, inverted neutral deviant; 4) inverted neutral standard, inverted neutral deviant. In all conditions, the target was a face with glasses. Neutral deviants elicited a negative deflection (with a maximum around 280 ms) in ERP and MEG responses, an effect similar to auditory mismatch negativity. Face inversion diminished deviance-related negativity, implying an important role of face recognition in the observed effect. Emotional content and larger physical differences between stimuli in conditions 1 and 3 compared to conditions 2 and 4 did not show a statistically significant effect on the neutral-deviant-related negativity.

  13. How is this child feeling? Preschool-aged children’s ability to recognize emotion in faces and body poses

    OpenAIRE

    Parker, Alison E.; Mathis, Erin T.; Kupersmidt, Janis B.

    2013-01-01

    The study examined children’s recognition of emotion from faces and body poses, as well as gender differences in these recognition abilities. Preschool-aged children (N = 55) and their parents and teachers participated in the study. Preschool-aged children completed a web-based measure of emotion recognition skills, which included five tasks (three with faces and two with bodies). Parents and teachers reported on children’s aggressive behaviors and social skills. Children’s emotion accuracy o...

  14. Strategy When Faced with Failure: Persistence and Degree Attainment of Course Repeaters versus Non-Repeaters. AIR 2002 Forum Paper.

    Science.gov (United States)

    Fenton, Kathleen S.

    Graduation and persistence rates were compared for 184 students, 92 of whom had repeated multiple courses or at least 1 course 3 times. A control group of 92 nonrepeating students was drawn from the remaining 303 students of the entire 1996 cohort. There was no difference between the graduation rate of repeaters and nonrepeaters. The persistence…

  15. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces.

    Science.gov (United States)

    Javanbakht, Arash; King, Anthony P; Evans, Gary W; Swain, James E; Angstadt, Michael; Phan, K Luan; Liberzon, Israel

    2015-01-01

    Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. Fifty-two subjects from a longitudinal prospective study recruited as children, participated in a brain imaging study at 23-25 years of age using the Emotional Faces Assessment Task. Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and medial prefrontal cortical (mPFC) responses to threat vs. happy faces. Also, childhood poverty was associated with decreased functional connectivity between left amygdala and mPFC. This study is unique, because it prospectively links childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues.

  16. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces

    Directory of Open Access Journals (Sweden)

    Arash Javanbakht

    2015-06-01

    Full Text Available Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. Fifty-two subjects from a longitudinal prospective study recruited as children, participated in a brain imaging study at 23-25 years of age using the Emotional Faces Assessment Task (EFAT). Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and mPFC responses to threat vs. happy faces. Also, childhood poverty was associated with decreased functional connectivity between left amygdala and mPFC. This study is unique because it prospectively links childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues.

  17. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces

    Science.gov (United States)

    Javanbakht, Arash; King, Anthony P.; Evans, Gary W.; Swain, James E.; Angstadt, Michael; Phan, K. Luan; Liberzon, Israel

    2015-01-01

    Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. Fifty-two subjects from a longitudinal prospective study recruited as children, participated in a brain imaging study at 23–25 years of age using the Emotional Faces Assessment Task. Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and medial prefrontal cortical (mPFC) responses to threat vs. happy faces. Also, childhood poverty was associated with decreased functional connectivity between left amygdala and mPFC. This study is unique, because it prospectively links childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues. PMID:26124712

  18. Training Approach-Avoidance of Smiling Faces Affects Emotional Vulnerability in Socially Anxious Individuals

    Directory of Open Access Journals (Sweden)

    Mike Rinck

    2013-08-01

    Full Text Available Previous research revealed an automatic behavioral bias in high socially anxious individuals (HSAs): Although their explicit evaluations of smiling faces are positive, they show automatic avoidance of these faces. This is reflected by faster pushing than pulling of smiling faces in an Approach-Avoidance Task (AAT; Heuer, Rinck, & Becker, 2007). The current study addressed the causal role of this avoidance bias for social anxiety. To this end, we used the AAT to train HSAs either to approach smiling faces or to avoid them. We examined whether such an AAT training could change HSAs’ automatic avoidance tendencies, and, if so, whether AAT effects would generalize to a new approach task with new facial stimuli, and to mood and anxiety in a social threat situation (a video-recorded self-presentation). We found that HSAs trained to approach smiling faces did indeed approach female faces faster after the training than HSAs trained to avoid smiling faces. Moreover, approach-faces training reduced emotional vulnerability: It led to more positive mood and lower anxiety after the self-presentation than avoid-faces training. These results suggest that automatic approach-avoidance tendencies have a causal role in social anxiety, and that they can be modified by a simple computerized training. This may open new avenues in the therapy of social phobia.

  19. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    Directory of Open Access Journals (Sweden)

    Teresa A Victor

    Full Text Available BACKGROUND: Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. AIMS: To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. METHOD: Unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. RESULTS: The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. CONCLUSIONS: Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  20. Word wins over Face: Emotional Stroop effect activates the frontal cortical network

    Directory of Open Access Journals (Sweden)

    Shima Ovaysikia

    2011-01-01

    Full Text Available The prefrontal cortex (PFC) has been implicated in higher order cognitive control of behaviour. Sometimes such control is executed through suppression of an unwanted response in order to avoid conflict. Conflict occurs when two simultaneously competing processes lead to different behavioral outcomes, as seen in tasks such as the anti-saccade, go/no-go and the Stroop task. We set out to examine whether different types of stimuli in a modified emotional Stroop task would cause similar interference effects as the original colour/word Stroop, and whether the required suppression mechanism(s) would recruit similar regions of the medial PFC (mPFC). By using emotional words and emotional faces in this Stroop experiment, we examined the two well-learned automatic behaviours of word reading and recognition of face expressions. In our emotional Stroop paradigm, words were processed faster than face expressions, with incongruent trials yielding longer reaction times (RT) and a larger number of errors compared to the congruent trials. This novel Stroop effect activated the anterior and inferior regions of the mPFC, namely the anterior cingulate cortex (ACC) and inferior frontal gyrus (IFG), as well as the superior frontal gyrus. Our results suggest that prepotent behaviours such as reading and recognition of face expressions are stimulus-dependent and perhaps hierarchical, hence recruiting distinct regions of the mPFC. Moreover, the faster processing of word reading compared to reporting face expressions is indicative of the formation of stronger stimulus-response (SR) associations of an over-learned behaviour compared to an instinctive one, which could alternatively be explained through the distinction between awareness and selective attention.

  1. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    Directory of Open Access Journals (Sweden)

    Lili Guan

    2017-08-01

    Full Text Available The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  2. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces.

    Science.gov (United States)

    Guan, Lili; Zhao, Yufang; Wang, Yige; Chen, Yujie; Yang, Juan

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another's face; self-face also elicits an enhanced P3 amplitude compared to another's face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  3. Emotional face recognition deficit in amnestic patients with mild cognitive impairment: behavioral and electrophysiological evidence

    OpenAIRE

    Yang L; Zhao X; Wang L; Yu L; Song M; Wang X.

    2015-01-01

    Linlin Yang, Xiaochuan Zhao, Lan Wang, Lulu Yu, Mei Song, Xueyi Wang Department of Mental Health, The First Hospital of Hebei Medical University, Hebei Medical University Institute of Mental Health, Shijiazhuang, People’s Republic of China Abstract: Amnestic mild cognitive impairment (MCI) has been conceptualized as a transitional stage between healthy aging and Alzheimer’s disease. Thus, understanding emotional face recognition deficit in patients with amnestic...

  4. Initial Orientation of Attention towards Emotional Faces in Children with Attention Deficit Hyperactivity Disorder

    Directory of Open Access Journals (Sweden)

    Mehrnoosh Ahmadi

    2011-09-01

    Full Text Available Objective: Early recognition of negative emotions is considered to be of vital importance. It seems that children with attention deficit hyperactivity disorder have some difficulties recognizing facial emotional expressions, especially negative ones. This study investigated the preference of children with attention deficit hyperactivity disorder for negative (angry, sad) facial expressions compared to normal children. Method: Participants were 35 drug-naive boys with ADHD, aged between 6 and 11 years, and 31 matched healthy children. Visual orientation data were recorded while participants viewed face pairs (negative–neutral pairs) shown for 3000 ms. The number of first fixations made to each expression was considered as an index of initial orientation. Results: Group comparisons revealed no difference between the attention deficit hyperactivity disorder group and their matched healthy counterparts in initial orientation of attention. A tendency towards negative emotions was found within the normal group, while no difference was observed between initial allocation of attention toward negative and neutral expressions in children with ADHD. Conclusion: Children with attention deficit hyperactivity disorder do not have a significant preference for negative facial expressions. In contrast, normal children have a significant preference for negative facial emotions rather than neutral faces.

  5. Perception of emotional expressions is independent of face selectivity in monkey inferior temporal cortex.

    Science.gov (United States)

    Hadj-Bouziane, Fadila; Bell, Andrew H; Knusten, Tamara A; Ungerleider, Leslie G; Tootell, Roger B H

    2008-04-08

    The ability to perceive and differentiate facial expressions is vital for social communication. Numerous functional MRI (fMRI) studies in humans have shown enhanced responses to faces with different emotional valence, in both the amygdala and the visual cortex. However, relatively few studies have examined how valence influences neural responses in monkeys, thereby limiting the ability to draw comparisons across species and thus understand the underlying neural mechanisms. Here we tested the effects of macaque facial expressions on neural activation within these two regions using fMRI in three awake, behaving monkeys. Monkeys maintained central fixation while blocks of different monkey facial expressions were presented. Four different facial expressions were tested: (i) neutral, (ii) aggressive (open-mouthed threat), (iii) fearful (fear grin), and (iv) submissive (lip smack). Our results confirmed that both the amygdala and the inferior temporal cortex in monkeys are modulated by facial expressions. As in human fMRI, fearful expressions evoked the greatest response in monkeys-even though fearful expressions are physically dissimilar in humans and macaques. Furthermore, we found that valence effects were not uniformly distributed over the inferior temporal cortex. Surprisingly, these valence maps were independent of two related functional maps: (i) the map of "face-selective" regions (faces versus non-face objects) and (ii) the map of "face-responsive" regions (faces versus scrambled images). Thus, the neural mechanisms underlying face perception and valence perception appear to be distinct.

  6. Cocaine Exposure Is Associated with Subtle Compromises of Infants' and Mothers' Social-Emotional Behavior and Dyadic Features of Their Interaction in the Face-to-Face Still-Face Paradigm

    Science.gov (United States)

    Tronick, E. Z.; Messinger, D. S.; Weinberg, M. K.; Lester, B. M.; LaGasse, L.; Seifer, R.; Bauer, C. R.; Shankaran, S.; Bada, H.; Wright, L. L.; Poole, K.; Liu, J.

    2005-01-01

    Prenatal cocaine and opiate exposure are thought to subtly compromise social and emotional development. The authors observed a large sample of 236 cocaine-exposed and 459 nonexposed infants (49 were opiate exposed and 646 nonexposed) with their mothers in the face-to-face still-face paradigm. Infant and maternal behaviors were microanalytically…

  7. Attentionally modulated effects of cortisol and mood on memory for emotional faces in healthy young males.

    Science.gov (United States)

    Van Honk, J; Kessels, R P C; Putman, P; Jager, G; Koppeschaar, H P F; Postma, A

    2003-10-01

    Heightened cortisol levels due to stress or acute administration seem to enhance memory for emotional material, independently of emotional valence. An arousal-driven neurobiological mechanism involving the amygdala has been proposed. The relation between pre-task salivary measures of cortisol (by convention named 'basal levels') and emotionally modulated memory has not been investigated yet. Given the association between higher basal levels of cortisol and indices of low mood, valence-specific effects on emotionally modulated memory could be expected (e.g. mood-congruent or stimulus-specific forms of processing). This study was designed to investigate the relationship between basal levels of salivary cortisol, self-reported mood and spatial memory for neutral, happy and angry facial expressions in healthy young volunteers (N=31). Memory performance was indexed using a modified version of a computerized object-relocation task, using emotional facial expressions as stimuli. Results showed a significant relation between cortisol and depressive mood. More importantly, both the levels of cortisol and depressive mood were inversely related to the memory performance for the happy facial expressions, while a similar relationship between cortisol and memory performance on angry faces neared significance. An explanation in terms of the down-regulation of social behavior by elevated basal cortisol levels is postulated.

  8. Altered neurophysiological responses to emotional faces discriminate children with ASD, ADHD and ASD+ADHD.

    Science.gov (United States)

    Tye, Charlotte; Battaglia, Marco; Bertoletti, Eleonora; Ashwood, Karen L; Azadi, Bahare; Asherson, Philip; Bolton, Patrick; McLoughlin, Gráinne

    2014-12-01

    There are high rates of overlap between autism spectrum disorder (ASD) and attention deficit hyperactivity disorder (ADHD). Emotional impairment in the two disorders, however, has not been directly compared using event-related potentials (ERPs) that are able to measure distinct temporal stages in emotional processing. The N170 and N400 ERP components were measured during presentation of emotional face stimuli to boys with ASD (n=19), ADHD (n=18), comorbid ASD+ADHD (n=29) and typically developing controls (n=26). Subjects with ASD (ASD/ASD+ADHD) displayed reduced N170 amplitude across all stimuli, particularly for fearful versus neutral facial expressions. Conversely, subjects with ADHD (ADHD/ASD+ADHD) demonstrated reduced modulation of N400 amplitude by fearful expressions in parietal scalp regions and happy facial expressions in central scalp regions. These findings indicate a dissociation between disorders on the basis of distinct stages of emotion processing; while children with ASD show alterations at the structural encoding stage, children with ADHD display abnormality at the contextual processing stage. The comorbid ASD+ADHD group presents as an additive condition with the unique deficits of both disorders. This supports the use of objective neural measurement of emotional processing to delineate pathophysiological mechanisms in complex overlapping disorders. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Naso-Temporal Asymmetries: Suppression of Emotional Faces in the Temporal Visual Hemifield

    Science.gov (United States)

    Framorando, David; Bapst, Mylène; Vuille, Nathalie; Pegna, Alan J.

    2017-01-01

    An ongoing debate exists regarding the possible existence of a retino-tectal visual pathway projecting to the amygdala, which would rapidly process information involving threatening or behaviorally-relevant stimuli. It has been suggested that this route might be responsible for the involuntary capture of attention by potentially dangerous stimuli. In separate studies, anatomical evidence has suggested that the retino-tectal pathway relies essentially on projections from the nasal hemiretina (temporal visual field). In this study, we chose to take advantage of this anatomical difference to further investigate whether emotional facial expressions are indeed processed through a subcortical pathway. Using EEG, participants performed a monocular spatial attention paradigm in which lateralized, task-irrelevant distractors were presented, followed by a target. The distractors were fearful faces that appeared either in nasal or temporal visual hemifield (by virtue of their monocular presentations), while the neutral face was presented simultaneously on the opposite side. Participants were asked to identify a target letter that appeared subsequently in the nasal or temporal visual hemifield. Event-related potentials (ERPs) results revealed that fearful faces appearing in the temporal visual hemifield produced a strong inhibitory response, while a negative deflection reflecting attentional capture followed presentations of fear in the nasal hemifield. These effects can be explained by a greater sensitivity of the subcortical pathway for emotional stimuli. Fearful faces conveyed through this route are processed more effectively, consequently necessitating more vigorous suppression in order for targets to be dealt with adequately. PMID:28197067

  10. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers.

    Science.gov (United States)

    Thomas, Laura A; Brotman, Melissa A; Bones, Brian L; Chen, Gang; Rosen, Brooke H; Pine, Daniel S; Leibenluft, Ellen

    2014-04-01

    Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N=20), SMD (N=18), and HV (N=22) during "Aware" and "Non-aware" priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms), and shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders.

  11. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers

    Directory of Open Access Journals (Sweden)

    Laura A. Thomas

    2014-04-01

    Full Text Available Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N = 20), SMD (N = 18), and HV (N = 22) during “Aware” and “Non-aware” priming of shapes by emotional faces. Subjects rated how much they liked the shape. In aware, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In non-aware, a face appeared (17 ms), followed by a mask (170 ms), and shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders.

  12. Recognition memory for low- and high-frequency-filtered emotional faces: Low spatial frequencies drive emotional memory enhancement, whereas high spatial frequencies drive the emotion-induced recognition bias.

    Science.gov (United States)

    Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk

    2017-02-17

    This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement-that is, better long-term memory for emotional than for neutral stimuli-and the emotion-induced recognition bias-that is, a more liberal response criterion for emotional than for neutral stimuli. Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role for the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus features account. The double dissociation in the results favors the latter account-that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.

  13. Effect of positive emotion on consolidation of memory for faces: the modulation of facial valence and facial gender.

    Science.gov (United States)

    Wang, Bo

    2013-01-01

    Studies have shown that emotion elicited after learning enhances memory consolidation. However, no prior studies have used facial photos as stimuli. This study examined the effect of post-learning positive emotion on consolidation of memory for faces. During learning, participants viewed neutral, positive, or negative faces. They were then assigned to a condition in which they watched either a 9-minute positive video clip or a 9-minute neutral video. Thirty minutes after learning, participants took a surprise memory test, in which they made "remember", "know", and "new" judgements. The findings are: (1) Positive emotion enhanced consolidation of recognition for negative male faces, but impaired consolidation of recognition for negative female faces; (2) For males, recognition for negative faces was equivalent to that for positive faces; for females, recognition for negative faces was better than that for positive faces. Our study provides important evidence that the effect of post-learning emotion on memory consolidation can extend to facial stimuli and that such an effect can be modulated by facial valence and facial gender. The findings may shed light on establishing models concerning the influence of emotion on memory consolidation.

  14. Scanning patterns of faces do not explain impaired emotion recognition in Huntington Disease: Evidence for a high-level mechanism

    Directory of Open Access Journals (Sweden)

    Marieke van Asselen

    2012-02-01

    Full Text Available Previous studies in patients with amygdala lesions suggested that deficits in emotion recognition might be mediated by impaired scanning patterns of faces. Here we investigated whether scanning patterns also contribute to the selective impairment in recognition of disgust in Huntington disease (HD). To achieve this goal, we recorded eye movements during a two-alternative forced-choice emotion recognition task. HD patients in presymptomatic (n=16) and symptomatic (n=9) disease stages were tested and their performance was compared to a control group (n=22). In our emotion recognition task, participants had to indicate whether a face reflected one of six basic emotions. In addition, and in order to determine whether emotion recognition was altered when participants were forced to look at a specific component of the face, we used a second task in which only limited facial information was provided (eyes/mouth in partially masked faces). Behavioural results showed no differences in the ability to recognize emotions between presymptomatic gene carriers and controls. However, an emotion recognition deficit was found for all six basic emotion categories in early-stage HD. Analysis of eye movement patterns showed that patients and controls used similar scanning strategies. Patterns of deficits were similar regardless of whether parts of the faces were masked or not, thereby confirming that selective attention to particular face parts does not underlie the deficits. These results suggest that the emotion recognition deficits in symptomatic HD patients cannot be explained by impaired scanning patterns of faces. Furthermore, no selective deficit for recognition of disgust was found in presymptomatic HD patients.

  15. Personality, Attentional Biases towards Emotional Faces and Symptoms of Mental Disorders in an Adolescent Sample.

    Science.gov (United States)

    O'Leary-Barrett, Maeve; Pihl, Robert O; Artiges, Eric; Banaschewski, Tobias; Bokde, Arun L W; Büchel, Christian; Flor, Herta; Frouin, Vincent; Garavan, Hugh; Heinz, Andreas; Ittermann, Bernd; Mann, Karl; Paillère-Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Poustka, Luise; Rietschel, Marcella; Robbins, Trevor W; Smolka, Michael N; Ströhle, Andreas; Schumann, Gunter; Conrod, Patricia J

    2015-01-01

    To investigate the role of personality factors and attentional biases towards emotional faces in establishing concurrent and prospective risk for mental disorder diagnosis in adolescence. Data were obtained as part of the IMAGEN study, conducted across 8 European sites with a community sample of 2257 adolescents. At 14 years, participants completed an emotional variant of the dot-probe task, as well as two personality measures, namely the Substance Use Risk Profile Scale and the revised NEO Personality Inventory. At 14 and 16 years, participants and their parents were interviewed to determine symptoms of mental disorders. Personality traits were general and specific risk indicators for mental disorders at 14 years. Increased specificity was obtained when investigating the likelihood of mental disorders over a 2-year period, with the Substance Use Risk Profile Scale showing incremental validity over the NEO Personality Inventory. Attentional biases to emotional faces did not characterise or predict the mental disorders examined in the current sample. Personality traits can indicate concurrent and prospective risk for mental disorders in a community youth sample, and identify at-risk youth beyond the impact of baseline symptoms. This study does not support the hypothesis that attentional biases mediate the relationship between personality and psychopathology in a community sample. Task and sample characteristics that contribute to differing results among studies are discussed.

  16. How is This Child Feeling? Preschool-Aged Children's Ability to Recognize Emotion in Faces and Body Poses

    Science.gov (United States)

    Parker, Alison E.; Mathis, Erin T.; Kupersmidt, Janis B.

    2013-01-01

    Research Findings: The study examined children's recognition of emotion from faces and body poses, as well as gender differences in these recognition abilities. Preschool-aged children ("N" = 55) and their parents and teachers participated in the study. Preschool-aged children completed a web-based measure of emotion recognition skills that…

  17. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    Science.gov (United States)

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2013-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of…

  19. Emotional face recognition in adolescent suicide attempters and adolescents engaging in non-suicidal self-injury.

    Science.gov (United States)

    Seymour, Karen E; Jones, Richard N; Cushman, Grace K; Galvan, Thania; Puzia, Megan E; Kim, Kerri L; Spirito, Anthony; Dickstein, Daniel P

    2016-03-01

    Little is known about the bio-behavioral mechanisms underlying and differentiating suicide attempts from non-suicidal self-injury (NSSI) in adolescents. Adolescents who attempt suicide or engage in NSSI often report significant interpersonal and social difficulties. Emotional face recognition ability is a fundamental skill required for successful social interactions, and deficits in this ability may provide insight into the unique brain-behavior interactions underlying suicide attempts versus NSSI in adolescents. Therefore, we examined emotional face recognition ability among three mutually exclusive groups: (1) inpatient adolescents who attempted suicide (SA, n = 30); (2) inpatient adolescents engaged in NSSI (NSSI, n = 30); and (3) typically developing controls (TDC, n = 30) without psychiatric illness. Participants included adolescents aged 13-17 years, matched on age, gender and full-scale IQ. Emotional face recognition was evaluated using the Diagnostic Assessment of Nonverbal Accuracy (DANVA-2). Compared to TDC youth, adolescents with NSSI made more errors on child fearful and adult sad face recognition while controlling for psychopathology and medication status. Adolescent inpatients engaged in NSSI showed greater deficits in emotional face recognition than TDC, but not than inpatient adolescents who attempted suicide. Further, results suggest the importance of psychopathology in emotional face recognition. Replication of these preliminary results and examination of the role of context-dependent emotional processing are needed moving forward.

  20. Evaluating faces on trustworthiness: an extension of systems for recognition of emotions signaling approach/avoidance behaviors.

    Science.gov (United States)

    Todorov, Alexander

    2008-03-01

    People routinely make various trait judgments from facial appearance, and such judgments affect important social outcomes. These judgments are highly correlated with each other, reflecting the fact that valence evaluation permeates trait judgments from faces. Trustworthiness judgments best approximate this evaluation, consistent with evidence about the involvement of the amygdala in the implicit evaluation of face trustworthiness. Based on computer modeling and behavioral experiments, I argue that face evaluation is an extension of functionally adaptive systems for understanding the communicative meaning of emotional expressions. Specifically, in the absence of diagnostic emotional cues, trustworthiness judgments are an attempt to infer behavioral intentions signaling approach/avoidance behaviors. Correspondingly, these judgments are derived from facial features that resemble emotional expressions signaling such behaviors: happiness and anger for the positive and negative ends of the trustworthiness continuum, respectively. The emotion overgeneralization hypothesis can explain highly efficient but not necessarily accurate trait judgments from faces, a pattern that appears puzzling from an evolutionary point of view and also generates novel predictions about brain responses to faces. Specifically, this hypothesis predicts a nonlinear response in the amygdala to face trustworthiness, confirmed in functional magnetic resonance imaging (fMRI) studies, and dissociations between processing of facial identity and face evaluation, confirmed in studies with developmental prosopagnosics. I conclude with some methodological implications for the study of face evaluation, focusing on the advantages of formally modeling representation of faces on social dimensions.

  1. Spatiotemporal brain dynamics of emotional face processing modulations induced by the serotonin 1A/2A receptor agonist psilocybin.

    Science.gov (United States)

    Bernasconi, Fosco; Schmidt, André; Pokorny, Thomas; Kometer, Michael; Seifritz, Erich; Vollenweider, Franz X

    2014-12-01

    Emotional face processing is critically modulated by the serotonergic system. For instance, emotional face processing is impaired by acute psilocybin administration, a serotonin (5-HT) 1A and 2A receptor agonist. However, the spatiotemporal brain mechanisms underlying these modulations are poorly understood. Here, we investigated the spatiotemporal brain dynamics underlying psilocybin-induced modulations during emotional face processing. Electrical neuroimaging analyses were applied to visual evoked potentials in response to emotional faces, following psilocybin and placebo administration. Our results indicate a first time period of strength (i.e., Global Field Power) modulation over the 168-189 ms poststimulus interval, induced by psilocybin. A second time period of strength modulation was identified over the 211-242 ms poststimulus interval. Source estimations over these 2 time periods further revealed decreased activity in response to both neutral and fearful faces within limbic areas, including amygdala and parahippocampal gyrus, and the right temporal cortex over the 168-189 ms interval, and reduced activity in response to happy faces within limbic and right temporo-occipital brain areas over the 211-242 ms interval. Our results indicate a selective and temporally dissociable effect of psilocybin on the neuronal correlates of emotional face processing, consistent with a modulation of the top-down control.
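
    The "strength" measure referenced above, Global Field Power, is conventionally the standard deviation of the voltage across all electrodes at a given time point (Lehmann & Skrandies). A minimal sketch, not the authors' analysis pipeline:

```python
import math

def global_field_power(voltages):
    """Global Field Power at a single time point: the standard deviation
    of the scalp voltage across all electrodes."""
    n = len(voltages)
    mean = sum(voltages) / n
    return math.sqrt(sum((v - mean) ** 2 for v in voltages) / n)

# A flat topography has zero GFP; a strong +/- topography has high GFP.
```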

  2. Face scanning and spontaneous emotion preference in Cornelia de Lange syndrome and Rubinstein-Taybi syndrome.

    Science.gov (United States)

    Crawford, Hayley; Moss, Joanna; McCleery, Joseph P; Anderson, Giles M; Oliver, Chris

    2015-01-01

    Existing literature suggests differences in face scanning in individuals with different socio-behavioural characteristics. Cornelia de Lange syndrome (CdLS) and Rubinstein-Taybi syndrome (RTS) are two genetically defined neurodevelopmental disorders with unique profiles of social behaviour. Here, we examine eye gaze to the eye and mouth regions of neutrally expressive faces, as well as the spontaneous visual preference for happy and disgusted facial expressions compared to neutral faces, in individuals with CdLS versus RTS. Results indicate that the amount of time spent looking at the eye and mouth regions of faces was similar in 15 individuals with CdLS and 17 individuals with RTS. Both participant groups also showed a similar pattern of spontaneous visual preference for emotions. These results provide insight into two rare, genetically defined neurodevelopmental disorders that have been reported to exhibit contrasting socio-behavioural characteristics and suggest that differences in social behaviour may not be sufficient to predict attention to the eye region of faces. These results also suggest that differences in the social behaviours of these two groups may be cognitively mediated rather than subcortically mediated.

  3. Paranoid males have reduced lateralisation for processing of negative emotions: an investigation using the chimeric faces test.

    Science.gov (United States)

    Bourne, Victoria J; McKay, Ryan T

    2014-01-01

    Reduced strength of lateralisation in patients with schizophrenia has been reported in a number of studies. However, the exact nature of this relationship remains unclear. In this study, lateralisation for processing emotional faces was measured using the chimeric faces test and examined in relation to paranoia in a non-clinical sample. For males only, those with higher scores on a paranoia questionnaire had reduced lateralisation for processing negative facial emotion. For females there were no significant relationships. These findings suggest that atypical patterns of lateralisation for processing emotional stimuli may be implicated in, or associated with, increased levels of paranoia.

  4. Recognition of Immaturity and Emotional Expressions in Blended Faces by Children with Autism and Other Developmental Disabilities

    Science.gov (United States)

    Gross, Thomas F.

    2008-01-01

    The recognition of facial immaturity and emotional expression by children with autism, language disorders, mental retardation, and non-disabled controls was studied in two experiments. Children identified immaturity and expression in upright and inverted faces. The autism group identified fewer immature faces and expressions than control (Exp. 1 &…

  5. Age-related changes in emotional face processing across childhood and into young adulthood: Evidence from event-related potentials.

    Science.gov (United States)

    MacNamara, Annmarie; Vergés, Alvaro; Kujawa, Autumn; Fitzgerald, Kate D; Monk, Christopher S; Phan, K Luan

    2016-01-01

    Socio-emotional processing is an essential part of development, and age-related changes in its neural correlates can be observed. The late positive potential (LPP) is a measure of motivated attention that can be used to assess emotional processing; however, changes in the LPP elicited by emotional faces have not been assessed across a wide age range in childhood and young adulthood. We used an emotional face matching task to examine behavior and event-related potentials (ERPs) in 33 youth aged 7-19 years old. Younger children were slower when performing the matching task. The LPP elicited by emotional faces but not control stimuli (geometric shapes) decreased with age; by contrast, an earlier ERP (the P1) decreased with age for both faces and shapes, suggesting increased efficiency of early visual processing. Results indicate age-related attenuation in emotional processing that may stem from greater efficiency and regulatory control when performing a socio-emotional task. © 2015 Wiley Periodicals, Inc.

  6. Association between Individual Differences in Self-Reported Emotional Resilience and the Affective Perception of Neutral Faces

    Science.gov (United States)

    Arce, Estibaliz; Simmons, Alan N.; Stein, Murray B.; Winkielman, Piotr; Hitchcock, Carla; Paulus, Martin P.

    2009-01-01

    Background: Resilience, i.e., the ability to cope with stress and adversity, relies heavily on judging adaptively complex situations. Judging facial emotions is a complex process of daily living that is important for evaluating the affective context of uncertain situations, which could be related to the individual's level of resilience. We used a novel experimental paradigm to test the hypothesis that highly resilient individuals show a judgment bias towards positive emotions. Methods: 65 non-treatment-seeking subjects completed a forced emotional choice task when presented with neutral faces and faces morphed to display a range of emotional intensities across sadness, fear, and happiness. Results: Overall, neutral faces were judged more often to be sad or fearful than happy. Furthermore, high compared to low resilient individuals showed a bias towards happiness, particularly when judging neutral faces. Limitations: This is a cross-sectional study with a non-clinical sample. Conclusions: These results support the hypothesis that resilient individuals show a bias towards positive emotions when faced with uncertain emotional expressions. This capacity may contribute to their ability to better cope with certain types of difficult situations, perhaps especially those that are interpersonal in nature. PMID:18957273

  7. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion.

    Science.gov (United States)

    Xiu, Daiming; Geiger, Maximilian J; Klaver, Peter

    2015-01-01

    This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data was acquired during incidental learning of positive ("happy"), neutral and negative ("angry" or "fearful") faces. Dynamic Causal Modeling (DCM) was applied on the functional magnetic resonance imaging (fMRI) data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and successful memory formation related areas (hippocampus, superior parietal lobule, amygdala, and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feed forward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  8. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion

    Directory of Open Access Journals (Sweden)

    Daiming eXiu

    2015-04-01

    Full Text Available This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data was acquired during incidental learning of positive ('happy'), neutral and negative ('angry' or 'fearful') faces. Dynamic Causal Modeling (DCM) was applied on the fMRI data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and successful memory formation related areas (hippocampus, superior parietal lobule, amygdala and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feed forward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  9. Neural correlates of perceiving emotional faces and bodies in developmental prosopagnosia: an event-related fMRI-study.

    Directory of Open Access Journals (Sweden)

    Jan Van den Stock

    Full Text Available Many people experience transient difficulties in recognizing faces, but only a small number of them cannot recognize their family members when meeting them unexpectedly. Such face blindness is associated with serious problems in everyday life. A better understanding of the neuro-functional basis of impaired face recognition may be achieved by a careful comparison with an equally unique object category and by adding a more realistic setting involving neutral faces as well as facial expressions. We used event-related functional magnetic resonance imaging (fMRI) to investigate the neuro-functional basis of perceiving faces and bodies in three developmental prosopagnosics (DP) and matched healthy controls. Our approach involved materials consisting of neutral faces and bodies as well as faces and bodies expressing fear or happiness. The first main result is that the presence of emotional information has a different effect in the patient vs. the control group in the fusiform face area (FFA). Neutral faces trigger lower activation in the DP group, compared to the control group, while activation for facial expressions is the same in both groups. The second main result is that, compared to controls, DPs have increased activation for bodies in the inferior occipital gyrus (IOG) and for neutral faces in the extrastriate body area (EBA), indicating that body- and face-sensitive processes are less categorically segregated in DP. Taken together, our study shows the importance of using naturalistic emotional stimuli for a better understanding of developmental face deficits.

  10. Telecounselling and face to face hearing aid verification: repeated measures

    Directory of Open Access Journals (Sweden)

    Deborah Viviane Ferrari

    2012-12-01

    Full Text Available PURPOSE: to evaluate repeated measurements of the real ear unaided response (REUR), real ear aided response (REAR) and real ear insertion gain (REIG), conducted face to face (F) and via teleconsultation (T). METHOD: longitudinal prospective study. Four repetitions of the REUR, REAR and REIG (with a speech noise stimulus presented at 65 dB SPL at 0° azimuth) were obtained in 19 ears of normal-hearing adults, face to face and via synchronous teleconsultation with remote control of the real-ear equipment and interactive video. Polycom PVX software was used for desktop sharing and audio and video transmission, over a USP Local Area Network (LAN) connection at 384 kbps. Dahlberg's casual error between the four measurements was calculated for frequencies from 250 to 8000 Hz. RESULTS: the casual errors between the F and T measurements were very similar, being larger at frequencies above 4 kHz. The differences and variations between the F and T measurements were within the magnitude of variability of the probe microphone measurement procedure. CONCLUSION: probe microphone measurements via teleconsultation provide reliable results, similar to those obtained with the standard procedure.
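
    Two quantities in this record have simple closed forms: the insertion gain is the aided minus the unaided response at each measured frequency (REIG = REAR - REUR), and Dahlberg's casual error between two repetitions of a measurement is sqrt(sum(d^2) / (2n)) over the paired differences d. A minimal sketch, illustrative only and not the study's analysis code:

```python
import math

def insertion_gain(rear_db, reur_db):
    """Real-ear insertion gain (REIG): aided minus unaided response,
    frequency by frequency, in dB."""
    return [a - u for a, u in zip(rear_db, reur_db)]

def dahlberg_error(first, second):
    """Dahlberg's casual error for two repeated measurement series:
    sqrt(sum(d_i**2) / (2 * n)), with d_i = first_i - second_i."""
    diffs = [a - b for a, b in zip(first, second)]
    return math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))
```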

  11. Early handling and repeated cross-fostering have opposite effect on mouse emotionality.

    Directory of Open Access Journals (Sweden)

    Alessandra eLuchetti

    2015-04-01

    Full Text Available Early life events have a crucial role in programming the individual phenotype, and exposure to traumatic experiences during infancy can increase later risk for a variety of neuropsychiatric conditions, including mood and anxiety disorders. Animal models of postnatal stress have been developed in rodents to explore the molecular mechanisms responsible for the observed short- and long-lasting neurobiological effects of such manipulations. The main aim of this study was to compare the behavioral and hormonal phenotype of young and adult animals exposed to different postnatal treatments. Outbred mice were exposed to (i) the classical Handling protocol (H: 15 min/day of separation from the mother from day 1 to 14 of life) or to (ii) a Repeated Cross-Fostering protocol (RCF: adoption of litters from day 1 to 4 of life by different dams). Handled mice received more maternal care in infancy and showed the already-described reduced emotionality at adulthood. Repeated cross-fostered animals did not differ in maternal care received, but showed enhanced sensitivity to separation from the mother in infancy and an altered respiratory response to 6% CO2 in breathing air in comparison with controls. Abnormal respiratory responses to hypercapnia are commonly found among humans with panic disorder, and point to RCF-induced instability of the early environment as a valid developmental model for panic disorder. The comparisons between short- and long-term effects of postnatal handling vs. RCF indicate that different types of early adversities are associated with different behavioral profiles, and evoke psychopathologies that can be distinguished according to the neurobiological systems disrupted by early-life manipulations.

  12. Emotional expectations influence neural sensitivity to fearful faces in humans: An event-related potential study

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The present study tested whether neural sensitivity to salient emotional facial expressions was influenced by emotional expectations induced by a cue that validly predicted the expression of a subsequently presented target face. Event-related potentials (ERPs) elicited by fearful and neutral faces were recorded while participants performed a gender discrimination task under cued (‘expected’) and uncued (‘unexpected’) conditions. The behavioral results revealed that accuracy was lower for fearful compared with neutral faces in the unexpected condition, while accuracy was similar for fearful and neutral faces in the expected condition. ERP data revealed increased amplitudes in the P2 component and 200–250 ms interval for unexpected fearful versus neutral faces. By contrast, ERP responses were similar for fearful and neutral faces in the expected condition. These findings indicate that human neural sensitivity to fearful faces is modulated by emotional expectations. Although the neural system is sensitive to unpredictable emotionally salient stimuli, sensitivity to salient stimuli is reduced when these stimuli are predictable.
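
    ERP effects such as the P2 modulation over 200–250 ms are typically quantified as the mean voltage within a fixed poststimulus window. A hedged sketch of that scoring step, assuming a waveform sampled at a given rate with sample 0 at stimulus onset (the original study's exact parameters are not given here):

```python
def mean_amplitude(erp_uv, sfreq_hz, t_start_s, t_end_s):
    """Mean ERP amplitude (microvolts) in the poststimulus window
    [t_start_s, t_end_s), for a waveform sampled at sfreq_hz with
    sample index 0 at stimulus onset."""
    i0 = int(round(t_start_s * sfreq_hz))
    i1 = int(round(t_end_s * sfreq_hz))
    window = erp_uv[i0:i1]
    return sum(window) / len(window)
```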

  13. Reduced amygdala and ventral striatal activity to happy faces in PTSD is associated with emotional numbing.

    Directory of Open Access Journals (Sweden)

    Kim L Felmingham

    Full Text Available There has been a growing recognition of the importance of reward processing in PTSD, yet little is known of the underlying neural networks. This study tested the predictions that (1) individuals with PTSD would display reduced responses to happy facial expressions in ventral striatal reward networks, and (2) that this reduction would be associated with emotional numbing symptoms. 23 treatment-seeking patients with Posttraumatic Stress Disorder were recruited from the treatment clinic at the Centre for Traumatic Stress Studies, Westmead Hospital, and 20 trauma-exposed controls were recruited from a community sample. We examined functional magnetic resonance imaging responses during the presentation of happy and neutral facial expressions in a passive viewing task. PTSD participants rated happy facial expressions as less intense than trauma-exposed controls did. Relative to controls, PTSD participants revealed lower activation to happy (vs. neutral) faces in the ventral striatum and a trend for reduced activation in the left amygdala. A significant negative correlation was found between emotional numbing symptoms in PTSD and activity in right ventral striatal regions after controlling for depression, anxiety and PTSD severity. This study provides initial evidence that individuals with PTSD have lower reactivity to happy facial expressions, and that lower activation in ventral striatal-limbic reward networks may be associated with symptoms of emotional numbing.

  14. Impaired Integration of Emotional Faces and Affective Body Context in a Rare Case of Developmental Visual Agnosia

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran. R.; Bentin, Shlomo

    2011-01-01

    In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG's agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally-congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally-incongruent bodies. The results are interpreted in the context of a general integration deficit in developmental visual agnosia, suggesting that impaired integration may extend from the level of the face to the level of the full person. PMID:21482423

  15. Impaired integration of emotional faces and affective body context in a rare case of developmental visual agnosia.

    Science.gov (United States)

    Aviezer, Hillel; Hassin, Ran R; Bentin, Shlomo

    2012-06-01

    In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia (DVA) who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG's agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally-congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally-incongruent bodies. The results are interpreted in the context of a general integration deficit in DVA, suggesting that impaired integration may extend from the level of the face to the level of the full person.

  16. The Relationship between Early Neural Responses to Emotional Faces at Age 3 and Later Autism and Anxiety Symptoms in Adolescents with Autism

    Science.gov (United States)

    Neuhaus, Emily; Jones, Emily J. H.; Barnes, Karen; Sterling, Lindsey; Estes, Annette; Munson, Jeff; Dawson, Geraldine; Webb, Sara J.

    2016-01-01

    Both autism spectrum (ASD) and anxiety disorders are associated with atypical neural and attentional responses to emotional faces, differing in affective face processing from typically developing peers. Within a longitudinal study of children with ASD (23 male, 3 female), we hypothesized that early ERPs to emotional faces would predict concurrent…

  17. Emotional face recognition deficits and medication effects in pre-manifest through stage-II Huntington's disease.

    Science.gov (United States)

    Labuschagne, Izelle; Jones, Rebecca; Callaghan, Jenny; Whitehead, Daisy; Dumas, Eve M; Say, Miranda J; Hart, Ellen P; Justo, Damian; Coleman, Allison; Dar Santos, Rachelle C; Frost, Chris; Craufurd, David; Tabrizi, Sarah J; Stout, Julie C

    2013-05-15

    Facial emotion recognition impairments have been reported in Huntington's disease (HD). However, the nature of the impairments across the spectrum of HD remains unclear. We report on emotion recognition data from 344 participants comprising premanifest HD (PreHD) and early HD patients, and controls. In a test of recognition of facial emotions, we examined responses to six basic emotional expressions and neutral expressions. In addition, and within the early HD sample, we tested for differences on emotion recognition performance between those 'on' vs. 'off' neuroleptic or selective serotonin reuptake inhibitor (SSRI) medications. The PreHD groups showed significantly impaired recognition, compared to controls, of fearful, angry and surprised faces; whereas the early HD groups were significantly impaired across all emotions including neutral expressions. In early HD, neuroleptic use was associated with worse facial emotion recognition, whereas SSRI use was associated with better facial emotion recognition. The findings suggest that emotion recognition impairments exist across the HD spectrum, but are relatively more widespread in manifest HD than in the premanifest period. Commonly prescribed medications to treat HD-related symptoms also appear to affect emotion recognition. These findings have important implications for interpersonal communication and medication usage in HD.

  18. Enhancing Emotion Recognition in Children with Autism Spectrum Conditions: An Intervention Using Animated Vehicles with Real Emotional Faces

    Science.gov (United States)

    Golan, Ofer; Ashwin, Emma; Granader, Yael; McClintock, Suzy; Day, Kate; Leggett, Victoria; Baron-Cohen, Simon

    2010-01-01

    This study evaluated "The Transporters", an animated series designed to enhance emotion comprehension in children with autism spectrum conditions (ASC). n = 20 children with ASC (aged 4-7) watched "The Transporters" every day for 4 weeks. Participants were tested before and after intervention on emotional vocabulary and emotion recognition at three…

  19. Efficacy of identifying neural components in the face and emotion processing system in schizophrenia using a dynamic functional localizer.

    Science.gov (United States)

    Arnold, Aiden E G F; Iaria, Giuseppe; Goghari, Vina M

    2016-02-28

    Schizophrenia is associated with deficits in face perception and emotion recognition. Despite consistent behavioural results, the neural mechanisms underlying these cognitive abilities have been difficult to isolate, in part due to differences in neuroimaging methods used between studies for identifying regions in the face processing system. Given this problem, we aimed to validate a recently developed fMRI-based dynamic functional localizer task for use in studies of psychiatric populations and specifically schizophrenia. Previously, this functional localizer successfully identified each of the core face processing regions (i.e. fusiform face area, occipital face area, superior temporal sulcus), and regions within an extended system (e.g. amygdala) in healthy individuals. In this study, we tested the functional localizer success rate in 27 schizophrenia patients and in 24 community controls. Overall, the core face processing regions were localized equally well in both the schizophrenia and control groups. Additionally, the amygdala, a candidate brain region from the extended system, was identified in nearly half the participants from both groups. These results indicate the effectiveness of a dynamic functional localizer at identifying regions of interest associated with face perception and emotion recognition in schizophrenia. The use of dynamic functional localizers may help standardize the investigation of the facial and emotion processing system in this and other clinical populations.

  20. [Decrease in N170 evoked potential component latency during repeated presentation of face images].

    Science.gov (United States)

    Verkhliutov, V M; Ushakov, V L; Strelets, V B

    2009-01-01

    EEG was recorded from 28 channels in 15 healthy volunteers during the presentation of visual stimuli in the form of face and building images. The stimuli were presented in two series. The first series consisted of 60 face and 60 building images presented in random order. The second series consisted of 30 face and 30 building images and began 1.5-2 min after the end of the first one. No instruction was given to the participants. P1, N170 and VPP EP components were identified for both stimulus categories. These components were located in the medial parietal area (Brodmann area 40). P1 and N170 components were recorded in the superior temporal fissure (Brodmann area 21, STS region), with latencies of 120 ms and 155 ms, respectively. VPP was recorded with a latency of 190 ms (Brodmann area 19). Dynamic mapping of EP components with latencies from 97 to 242 ms revealed a movement of positive maxima from occipital to frontal areas through the temporal areas and their subsequent return to occipital areas through the central ones. Comparison of EP components to face and building images revealed amplitude differences in the following areas: P1--in frontal, central and anterior temporal areas; N170--in frontal, central, temporal and parietal areas; VPP--in all areas. It was also revealed that N170 latency was 12 ms shorter for face than for building images. It is proposed that this N170 latency decrease for face compared with building images is connected with the different spatial locations of the fusiform areas responsible for face and building image recognition. The priming effect revealed during repeated presentation of face images is interpreted as a manifestation of functional heterogeneity of the fusiform area responsible for face image recognition. The hypothesis is put forward that the parts of extrastriate cortex which are located closer to the central retinotopical

  1. Avoidant decision making in social anxiety: The interaction of angry faces and emotional responses

    Directory of Open Access Journals (Sweden)

    Andre ePittig

    2014-09-01

    Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis)advantageous to maximize overall gain. To create a decision conflict between approach of rewards and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision-making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety.

  2. Avoidant decision making in social anxiety: the interaction of angry faces and emotional responses.

    Science.gov (United States)

    Pittig, Andre; Pawlikowski, Mirko; Craske, Michelle G; Alpers, Georg W

    2014-01-01

    Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis)advantageous to maximize overall gain. To create a decision conflict between approach of reward and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision-making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety.

  3. Multisensory integration of dynamic emotional faces and voices: method for simultaneous EEG-fMRI measurements

    Directory of Open Access Journals (Sweden)

    Patrick David Schelenz

    2013-11-01

    Combined EEG-fMRI analysis correlates time courses from single electrodes or independent EEG components with the hemodynamic response. Implementing information from only one electrode, however, may miss relevant information from complex electrophysiological networks. Component based analysis, in turn, depends on a priori knowledge of the signal topography. Complex designs such as studies on multisensory integration of emotions investigate subtle differences in distributed networks based on only a few trials per condition. Thus, they require a sensitive and comprehensive approach which does not rely on a priori knowledge about the underlying neural processes. In this pilot study, feasibility and sensitivity of source localization-driven analysis for EEG-fMRI was tested using a multisensory integration paradigm. Dynamic audiovisual stimuli consisting of emotional talking faces and pseudowords with emotional prosody were rated in a delayed response task. The trials comprised affectively congruent and incongruent displays. In addition to event-locked EEG and fMRI analyses, induced oscillatory EEG responses at estimated cortical sources and in specific temporo-spectral windows were correlated with the corresponding BOLD responses. EEG analysis showed high data quality with less than 10% trial rejection. In an early time window, alpha oscillations were suppressed in bilateral occipital cortices, and fMRI analysis confirmed high data quality with reliable activation in auditory, visual and frontal areas to the presentation of multisensory stimuli. In line with previous studies, we obtained reliable correlation patterns for event-locked occipital alpha suppression and BOLD signal time course. Our results suggest a valid methodological approach to investigate complex stimuli using the present source localization driven method for EEG-fMRI. This novel procedure may help to investigate combined EEG-fMRI data from novel complex paradigms with high spatial and

  4. Multisensory integration of dynamic emotional faces and voices: method for simultaneous EEG-fMRI measurements.

    Science.gov (United States)

    Schelenz, Patrick D; Klasen, Martin; Reese, Barbara; Regenbogen, Christina; Wolf, Dhana; Kato, Yutaka; Mathiak, Klaus

    2013-01-01

    Combined EEG-fMRI analysis correlates time courses from single electrodes or independent EEG components with the hemodynamic response. Implementing information from only one electrode, however, may miss relevant information from complex electrophysiological networks. Component based analysis, in turn, depends on a priori knowledge of the signal topography. Complex designs such as studies on multisensory integration of emotions investigate subtle differences in distributed networks based on only a few trials per condition. Thus, they require a sensitive and comprehensive approach which does not rely on a priori knowledge about the underlying neural processes. In this pilot study, feasibility and sensitivity of source localization-driven analysis for EEG-fMRI was tested using a multisensory integration paradigm. Dynamic audiovisual stimuli consisting of emotional talking faces and pseudowords with emotional prosody were rated in a delayed response task. The trials comprised affectively congruent and incongruent displays. In addition to event-locked EEG and fMRI analyses, induced oscillatory EEG responses at estimated cortical sources and in specific temporo-spectral windows were correlated with the corresponding BOLD responses. EEG analysis showed high data quality with less than 10% trial rejection. In an early time window, alpha oscillations were suppressed in bilateral occipital cortices and fMRI analysis confirmed high data quality with reliable activation in auditory, visual and frontal areas to the presentation of multisensory stimuli. In line with previous studies, we obtained reliable correlation patterns for event-locked occipital alpha suppression and BOLD signal time course. Our results suggest a valid methodological approach to investigate complex stimuli using the present source localization driven method for EEG-fMRI. This novel procedure may help to investigate combined EEG-fMRI data from novel complex paradigms with high spatial and temporal

  5. Characterization and recognition of mixed emotional expressions in thermal face image

    Science.gov (United States)

    Saha, Priya; Bhattacharjee, Debotosh; De, Barin K.; Nasipuri, Mita

    2016-05-01

    Facial expressions in infrared imaging have been introduced to solve the problem of illumination, which is an integral constituent of visual imagery. The paper investigates facial skin temperature distribution on mixed thermal facial expressions in our created face database, in which six are basic expressions and the remaining 12 are mixtures of those basic expressions. Temperature analysis has been performed on three facial regions of interest (ROIs): periorbital, supraorbital and mouth. Temperature variability of the ROIs in different expressions has been measured using statistical parameters. The temperature variation measurement in ROIs of a particular expression corresponds to a vector, which is later used in recognition of mixed facial expressions. Investigations show that facial features in mixed facial expressions can be characterized by positive emotion induced facial features and negative emotion induced facial features. Supraorbital is a useful facial region that can differentiate basic expressions from mixed expressions. Analysis and interpretation of mixed expressions have been conducted with the help of box and whisker plots. A facial region containing a mixture of two expressions is generally less temperature inducing than the corresponding facial region containing basic expressions.

  6. Attentional bias to affective faces and complex IAPS images in early visual cortex follows emotional cue extraction.

    Science.gov (United States)

    Bekhtereva, Valeria; Craddock, Matt; Müller, Matthias M

    2015-05-15

    Emotionally arousing stimuli are known to rapidly draw the brain's processing resources, even when they are task-irrelevant. The steady-state visual evoked potential (SSVEP) response, a neural response to a flickering stimulus which effectively allows measurement of the processing resources devoted to that stimulus, has been used to examine this process of attentional shifting. Previous studies have used a task in which participants detected periods of coherent motion in flickering random dot kinematograms (RDKs) which generate an SSVEP, and found that task-irrelevant emotional stimuli withdraw more attentional resources from the task-relevant RDKs than task-irrelevant neutral stimuli. However, it is not clear whether the emotion-related differences in the SSVEP response are conditional on higher-level extraction of emotional cues as indexed by well-known event-related potential (ERP) components (N170, early posterior negativity, EPN), or if affective bias in competition for visual attention resources is a consequence of a time-invariant shifting process. In the present study, we used two different types of emotional distractors - IAPS pictures and facial expressions - for which emotional cue extraction occurs at different speeds, being typically earlier for faces (at ~170ms, as indexed by the N170) than for IAPS images (~220-280ms, EPN). We found that emotional modulation of attentional resources as measured by the SSVEP occurred earlier for faces (around 180ms) than for IAPS pictures (around 550ms), after the extraction of emotional cues as indexed by visual ERP components. This is consistent with emotion-related re-allocation of attentional resources occurring after emotional cue extraction rather than being linked to a time-fixed shifting process. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects.

    Science.gov (United States)

    Barber, Anjuli L A; Randi, Dania; Müller, Corsin A; Huber, Ludwig

    2016-01-01

    Of all non-human animals, dogs are very likely the best decoders of human behavior. In addition to a high sensitivity to human attentive status and to ostensive cues, they are able to distinguish between individual human faces and even between human facial expressions. However, so far little is known about how they process human faces and to what extent this is influenced by experience. Here we present an eye-tracking study with dogs from two different living environments with varying experience with humans: pet and lab dogs. The dogs were shown pictures of familiar and unfamiliar human faces expressing four different emotions. The results, extracted from several different eye-tracking measurements, revealed pronounced differences in the face processing of pet and lab dogs, indicating an influence of the amount of exposure to humans. In addition, there was some evidence for an influence of both the familiarity and the emotional expression of the face, and strong evidence for a left gaze bias. These findings, together with recent evidence for the dog's ability to discriminate human facial expressions, indicate that dogs are sensitive to some emotions expressed in human faces.

  8. Alterations in neural processing of emotional faces in adolescent anorexia nervosa patients - an event-related potential study.

    Science.gov (United States)

    Sfärlea, Anca; Greimel, Ellen; Platt, Belinda; Bartling, Jürgen; Schulte-Körne, Gerd; Dieler, Alica C

    2016-09-01

    The present study explored the neurophysiological correlates of perception and recognition of emotional facial expressions in adolescent anorexia nervosa (AN) patients using event-related potentials (ERPs). We included 20 adolescent girls with AN and 24 healthy girls and recorded ERPs during a passive viewing task and three active tasks requiring processing of emotional faces in varying processing depths; one of the tasks also assessed emotion recognition abilities behaviourally. Despite the absence of behavioural differences, we found that across all tasks AN patients exhibited a less pronounced early posterior negativity (EPN) in response to all facial expressions compared to controls. The EPN is an ERP component reflecting an automatic, perceptual processing stage which is modulated by the intrinsic salience of a stimulus. Hence, the less pronounced EPN in anorexic girls suggests that they might perceive other people's faces as less intrinsically relevant, i.e. as less "important" than do healthy girls. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Association of Maternal Interaction with Emotional Regulation in 4 and 9 Month Infants During the Still Face Paradigm

    Science.gov (United States)

    Lowe, Jean R.; MacLean, Peggy C.; Duncan, Andrea F.; Aragón, Crystal; Schrader, Ronald M.; Caprihan, Arvind; Phillips, John P.

    2013-01-01

    This study used the Still Face Paradigm to investigate the influence of maternal interaction on infants' emotion regulation responses. Seventy infant-mother dyads were seen at 4 months and 25 of these same dyads were re-evaluated at 9 months. Maternal interactions were coded for attention seeking and contingent responding. Emotional regulation was described by infant stress reaction and overall positive affect. Results indicated that at both 4 and 9 months mothers who used more contingent responding interactions had infants who showed more positive affect. In contrast, mothers who used more attention seeking play had infants who showed less positive affect after the Still Face Paradigm. Patterns of stress reaction were reversed, as mothers who used more attention seeking play had infants with less negative affect. Implications for intervention and emotional regulation patterns over time are discussed. PMID:22217393

  10. Psychopathic traits are associated with reduced attention to the eyes of emotional faces among adult male non-offenders

    Directory of Open Access Journals (Sweden)

    Steven Mark Gillespie

    2015-10-01

    Psychopathic traits are linked with impairments in emotional facial expression recognition. These impairments may, in part, reflect reduced attention to the eyes of emotional faces. Although reduced attention to the eyes has been noted among children with conduct problems and callous-unemotional traits, similar findings are yet to be found in relation to psychopathic traits among adult male participants. Here we investigated the relationship of primary (selfish, uncaring) and secondary (impulsive, antisocial) psychopathic traits with attention to the eyes among adult male non-offenders during an emotion recognition task. We measured the number of fixations, and overall dwell time, on the eyes and the mouth of male and female faces showing the six basic emotions at varying levels of intensity. We found no relationship of primary or secondary psychopathic traits with recognition accuracy. However, primary psychopathic traits were associated with a reduced number of fixations, and lower overall dwell time, on the eyes relative to the mouth across expressions, intensity, and sex. Furthermore, the relationship of primary psychopathic traits with attention to the eyes of angry and fearful faces was influenced by the sex and intensity of the expression. We also showed that a greater number of fixations on the eyes, relative to the mouth, was associated with increased accuracy for angry and fearful expression recognition. These results are the first to show effects of psychopathic traits on attention to the eyes of emotional faces in an adult male sample, and may support amygdala-based accounts of psychopathy. These findings may also have methodological implications for clinical studies of emotion recognition.

  11. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents

    Directory of Open Access Journals (Sweden)

    Bianca G. van den Bulk

    2016-10-01

    Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research has indicated hyper-responsiveness and sustained activation, instead of habituation, of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N = 25), adolescents with CSA-related PTSD (N = 19) and healthy controls (N = 26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD groups reported more anxiety to fearful and neutral faces than adolescents from the control group, and adolescents from the CSA-related PTSD group reacted more slowly than the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms of emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala.

  12. Response inhibition results in the emotional devaluation of faces: neural correlates as revealed by fMRI.

    Science.gov (United States)

    Doallo, Sonia; Raymond, Jane E; Shapiro, Kimron L; Kiss, Monika; Eimer, Martin; Nobre, Anna C

    2012-08-01

    Although it is well established that prior experience with faces determines their subsequent social-emotional evaluation, recent work shows that top-down inhibitory mechanisms, including response inhibition, can lead to social devaluation after even a single, brief exposure. These rapidly induced effects indicate interplay among perceptual, attentional, response-selection and social-emotional networks; yet, the brain mechanisms underlying this are not well understood. This study used functional magnetic resonance imaging (fMRI) to investigate the neural mechanism mediating the relationship between inhibitory control and emotional devaluation. Participants performed two tasks: (i) a Go/No-Go task in response to faces and (ii) a trustworthiness rating task involving the previously seen faces. No-Go faces were rated as significantly less trustworthy than Go faces. By examining brain activations during Task 1, behavioral measures and brain activations obtained in Task 2 could be predicted. Specifically, activity in brain areas during Task 1 associated with (i) executive control and response suppression (i.e. lateral prefrontal cortex) and (ii) affective responses and value representation (i.e. orbitofrontal cortex), systematically covaried with behavioral ratings and amygdala activity obtained during Task 2. The present findings offer insights into the neural mechanisms linking inhibitory processes to affective responses.

  13. Influence of spatial frequency and emotion expression on face processing in patients with panic disorder.

    Science.gov (United States)

    Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan

    2016-06-01

    Deficits in facial emotion processing are a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad (BSF), high (HSF), and low (LSF) spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at the three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to low spatial frequency compared to high spatial frequency; whereas healthy controls demonstrated significant broad spatial frequency dependent processing in P100 amplitude. Vertex positive potential amplitude was significantly increased in high and broad spatial frequency, compared to low spatial frequency, in panic disorder. Early posterior negativity amplitude was significantly different between HSF and BSF, and between LSF and BSF processing, in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information. However, in later processing, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that unique spatial frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. P2-27: Electrophysiological Correlates of Conscious and Unconscious Processing of Emotional Faces in Individuals with High and Low Autistic Traits

    Directory of Open Access Journals (Sweden)

    Svjetlana Vukusic

    2012-10-01

    LeDoux (1996, The Emotional Brain) has suggested that subconscious presentation of fearful emotional information is relayed to the amygdala along a rapid subcortical route. Rapid emotion processing is important because it alerts other parts of the brain to emotionally salient information. It also produces immediate reflexive responses to threatening stimuli, in comparison to slower conscious appraisal, which is of important adaptive survival value. Current theoretical models of autism spectrum disorders (ASD) have linked impairments in the processing of emotional information to amygdala dysfunction. It can be suggested that the impairment in face processing found in autism may be the result of impaired rapid subconscious processing of emotional information, which does not make faces socially salient. Previous studies examined subconscious processing of emotional stimuli with backward masking paradigms, using very brief presentations of emotional face stimuli followed by a mask. We used an event-related potential (ERP) study within a backward masking paradigm with subjects with low and high autistic tendencies as measured by the Autism Spectrum Quotient (AQ) questionnaire. The time course of processing of fearful and happy facial expressions and an emotionally neutral face was investigated during subliminal (16 ms) and supraliminal (166 ms) stimulus presentation. The task consisted of an explicit categorization of emotional and neutral faces. We looked at ERP components N2, P3a, and also N170 for differences between subjects with low (<19) and high (>19) AQ.

  15. Affective resonance in response to others' emotional faces varies with affective ratings and psychopathic traits in amygdala and anterior insula.

    Science.gov (United States)

    Seara-Cardoso, Ana; Sebastian, Catherine L; Viding, Essi; Roiser, Jonathan P

    2016-01-01

    Despite extensive research on the neural basis of empathic responses for pain and disgust, there are limited data about the brain regions that underpin affective response to other people's emotional facial expressions. Here, we addressed this question using event-related functional magnetic resonance imaging to assess neural responses to emotional faces, combined with online ratings of subjective state. When instructed to rate their own affective response to others' faces, participants recruited anterior insula, dorsal anterior cingulate, inferior frontal gyrus, and amygdala, regions consistently implicated in studies investigating empathy for disgust and pain, as well as emotional saliency. Importantly, responses in anterior insula and amygdala were modulated by trial-by-trial variations in subjective affective responses to the emotional facial stimuli. Furthermore, overall task-elicited activations in these regions were negatively associated with psychopathic personality traits, which are characterized by low affective empathy. Our findings suggest that anterior insula and amygdala play important roles in the generation of affective internal states in response to others' emotional cues and that attenuated function in these regions may underlie reduced empathy in individuals with high levels of psychopathic traits.

  16. Wide Eyes and Drooping Arms: Adult-Like Congruency Effects Emerge Early in the Development of Sensitivity to Emotional Faces and Body Postures

    Science.gov (United States)

    Mondloch, Catherine J.; Horner, Matthew; Mian, Jasmine

    2013-01-01

    Adults' and 8-year-old children's perception of emotional faces is disrupted when faces are presented in the context of incongruent body postures (e.g., when a sad face is displayed on a fearful body) if the two emotions are highly similar (e.g., sad/fear) but not if they are highly dissimilar (e.g., sad/happy). The current research investigated…

  17. Emotional responses during repeated sprint intervals performed on level, downhill and uphill surfaces.

    Science.gov (United States)

    Baron, Bertrand; Guilloux, Bertrand; Begue, Mylène; Uriac, Stéphane

    2015-01-01

    The purpose of this study was to test emotional responses during sprint intervals performed on level, downhill and uphill surfaces. Fifty trained participants performed a maximal effort during a 60-m sprint and 10 repetitions of 60-m running sprints on level, downhill and uphill surfaces with a 5.9% slope. Running speeds, emotional responses and heart rate were measured. Self-selected speeds were correlated with the rating of perceived exertion, the affective balance, the desire to stop and the resources needed for the task in all conditions, whereas the pleasure, the desire to continue and the capacity to realise the task were correlated with speeds only during level and uphill running. Mean values of emotional parameters were significantly different (P < 0.05) during running on a flat surface, downhill and uphill. When the gradient of the running surface was changed, the pattern of emotional responses was merely translated: most of the slopes relating the evolution of emotional parameters to the repetitions were not significantly different, whereas the Y-intercepts were. Consented effort is highly correlated with the resources needed for the task (P < 0.001, r² = 0.72). We propose that the difference in the resources needed for the task between level, downhill and uphill running (F(2, 1499) = 166.5, P < 0.001, η² = 0.18) is the most important factor explaining our results.
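
The effect size reported above can be recovered from the F statistic alone; a minimal sketch (the conversion η² = F·df1 / (F·df1 + df2) is the standard one for a single effect, and it reproduces the reported value):

```python
def eta_squared(f: float, df1: int, df2: int) -> float:
    """Partial eta-squared recovered from an F statistic:
    eta2 = (F * df1) / (F * df1 + df2)."""
    return (f * df1) / (f * df1 + df2)

# Values reported in the abstract: F(2, 1499) = 166.5 with eta^2 = 0.18.
print(round(eta_squared(166.5, 2, 1499), 2))  # → 0.18
```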

  18. Infants’ Temperament and Mothers’, and Fathers’ Depression Predict Infants’ Attention to Objects Paired with Emotional Faces

    NARCIS (Netherlands)

    E. Aktar; D.J. Mandell; W. de Vente; M. Majdandžić; M.E.J. Raijmakers; S.M. Bögels

    2015-01-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others’ emotional reactions to those stimuli, so-called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction…

  19. Faces of product pleasure: 25 positive emotions in human-product interactions

    NARCIS (Netherlands)

    Desmet, P.M.A.

    2012-01-01

    The study of user emotions is hindered by the absence of a clear overview of what positive emotions can be experienced in human-product interactions. Existing typologies are either too concise or too comprehensive, including fewer than five or hundreds of positive emotions, respectively. To overcome this…

  20. 3D Face Model Dataset: Automatic Detection of Facial Expressions and Emotions for Educational Environments

    Science.gov (United States)

    Chickerur, Satyadhyan; Joshi, Kartik

    2015-01-01

    Emotion detection using facial images is a technique that researchers have been using for the last two decades to try to analyze a person's emotional state given his/her image. Detection of various kinds of emotion using facial expressions of students in educational environment is useful in providing insight into the effectiveness of tutoring…

  1. Preschooler's Faces in Spontaneous Emotional Contexts--How Well Do They Match Adult Facial Expression Prototypes?

    Science.gov (United States)

    Gaspar, Augusta; Esteves, Francisco G.

    2012-01-01

    Prototypical facial expressions of emotion, also known as universal facial expressions, are the underpinnings of most research concerning recognition of emotions in both adults and children. Data on natural occurrences of these prototypes in natural emotional contexts are rare and difficult to obtain in adults. By recording naturalistic…

  2. Infants’ Temperament and Mothers’, and Fathers’ Depression Predict Infants’ Attention to Objects Paired with Emotional Faces

    NARCIS (Netherlands)

    Aktar, E.; Mandell, D.J.; de Vente, W.; Majdandžić, M.; Raijmakers, M.E.J.; Bögels, S.M.

    2016-01-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others’ emotional reactions to those stimuli, so-called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction…

  4. Cortisol-induced enhancement of emotional face processing in social phobia depends on symptom severity and motivational context.

    Science.gov (United States)

    van Peer, Jacobien M; Spinhoven, Philip; van Dijk, J Gert; Roelofs, Karin

    2009-05-01

    We investigated the effects of cortisol administration on approach and avoidance tendencies in 20 patients with social anxiety disorder (SAD). Event-related brain potentials (ERPs) were measured during a reaction time task, in which patients evaluated the emotional expression of photographs of happy and angry faces by making an approaching (flexion) or avoiding (extension) arm movement. Patients showed significant avoidance tendencies for angry but not for happy faces, both in the placebo and cortisol condition. Moreover, ERP analyses showed a significant interaction of condition by severity of social anxiety on early positive (P150) amplitudes during avoidance compared to approach, indicating that cortisol increases early processing of social stimuli (in particular angry faces) during avoidance. This result replicates previous findings from a non-clinical sample of high anxious individuals and demonstrates their relevance for clinical SAD. Apparently the cortisol-induced increase in processing of angry faces in SAD depends on symptom severity and motivational context.

  5. Early processing of emotional faces in a Go/NoGo task: lack of N170 right-hemispheric specialisation in children with major depression.

    Science.gov (United States)

    Grunewald, Madlen; Stadelmann, Stephanie; Brandeis, Daniel; Jaeger, Sonia; Matuschek, Tina; Weis, Steffi; Kalex, Virgenie; Hiemisch, Andreas; von Klitzing, Kai; Döhnert, Mirko

    2015-09-01

    Emotionally biased information processing towards sad and away from happy information characterises individuals with major depression. To learn more about the nature of these dysfunctional modulations, developmental and neural aspects of emotional face processing have to be considered. By combining measures of performance (attention control, inhibition) in an emotional Go/NoGo task with an event-related potential (ERP) of early face processing (N170), we obtained a multifaceted picture of emotional face processing in a sample of children and adolescents (11-14 years) with major depression (MDD, n = 26) and healthy controls (CTRL, n = 26). Subjects had to respond to emotional faces (fearful, happy or sad) and withhold their response to calm faces or vice versa. Children of the MDD group displayed shorter N170 latencies than children of the CTRL group. Typical right lateralisation of the N170 was observed for all faces in the CTRL but not for happy and calm faces in the MDD group. However, the MDD group did not differ in their behavioural reaction to emotional faces, and effects of interference by emotional information on the reaction to calm faces in this group were notably mild. Although we could not find a typical pattern of emotional bias, the results suggest that alterations in face processing of children with major depression can be seen at early stages of face perception indexed by the N170. The findings call for longitudinal examinations considering effects of development in children with major depression as well as associations to later stages of processing.

  6. Emotion recognition through static faces and moving bodies: a comparison between typically-developed adults and individuals with high level of autistic traits

    Directory of Open Access Journals (Sweden)

    Rossana Actis-Grosso

    2015-10-01

    Full Text Available We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits (HAT group) or with High Functioning Autism Spectrum Disorder was compared in the recognition of four emotions (Happiness, Anger, Fear and Sadness) either shown in static faces or conveyed by moving bodies (patch-light displays, PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to recognize: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better from static faces and fear from PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, and (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  8. The changing face of P300 BCIs: a comparison of stimulus changes in a P300 BCI involving faces, emotion, and movement.

    Directory of Open Access Journals (Sweden)

    Jing Jin

    Full Text Available BACKGROUND: One of the most common types of brain-computer interfaces (BCIs) is called a P300 BCI, since it relies on the P300 and other event-related potentials (ERPs). In the canonical P300 BCI approach, items on a monitor flash briefly to elicit the necessary ERPs. Very recent work has shown that this approach may yield lower performance than alternate paradigms in which the items do not flash but instead change in other ways, such as moving, changing colour or changing to characters overlaid with faces. METHODOLOGY/PRINCIPAL FINDINGS: The present study sought to extend this research direction by parametrically comparing different ways to change items in a P300 BCI. Healthy subjects used a P300 BCI across six different conditions. Three conditions were similar to our prior work, providing the first direct comparison of characters flashing, moving, and changing to faces. Three new conditions also explored facial motion and emotional expression. The six conditions were compared across objective measures such as classification accuracy and bit rate as well as subjective measures such as perceived difficulty. In line with recent studies, our results indicated that the character flash condition resulted in the lowest accuracy and bit rate. All four face conditions (mean accuracy > 91%) yielded significantly better performance than the flash condition (mean accuracy = 75%). CONCLUSIONS/SIGNIFICANCE: Objective results reaffirmed that the face paradigm is superior to the canonical flash approach that has dominated P300 BCIs for over 20 years. The subjective reports indicated that the conditions that yielded better performance were not considered especially burdensome. Therefore, although further work is needed to identify which face paradigm is best, it is clear that the canonical flash approach should be replaced with a face paradigm when aiming at increasing bit rate. However, the face paradigm has to be further explored with practical applications.
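
Bit rate in P300 BCI work is conventionally computed per selection with the Wolpaw formula, from the number of selectable items N and the classification accuracy P; a minimal sketch (the 36-item speller matrix used below is an assumption for illustration, not stated in the abstract):

```python
import math

def wolpaw_bits_per_selection(n: int, p: float) -> float:
    """Wolpaw bit rate: information transferred per selection, given
    n possible targets and classification accuracy p (0..1)."""
    if p >= 1.0:
        return math.log2(n)        # perfect accuracy: full log2(n) bits
    if p <= 1.0 / n:
        return 0.0                 # at or below chance: no information
    return (math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))

# Accuracies from the abstract (face conditions vs. flash condition),
# under the assumed 6x6 (36-item) speller:
for p in (0.91, 0.75):
    print(f"p={p}: {wolpaw_bits_per_selection(36, p):.2f} bits/selection")
```

Multiplying bits per selection by selections per minute gives the bit rate in bits/min that BCI papers typically report.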

  9. Gamma activation in young people with autism spectrum disorders and typically-developing controls when viewing emotions on faces.

    Directory of Open Access Journals (Sweden)

    Barry Wright

    Full Text Available BACKGROUND: Behavioural studies have highlighted irregularities in recognition of facial affect in children and young people with autism spectrum disorders (ASDs). Recent findings from studies utilising electroencephalography (EEG) and magnetoencephalography (MEG) have identified abnormal activation and irregular maintenance of gamma-range (>30 Hz) oscillations when ASD individuals attempt basic visual and auditory tasks. METHODOLOGY/PRINCIPAL FINDINGS: The pilot study reported here is the first to use spatial filtering techniques in MEG to explore face processing in children with ASD. We set out to examine theoretical suggestions that gamma activation underlying face processing may be different in a group of children and young people with ASD (n = 13) compared to typically developing (TD) age-, gender- and IQ-matched controls. Beamforming and virtual electrode techniques were used to assess spatially localised induced and evoked activity. While lower-band (3-30 Hz) responses to faces were similar between groups, the ASD gamma response in occipital areas was largely absent when viewing emotions on faces. Virtual electrode analysis indicated the presence of intact evoked responses but abnormal induced activity in ASD participants. CONCLUSIONS/SIGNIFICANCE: These findings lend weight to previous suggestions that specific components of the early visual response to emotional faces are abnormal in ASD. Elucidation of the nature and specificity of these findings is worthy of further research.

  10. Development of cerebral lateralisation for recognition of emotions in chimeric faces in children aged 5 to 11.

    Science.gov (United States)

    Workman, Lance; Chilvers, Louise; Yeomans, Heather; Taylor, Sandie

    2006-11-01

    In contrast to research into the development of language laterality, there has been relatively little research into the development of lateralisation of emotional processing. If language lateralisation begins in childhood and is complete by puberty (Lenneberg, 1967) it seems reasonable that the lateralisation of the perception of emotions might also occur during this period. In this study a split field chimeric faces test using the six universal facial expressions proposed by Ekman and Friesen (1971), an emotion in the eyes test, and a situational cartoon test were administered to three groups of children aged 5/6, 7/8, and 10/11. No overall hemispace advantage was seen for the 5/6-year-old group, but by the age of 10/11 a clear left hemispace advantage (right hemisphere) was found for all six emotions. Such a pattern is comparable to a previous study that made use of adults on this task (Workman, Peters, & Taylor, 2000b). Moreover, a significant positive correlation between a child's ability to recognise emotions in cartoon situations and their left hemispatial advantage score was uncovered. Finally, a significant positive correlation between a child's ability to recognise emotions in the eyes of others and their left hemispatial advantage score was also uncovered. These findings are taken as evidence that there may be a relationship between the development of emotional processing in the right hemisphere and a child's emerging ability to perceive or attend to the emotional states of others. Results are discussed in relation to the child's development of a theory of mind.

  11. The ties to unbind: Age-related differences in feature (un)binding in working memory for emotional faces

    Directory of Open Access Journals (Sweden)

    Didem Pehlivanoglu

    2014-04-01

    Full Text Available In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyperbinding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face’s emotional expression (expression condition), or both identity and expression of the face (binding condition). Both age groups performed more slowly and with lower accuracy in the expression condition than in the binding condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory, over and beyond age-related differences observed in perceptual processing (0-Back) and attention/short-term memory (1-Back). Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task, and were further magnified in the 2-Back task, indicating independent age-related differences in attention/short-term memory and working memory. Pupil dilation data confirmed that the attention/short-term memory version of the task (1-Back) is more effortful for older adults than for younger adults.
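
The three match/mismatch rules of the N-Back design described above can be sketched as follows; this is a generic illustration (the stimulus representation and function names are assumptions, not the authors' materials):

```python
from typing import List, Tuple

Face = Tuple[str, str]  # (identity, expression)

def nback_matches(seq: List[Face], n: int, condition: str) -> List[bool]:
    """For each trial from index n onward, decide whether the current face
    'matches' the one shown n trials earlier, under one of three rules:
    'identity', 'expression', or 'binding' (both must match)."""
    out = []
    for i in range(n, len(seq)):
        cur, prev = seq[i], seq[i - n]
        if condition == "identity":
            out.append(cur[0] == prev[0])
        elif condition == "expression":
            out.append(cur[1] == prev[1])
        else:  # binding: identity AND expression must both match
            out.append(cur == prev)
    return out

faces = [("A", "happy"), ("B", "angry"), ("A", "angry"), ("B", "angry")]
print(nback_matches(faces, 2, "identity"))  # [True, True]
print(nback_matches(faces, 2, "binding"))   # [False, True]
```

The third trial illustrates the unbinding demand: it matches the first trial on identity but not on expression, so the correct answer differs across conditions.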

  12. The influence of variations in eating disorder-related symptoms on processing of emotional faces in a non-clinical female sample: An eye-tracking study.

    Science.gov (United States)

    Sharpe, Emma; Wallis, Deborah J; Ridout, Nathan

    2016-06-30

    This study aimed to: (i) determine if the attention bias towards angry faces reported in eating disorders generalises to a non-clinical sample varying in eating disorder-related symptoms; (ii) examine if the bias occurs during initial orientation or later strategic processing; and (iii) confirm previous findings of impaired facial emotion recognition in non-clinical disordered eating. Fifty-two females viewed a series of face-pairs (happy or angry paired with neutral) whilst their attentional deployment was continuously monitored using an eye-tracker. They subsequently identified the emotion portrayed in a separate series of faces. The highest (n=18) and lowest scorers (n=17) on the Eating Disorders Inventory (EDI) were compared on the attention and facial emotion recognition tasks. Those with relatively high scores exhibited impaired facial emotion recognition, confirming previous findings in similar non-clinical samples. They also displayed biased attention away from emotional faces during later strategic processing, which is consistent with previously observed impairments in clinical samples. These differences were related to drive-for-thinness. Although we found no evidence of a bias towards angry faces, it is plausible that the observed impairments in emotion recognition and avoidance of emotional faces could disrupt social functioning and act as a risk factor for the development of eating disorders.

  13. Effects of repeated retrieval of central and peripheral details in complex emotional slides

    NARCIS (Netherlands)

    Hauer, Beatrijs J. A.; Wessel, Ineke; Merckelbach, Harald; Roefs, Anne; Dalgleish, Tim

    2007-01-01

    Research has demonstrated that repeated retrieval enhances memory for practised verbal information, but undermines correct recall of unpractised related verbal information, a phenomenon known as retrieval-induced forgetting (RIF). This paper addresses the question of what happens with memory for…

  14. Double attention bias for positive and negative emotional faces in clinical depression: evidence from an eye-tracking study.

    Science.gov (United States)

    Duque, Almudena; Vázquez, Carmelo

    2015-03-01

    According to cognitive models, attentional biases in depression play key roles in the onset and subsequent maintenance of the disorder. The present study examines the processing of emotional facial expressions (happy, angry, and sad) in depressed and non-depressed adults. Sixteen unmedicated patients with Major Depressive Disorder (MDD) and 34 never-depressed controls (ND) completed an eye-tracking task to assess different components of visual attention (orienting attention and maintenance of attention) in the processing of emotional faces. Compared to ND, participants with MDD showed a negative attentional bias in attentional maintenance indices (i.e. first fixation duration and total fixation time) for sad faces. This attentional bias was positively associated with the severity of depressive symptoms. Furthermore, the MDD group spent a marginally less amount of time viewing happy faces compared with the ND group. No differences were found between the groups with respect to angry faces and orienting attention indices. The current study is limited by its cross-sectional design. These results support the notion that attentional biases in depression are specific to depression-related information and that they operate in later stages in the deployment of attention. Copyright © 2014 Elsevier Ltd. All rights reserved.
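
The two maintenance-of-attention indices named above, first fixation duration and total fixation time, can be computed from a per-trial fixation log; a minimal sketch (the list-of-fixations input format is an assumption, not the authors' eye-tracker pipeline):

```python
from typing import List, Tuple

# Each fixation: (area_of_interest, duration_ms), in temporal order.
Fixation = Tuple[str, float]

def first_fixation_duration(fixations: List[Fixation], aoi: str) -> float:
    """Duration of the first fixation that landed on the given AOI (0 if none)."""
    for area, dur in fixations:
        if area == aoi:
            return dur
    return 0.0

def total_fixation_time(fixations: List[Fixation], aoi: str) -> float:
    """Summed duration of all fixations on the given AOI."""
    return sum(dur for area, dur in fixations if area == aoi)

trial = [("neutral", 180.0), ("sad", 220.0), ("neutral", 150.0), ("sad", 300.0)]
print(first_fixation_duration(trial, "sad"))  # → 220.0
print(total_fixation_time(trial, "sad"))      # → 520.0
```

Averaging these per-trial values across trials, and comparing emotional against neutral areas of interest, yields the kind of maintenance indices the study reports.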

  15. Event-related brain responses to emotional words, pictures, and faces - a cross-domain comparison.

    Science.gov (United States)

    Bayer, Mareike; Schacht, Annekathrin

    2014-01-01

    Emotion effects in event-related brain potentials (ERPs) have previously been reported for a range of visual stimuli, including emotional words, pictures, and facial expressions. Still, little is known about the actual comparability of emotion effects across these stimulus classes. The present study aimed to fill this gap by investigating emotion effects in response to words, pictures, and facial expressions using a blocked within-subject design. Furthermore, ratings of stimulus arousal and valence were collected from an independent sample of participants. Modulations of early posterior negativity (EPN) and late positive complex (LPC) were visible for all stimulus domains, but showed clear differences, particularly in valence processing. While emotion effects were limited to positive stimuli for words, they were predominant for negative stimuli in pictures and facial expressions. These findings corroborate the notion of a positivity offset for words and a negativity bias for pictures and facial expressions, which was assumed to be caused by generally lower arousal levels of written language. Interestingly, however, these assumed differences were not confirmed by arousal ratings. Instead, words were rated as overall more positive than pictures and facial expressions. Taken together, the present results point toward systematic differences in the processing of written words and pictorial stimuli of emotional content, not only in terms of a valence bias evident in ERPs, but also concerning their emotional evaluation captured by ratings of stimulus valence and arousal.

  16. A threatening face in the crowd: effects of emotional singletons on visual working memory.

    Science.gov (United States)

    Thomas, Paul M J; Jackson, Margaret C; Raymond, Jane E

    2014-02-01

    Faces with threatening versus positive expressions are better remembered in visual working memory (WM) and are especially effective at capturing attention. We asked how the presence of a single threatening or happy face affects WM for concurrently viewed faces with neutral expressions. If threat captures attention and attention determines WM, then a WM performance cost for neutral faces should be evident. However, if threat boosts processing in an object-specific, noncompetitive manner, then no such costs should be produced. Participants viewed three neutral and one angry or happy face for 2 s. Face recognition was tested 1 s later. Although WM was better for singletons than nonsingletons and better for angry versus happy singletons, WM for neutral faces remained unaffected by either singleton. These results, combined with eye movement and response time analyses, argue against a selective attention account of threat-based benefits to WM and support object-specific enhancement via threat processing.

  17. Neural Reactivity to Emotional Faces May Mediate the Relationship Between Childhood Empathy and Adolescent Prosocial Behavior.

    Science.gov (United States)

    Flournoy, John C; Pfeifer, Jennifer H; Moore, William E; Tackman, Allison M; Masten, Carrie L; Mazziotta, John C; Iacoboni, Marco; Dapretto, Mirella

    2016-11-01

    Reactivity to others' emotions not only can result in empathic concern (EC), an important motivator of prosocial behavior, but can also result in personal distress (PD), which may hinder prosocial behavior. Examining neural substrates of emotional reactivity may elucidate how EC and PD differentially influence prosocial behavior. Participants (N = 57) provided measures of EC, PD, prosocial behavior, and neural responses to emotional expressions at ages 10 and 13. Initial EC predicted subsequent prosocial behavior. Initial EC and PD predicted subsequent reactivity to emotions in the inferior frontal gyrus (IFG) and inferior parietal lobule, respectively. Activity in the IFG, a region linked to mirror neuron processes, as well as cognitive control and language, mediated the relation between initial EC and subsequent prosocial behavior. © 2016 The Authors. Child Development © 2016 Society for Research in Child Development, Inc.

  18. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions

    OpenAIRE

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have been conducted to investigate the explicit and implicit processes of peak emotions. In the current study, we used transiently peak intense expression images of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to…

  19. Emotion recognition from facial expressions: a normative study of the Ekman 60-Faces Test in the Italian population.

    Science.gov (United States)

    Dodich, Alessandra; Cerami, Chiara; Canessa, Nicola; Crespi, Chiara; Marcone, Alessandra; Arpone, Marta; Realmuto, Sabrina; Cappa, Stefano F

    2014-07-01

    The Ekman 60-Faces (EK-60F) Test is a well-known neuropsychological tool assessing emotion recognition from facial expressions. It is the most employed task for research purposes in psychiatric and neurological disorders, including neurodegenerative diseases, such as the behavioral variant of Frontotemporal Dementia (bvFTD). Despite its remarkable usefulness in the social cognition research field, to date, there are still no normative data for the Italian population, thus limiting its application in a clinical context. In this study, we report procedures and normative data for the Italian version of the test. A hundred and thirty-two healthy Italian participants aged between 20 and 79 years with at least 5 years of education were recruited on a voluntary basis. They were administered the EK-60F Test from the Ekman and Friesen series of Pictures of Facial Affect after a preliminary semantic recognition test of the six basic emotions (i.e., anger, fear, sadness, happiness, disgust, surprise). Data were analyzed according to the Capitani procedure [1]. The regression analysis revealed significant effects of demographic variables, with younger, more educated, female subjects showing higher scores. Normative data were then applied to a sample of 15 bvFTD patients which showed global impaired performance in the task, consistently with the clinical condition. We provided EK-60F Test normative data for the Italian population allowing the investigation of global emotion recognition ability as well as selective impairment of basic emotions recognition, both for clinical and research purposes.

  20. Mother-infant mutual eye gaze supports emotion regulation in infancy during the Still-Face paradigm.

    Science.gov (United States)

    MacLean, Peggy C; Rynes, Kristina N; Aragón, Crystal; Caprihan, Arvind; Phillips, John P; Lowe, Jean R

    2014-11-01

    This study was designed to examine the sequential relationship between mother-infant synchrony and infant affect during the Still-Face paradigm using multilevel modeling. We also examined the self-regulatory behaviors infants use during the Still-Face paradigm to modulate their affect, particularly during stressors in which their mothers are not available to help them co-regulate. The sample comprised 84 mother-infant dyads with healthy, full-term, 4-month-old infants. Infant self-regulation and infant affect were coded second by second, in addition to mother-infant mutual eye gaze. Using multilevel modeling, we found that infant affect became more positive when mutual gaze had occurred in the previous second, suggesting that the experience of synchronicity was associated with observable shifts in affect. We also found a positive association between self-regulatory behaviors and increases in positive affect only during the Still-Face episode (episode 2). Our study provides support for the role of mother-infant synchronicity in emotion regulation, as well as for the role of self-regulatory behaviors in emotion regulation, which can have important implications for intervention.
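    The sequential (lagged) analysis described above can be sketched with toy data: a lag-1 mutual-gaze predictor is built per dyad, and its effect on next-second affect is estimated. A within-dyad OLS estimator stands in for the paper's multilevel model; all data and the 0.5 effect size are simulated for illustration, not the study's values.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Simulate second-by-second codes for a few dyads (toy data).
rows = []
for dyad in range(6):
    intercept = rng.normal(0, 0.5)          # dyad-level random intercept
    gaze = rng.integers(0, 2, size=200)     # 1 = mutual gaze in that second
    affect = np.zeros(200)
    for t in range(1, 200):
        # Affect becomes more positive one second after mutual gaze.
        affect[t] = intercept + 0.5 * gaze[t - 1] + rng.normal(0, 0.3)
    rows.append(pd.DataFrame({"dyad": dyad, "t": np.arange(200),
                              "gaze": gaze, "affect": affect}))
df = pd.concat(rows, ignore_index=True)

# Lag-1 predictor: was there mutual gaze in the *previous* second?
df["gaze_lag1"] = df.groupby("dyad")["gaze"].shift(1)
df = df.dropna(subset=["gaze_lag1"])

# Within-dyad (demeaned) OLS absorbs the dyad-level intercepts.
y = df["affect"] - df.groupby("dyad")["affect"].transform("mean")
x = df["gaze_lag1"] - df.groupby("dyad")["gaze_lag1"].transform("mean")
slope = float(np.sum(x * y) / np.sum(x * x))
print(f"estimated lag-1 gaze effect on affect: {slope:.2f}")
```

    A full multilevel model would additionally allow the lagged-gaze slope to vary across dyads; the within-dyad estimator here only recovers the average sequential effect.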

  1. Amygdala activation during emotional face processing in adolescents with affective disorders: the role of underlying depression and anxiety symptoms

    Science.gov (United States)

    van den Bulk, Bianca G.; Meens, Paul H. F.; van Lang, Natasja D. J.; de Voogd, E. L.; van der Wee, Nic J. A.; Rombouts, Serge A. R. B.; Crone, Eveline A.; Vermeiren, Robert R. J. M.

    2014-01-01

    Depressive and anxiety disorders are often first diagnosed during adolescence and are known to persist into adulthood. Previous studies have often tried to dissociate depressive and anxiety disorders, but high comorbidity makes this difficult and perhaps even impossible. The goal of this study was to use neuroimaging to test the unique contributions of depression and anxiety symptomatology to emotional processing and amygdala activation, and to compare the results with a healthy control group. We included 25 adolescents with depressive and/or anxiety disorders and 26 healthy adolescents. Participants performed an emotional face processing task while in the MRI scanner. We were particularly interested in the relation between depression/anxiety symptomatology and patterns of amygdala activation. There were no significant differences in activation patterns between the control group and the clinical group at the whole-brain or ROI level. However, we found that dimensional scores on an anxiety, but not a depression, subscale significantly predicted brain activation in the right amygdala when processing fearful, happy and neutral faces. These results suggest that anxiety symptoms are a better predictor than depression symptoms for differentiating activation patterns in the amygdala. Although the current study includes a relatively large sample of treatment-naïve adolescents with depression/anxiety disorders, results might be influenced by differences between studies in recruitment strategies or methodology. Future research should include larger samples with a more equal distribution of adolescents with a clinical diagnosis of depression and/or anxiety. To conclude, this study shows that abnormal amygdala responses to emotional faces in depression and anxiety seem to depend more on anxiety symptoms than on depression symptoms, highlighting the need to better characterize clinical groups in future studies. PMID:24926249

  2. Amygdala activation during emotional face processing in adolescents with affective disorders: the role of underlying depression and anxiety symptoms.

    Directory of Open Access Journals (Sweden)

    Bianca G Van Den Bulk

    2014-06-01

    Depressive and anxiety disorders are often first diagnosed during adolescence and are known to persist into adulthood. Previous studies have often tried to dissociate depressive and anxiety disorders, but high comorbidity makes this difficult and perhaps even impossible. The goal of this study was to use neuroimaging to test the unique contributions of depression and anxiety symptomatology to emotional processing and amygdala activation, and to compare the results with a healthy control group. We included 25 adolescents with depressive and/or anxiety disorders and 26 healthy adolescents. Participants performed an emotional face processing task while in the MRI scanner. We were particularly interested in the relation between depression/anxiety symptomatology and patterns of amygdala activation. There were no significant differences in activation patterns between the control group and the clinical group at the whole-brain or ROI level. However, we found that dimensional scores on an anxiety, but not a depression, subscale significantly predicted brain activation in the right amygdala when processing fearful, happy and neutral faces. These results suggest that anxiety symptoms are a better predictor than depression symptoms for differentiating activation patterns in the amygdala. Although the current study includes a relatively large sample of treatment-naïve adolescents with depression/anxiety disorders, results might be influenced by differences between studies in recruitment strategies or methodology. Future research should include larger samples with a more equal distribution of adolescents with a clinical diagnosis of depression and/or anxiety. To conclude, this study shows that abnormal amygdala responses to emotional faces in depression and anxiety seem to depend more on anxiety symptoms than on depression symptoms, highlighting the need to better characterize clinical groups in future studies.

  3. Anxiety attenuates awareness of emotional faces during rapid serial visual presentation.

    Science.gov (United States)

    Van Dam, Nicholas T; Earleywine, Mitch; Altarriba, Jeanette

    2012-08-01

    Anxiety limits normative perception and impacts the interaction between key neurophysiological systems, possibly by decreasing recruitment of goal-oriented processes and increasing recruitment of stimulus-driven processes. Previous studies examining the impact of anxiety on emotion processing commonly lack a sample with levels of anxiety comparable to a clinical population. Many also fail to control for co-occurring symptoms like depression. The current study used rapid serial visual presentation (RSVP) with two emotional targets, comparing healthy controls to a group of individuals with symptom levels comparable to anxiety disorder patients. The results showed a modulatory effect of anxiety; the high anxiety (HA) group showed an enhanced impairment in detecting the second of two emotional targets relative to the low anxiety (LA) control group. Though there were no group-specific effects on the attentional blink or repetition blindness, there was a significant interaction of group with first and second target valence. Notably, HA individuals showed deficits (where LA individuals showed benefits) when the same emotion was presented twice. Further, when the first target was neutral, second target detection was especially impaired in HA individuals. Correlational analyses confirmed perceptual limitations were related to anxiety, but not depression, positive affect, or negative affect. The current results, along with past findings, suggest that clinical anxiety leads to deficits in overall cognitive control, increased difficulty inhibiting attention to distractors, and impairments in rapid, intuitive emotion processing.

  4. The role of emotion in dynamic audiovisual integration of faces and voices.

    Science.gov (United States)

    Kokinous, Jenny; Kotz, Sonja A; Tavano, Alessandro; Schröger, Erich

    2015-05-01

    We used human electroencephalogram to study early audiovisual integration of dynamic angry and neutral expressions. An auditory-only condition served as a baseline for the interpretation of integration effects. In the audiovisual conditions, the validity of visual information was manipulated using facial expressions that were either emotionally congruent or incongruent with the vocal expressions. First, we report an N1 suppression effect for angry compared with neutral vocalizations in the auditory-only condition. Second, we confirm early integration of congruent visual and auditory information as indexed by a suppression of the auditory N1 and P2 components in the audiovisual compared with the auditory-only condition. Third, audiovisual N1 suppression was modulated by audiovisual congruency in interaction with emotion: for neutral vocalizations, there was N1 suppression in both the congruent and the incongruent audiovisual conditions. For angry vocalizations, there was N1 suppression only in the congruent but not in the incongruent condition. Extending previous findings of dynamic audiovisual integration, the current results suggest that audiovisual N1 suppression is congruency- and emotion-specific and indicate that dynamic emotional expressions compared with non-emotional expressions are preferentially processed in early audiovisual integration.

  5. The Cambridge Mindreading Face-Voice Battery for Children (CAM-C): complex emotion recognition in children with and without autism spectrum conditions.

    Science.gov (United States)

    Golan, Ofer; Sinai-Gavrilov, Yana; Baron-Cohen, Simon

    2015-01-01

    Difficulties in recognizing emotions and mental states are central characteristics of autism spectrum conditions (ASC). However, emotion recognition (ER) studies have focused mostly on recognition of the six 'basic' emotions, usually using still pictures of faces. This study describes a new battery of tasks for testing recognition of nine complex emotions and mental states from video clips of faces and from voice recordings taken from the Mindreading DVD. This battery (the Cambridge Mindreading Face-Voice Battery for Children or CAM-C) was given to 30 high-functioning children with ASC, aged 8 to 11, and to 25 matched controls. The ASC group scored significantly lower than controls on complex ER from faces and voices. In particular, participants with ASC had difficulty with six out of nine complex emotions. Age was positively correlated with all task scores, and verbal IQ was correlated with scores in the voice task. CAM-C scores were negatively correlated with parent-reported level of autism spectrum symptoms. Children with ASC show deficits in recognition of complex emotions and mental states from both facial and vocal expressions. The CAM-C may be a useful test for endophenotypic studies of ASC and is one of the first to use dynamic stimuli as an assay to reveal the ER profile in ASC. It complements the adult version of the CAM Face-Voice Battery, thus providing opportunities for developmental assessment of social cognition in autism.

  6. Implicit racial attitudes influence perceived emotional intensity on other-race faces.

    Directory of Open Access Journals (Sweden)

    Qiandong Wang

    An ability to accurately perceive and evaluate out-group members' emotions plays a critical role in intergroup interactions. Here we show that Chinese participants' implicit attitudes toward White people bias their perception and judgment of the emotional intensity of White people's facial expressions of anger, fear and sadness. We found that Chinese participants held pro-Chinese/anti-White implicit biases, as assessed with an evaluative implicit association test (IAT). Moreover, their implicit biases positively predicted the perceived intensity of White people's angry, fearful and sad facial expressions, but not of happy expressions. This study demonstrates that implicit racial attitudes can influence the perception and judgment of a range of emotional expressions. Implications for intergroup interactions are discussed.

  7. The interaction between emotional concept processing and emotional face perception

    Institute of Scientific and Technical Information of China (English)

    刘文娟; 沈曼琼; 李莹; 王瑞明

    2016-01-01

    In our experiments, we used emotional concepts and emotional faces to address this question. All experiments were carried out using E-Prime 1.0. In Experiment 1, emotional faces were presented to the participants, who were told to judge the valence of the faces. Participants also judged target emotional faces that followed emotional words. The results showed an interaction between emotional concept processing and emotional face perception at the deep semantic level, and the relationship between the two was symmetric. To further explore the influence of semantic depth, in Experiment 2 we changed the valence judgment task to a pseudo-word judgment task, so as to ensure that readers would engage only in shallow semantic processing. We then found that concept processing affected the processing of emotional faces, but not vice versa: the relationship was asymmetric. From these two experiments we can conclude that the depth of semantic processing affected the relationship between emotional concept processing and emotional face perception. These results, along with others in the literature, indicate that conceptual processing uses sensorimotor representations. In Experiment 3, we further explored the effect of the depth of perceptual processing on this relationship by varying face duration. When the duration of the emotional face was shortened, emotional face perception did not affect emotional concept processing, and vice versa; there was no interaction between the two. Thus, our results suggest that the depth of perceptual processing also influences the relationship between emotional concept processing and emotional face perception.

  8. Faces of Shame: Implications for Self-Esteem, Emotion Regulation, Aggression, and Well-Being.

    Science.gov (United States)

    Velotti, Patrizia; Garofalo, Carlo; Bottazzi, Federica; Caretti, Vincenzo

    2017-02-17

    There is an increasing interest in psychological research on shame experiences and their associations with other aspects of psychological functioning and well-being, as well as with possible maladaptive outcomes. In an attempt to confirm and extend previous knowledge on this topic, we investigated the nomological network of shame experiences in a large community sample (N = 380; 66.1% females), adopting a multidimensional conceptualization of shame. Females reported higher levels of shame (in particular, bodily and behavioral shame), guilt, psychological distress, emotional reappraisal, and hostility. Males had higher levels of self-esteem, emotional suppression, and physical aggression. Shame feelings were associated with low self-esteem, hostility, and psychological distress in a consistent way across gender. Associations between characterological shame and emotional suppression, as well as between bodily shame and anger occurred only among females. Moreover, characterological and bodily shame added to the prediction of low self-esteem, hostility, and psychological distress above and beyond the influence of trait shame. Finally, among females, emotional suppression mediated the influence of characterological shame on hostility and psychological distress. These findings extend current knowledge on the nomological net surrounding shame experiences in everyday life, supporting the added value of a multidimensional conceptualization of shame feelings.

  9. Recognizing emotions in faces : effects of acute tryptophan depletion and bright light

    NARCIS (Netherlands)

    aan het Rot, Marije; Coupland, Nicholas; Boivin, Diane B.; Benkelfat, Chawki; Young, Simon N.

    2010-01-01

    In healthy never-depressed individuals, acute tryptophan depletion (ATD) may selectively decrease the accurate recognition of fearful facial expressions. Here we investigated the perception of facial emotions after ATD in more detail. We also investigated whether bright light, which can reverse ATD's effects…

  10. Facial Electromyographic Responses to Emotional Information from Faces and Voices in Individuals with Pervasive Developmental Disorder

    Science.gov (United States)

    Magnee, Maurice J. C. M.; de Gelder, Beatrice; van Engeland, Herman; Kemner, Chantal

    2007-01-01

    Background: Despite extensive research, it is still debated whether impairments in social skills of individuals with pervasive developmental disorder (PDD) are related to specific deficits in the early processing of emotional information. We aimed to test both automatic processing of facial affect as well as the integration of auditory and visual…

  11. Exploring the space of emotional faces of subjects without acting experience

    NARCIS (Netherlands)

    J. Hendrix (Jeroen); Z.M. Ruttkay

    2000-01-01

    The current state of semi-automated facial animation demands a better understanding of how facial expressions are produced by real people. As a first step in a series of empirical studies, we investigated snapshots of facial expressions of the six basic emotions, produced by 18 subjects in…

  13. Processing of masked and unmasked emotional faces under different attentional conditions: an electrophysiological investigation.

    Directory of Open Access Journals (Sweden)

    Marzia Del Zotto

    2015-10-01

    In order to investigate the interactions between non-spatial selective attention, awareness and emotion processing, we carried out an ERP study using a backward masking paradigm, in which angry, fearful, happy and neutral facial expressions were presented while participants attempted to detect the presence of one or the other category of facial expressions in the different experimental blocks. ERP results showed that negative emotions enhanced an early N170 response over temporal-occipital leads in both masked and unmasked conditions, independently of selective attention. A later effect arising at the P2 was linked to awareness. Finally, selective attention was found to affect the N2 and N3 components over occipito-parietal leads. Our findings reveal that (i) the initial processing of facial expressions arises prior to attention and awareness; (ii) attention and awareness give rise to temporally distinct periods of activation, independently of the type of emotion, with only a partial degree of overlap; and (iii) selective attention appears to be influenced by the emotional nature of the stimuli, which in turn impinges on unconscious processing at a very early stage. This study confirms previous reports that negative facial expressions can be processed rapidly, in the absence of visual awareness and independently of selective attention. On the other hand, attention and awareness may operate in a synergistic way, depending on task demands.

  14. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    Directory of Open Access Journals (Sweden)

    Letizia Palumbo

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  15. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    Science.gov (United States)

    Palumbo, Letizia; Jellema, Tjeerd

    2013-01-01

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  16. Fusiform gyrus dysfunction is associated with perceptual processing efficiency to emotional faces in adolescent depression: a model-based approach

    Directory of Open Access Journals (Sweden)

    Tiffany Cheing Ho

    2016-02-01

    While the extant literature has focused on major depressive disorder (MDD) as being characterized by abnormalities in processing affective stimuli (e.g., facial expressions), little is known regarding which specific aspects of cognition influence the evaluation of affective stimuli, and what the underlying neural correlates are. To investigate these issues, we assessed 26 adolescents diagnosed with MDD and 37 well-matched healthy controls (HCL) who completed an emotion identification task of dynamically morphing faces during functional magnetic resonance imaging (fMRI). We analyzed the behavioral data using a sequential sampling model of response time (RT) commonly used to elucidate aspects of cognition in binary perceptual decision making tasks: the Linear Ballistic Accumulator (LBA) model. Using a hierarchical Bayesian estimation method, we obtained group-level and individual-level estimates of LBA parameters on the facial emotion identification task. While the MDD and HCL groups did not differ in mean RT, accuracy, or group-level estimates of perceptual processing efficiency (i.e., the drift rate parameter of the LBA), the MDD group showed significantly reduced responses in the left fusiform gyrus compared to the HCL group during the facial emotion identification task. Furthermore, within the MDD group, fMRI signal in the left fusiform gyrus during affective face processing was significantly associated with greater individual-level estimates of perceptual processing efficiency. Our results therefore suggest that affective processing biases in adolescents with MDD are characterized by greater perceptual processing efficiency of affective visual information in sensory brain regions responsible for the early processing of visual information. The theoretical, methodological, and clinical implications of our results are discussed.
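    The Linear Ballistic Accumulator referenced above can be illustrated with a minimal simulation (parameter values are illustrative, not the paper's hierarchical Bayesian estimates): accumulators race linearly to a threshold, and the drift rate is the perceptual-processing-efficiency parameter, so a higher drift for the correct response yields faster, more accurate emotion identifications.

```python
import numpy as np

def lba_trial(drifts, b=1.0, A=0.5, t0=0.2, s=0.25, rng=None):
    """One Linear Ballistic Accumulator trial.

    Each accumulator i starts at a uniform point in [0, A] and rises
    linearly at a rate drawn from Normal(drifts[i], s); the first to
    reach threshold b determines the choice, and t0 is non-decision time.
    """
    rng = rng if rng is not None else np.random.default_rng()
    starts = rng.uniform(0, A, size=len(drifts))
    rates = rng.normal(drifts, s)
    rates = np.where(rates <= 0, 1e-6, rates)   # keep rates positive
    times = (b - starts) / rates
    winner = int(np.argmin(times))
    return winner, t0 + times[winner]

rng = np.random.default_rng(1)
# Accumulator 0 = correct emotion label; its higher drift models more
# efficient perceptual processing of the face.
results = [lba_trial([1.2, 0.6], rng=rng) for _ in range(5000)]
accuracy = np.mean([w == 0 for w, _ in results])
mean_rt = np.mean([rt for _, rt in results])
print(f"accuracy={accuracy:.2f}, mean RT={mean_rt:.2f}s")
```

    In the hierarchical Bayesian setting of the study, parameters such as the drift rate are estimated per participant with group-level priors, rather than fixed as here; the simulation only shows why the drift rate can be read as perceptual processing efficiency.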

  17. vMMN for schematic faces: automatic detection of change in emotional expression

    Directory of Open Access Journals (Sweden)

    Kairi Kreegipuu

    2013-10-01

    Our brain is able to automatically detect changes in sensory stimulation, including in vision. A large variety of feature changes in stimulation elicit a deviance-reflecting ERP component known as the mismatch negativity (MMN). The present study has three main goals: (1) to register the vMMN using a rapidly presented stream of schematic faces (neutral, happy, angry; adapted from Öhman et al., 2001); (2) to compare the vMMNs elicited by angry and happy schematic faces in two different paradigms, a traditional oddball design with frequent standard and rare target and deviant stimuli (12.5% each) and a version of an optimal multi-feature paradigm with several deviant stimuli (altogether 37.5% of the stimulus block); and (3) to compare vMMNs to subjective ratings of valence, arousal and attention capture for happy and angry schematic faces, i.e., to estimate the effect of the affective value of stimuli on their automatic detection. Eleven observers (19-32 years, 6 women) took part in both experiments, an oddball and an optimum paradigm. Stimuli were rapidly presented schematic faces and an object with face features that served as the target stimulus, to be detected by a button press. Results show that a vMMN-type response at posterior sites was equally elicited in both experiments. Post-experimental reports confirmed that the angry face attracted more automatic attention than the happy face, but the difference did not emerge directly at the ERP level. Thus, when interested in studying change detection in facial expressions, we encourage the use of the optimum (multi-feature) design in order to save time and other experimental resources.

  18. Writ Large on Your Face: Observing Emotions Using Automatic Facial Analysis

    Directory of Open Access Journals (Sweden)

    Dieckmann Anja

    2014-05-01

    Emotions affect all of our daily decisions and, of course, they also influence our evaluations of brands, products and advertisements. But what exactly do consumers feel when they watch a TV commercial, visit a website or interact with a brand in different ways? Measuring such emotions is not an easy task. In the past, the effectiveness of marketing material was evaluated mostly by subsequent surveys. Now, with the emergence of neuroscientific approaches like EEG, the measurement of real-time reactions is possible, for instance while watching a commercial. However, most neuroscientific procedures are fairly invasive and irritating: for an EEG, numerous electrodes need to be placed on the participant's scalp. Furthermore, data analysis is highly complex, and scientific expertise is necessary for interpretation, so the procedure remains a black box to most practitioners and the results are still rather controversial. By contrast, automatic facial analysis provides similar information without having to wire up study participants. In addition, the results of such analyses are intuitive and easy to interpret even for laypeople. These convincing advantages led the GfK Company to decide on facial analysis and to develop a tool for measuring emotional responses to marketing stimuli that is easily applicable in marketing research practice.

  19. One size does not fit all: face emotion processing impairments in semantic dementia, behavioural-variant frontotemporal dementia and Alzheimer's disease are mediated by distinct cognitive deficits.

    Science.gov (United States)

    Miller, Laurie A; Hsieh, Sharpley; Lah, Suncica; Savage, Sharon; Hodges, John R; Piguet, Olivier

    2012-01-01

    Patients with frontotemporal dementia (both behavioural variant [bvFTD] and semantic dementia [SD]) as well as those with Alzheimer's disease (AD) show deficits on tests of face emotion processing, yet the mechanisms underlying these deficits have rarely been explored. We compared groups of patients with bvFTD (n = 17), SD (n = 12) or AD (n = 20) to an age- and education-matched group of healthy control subjects (n = 36) on three face emotion processing tasks (Ekman 60, Emotion Matching and Emotion Selection) and found that all three patient groups were similarly impaired. Analyses of covariance employed to partial out the influences of language and perceptual impairments, which frequently co-occur in these patients, provided evidence of different underlying cognitive mechanisms. These analyses revealed that language impairments explained the original poor scores obtained by the SD patients on the Ekman 60 and Emotion Selection tasks, which involve verbal labels. Perceptual deficits contributed to Emotion Matching performance in the bvFTD and AD patients. Importantly, all groups remained impaired on one task or more following these analyses, denoting a primary emotion processing disturbance in these dementia syndromes. These findings highlight the multifactorial nature of emotion processing deficits in patients with dementia.

  1. Attention Bias to Emotional Faces Varies by IQ and Anxiety in Williams Syndrome

    Science.gov (United States)

    McGrath, Lauren M.; Oates, Joyce M.; Dai, Yael G.; Dodd, Helen F.; Waxler, Jessica; Clements, Caitlin C.; Weill, Sydney; Hoffnagle, Alison; Anderson, Erin; MacRae, Rebecca; Mullett, Jennifer; McDougle, Christopher J.; Pober, Barbara R.; Smoller, Jordan W.

    2016-01-01

    Individuals with Williams syndrome (WS) often experience significant anxiety. A promising approach to anxiety intervention has emerged from cognitive studies of attention bias to threat. To investigate the utility of this intervention in WS, this study examined attention bias to happy and angry faces in individuals with WS (N = 46). Results showed…

  2. Face Emotion Processing in Depressed Children and Adolescents with and without Comorbid Conduct Disorder

    Science.gov (United States)

    Schepman, Karen; Taylor, Eric; Collishaw, Stephan; Fombonne, Eric

    2012-01-01

    Studies of adults with depression point to characteristic neurocognitive deficits, including differences in processing facial expressions. Few studies have examined face processing in juvenile depression, or taken account of other comorbid disorders. Three groups were compared: depressed children and adolescents with conduct disorder (n = 23),…

  3. Neural Correlates of Task-Irrelevant First and Second Language Emotion Words — Evidence from the Face-Word Stroop Task

    Directory of Open Access Journals (Sweden)

    Lin Fan

    2016-11-01

    Emotionally valenced words have thus far not been empirically examined in a bilingual population with the emotional face-word Stroop paradigm. Chinese-English bilinguals were asked to identify facial expressions of emotion with task-irrelevant first-language (L1) or second-language (L2) emotion words superimposed on the face pictures. We examined how the emotional content of words modulates behavioral performance and cerebral functioning in the bilinguals' two languages. The results indicated significant congruency effects for both L1 and L2 emotion words, as well as identifiable differences in the magnitude of the Stroop effect between the two languages, suggesting that L1 is more capable of activating an emotional response to word stimuli. In the event-related potential (ERP) data, an N350-550 effect was observed only in the L1 task, with greater negativity for incongruent than congruent trials. The size of the N350-550 effect differed across languages, whereas no identifiable language distinction was observed in the conflict slow potential (conflict SP). Finally, a more pronounced negative amplitude at 230-330 ms was observed in L1 than in L2, but only for incongruent trials. This negativity, likened to an orthographic decoding N250, may reflect the extent of attention to emotion-word processing at the word-form level, while the N350-550 reflects a complicated set of processes in conflict processing. Overall, the face-word congruency effect showed identifiable language distinctions at 230-330 and 350-550 ms, providing supporting evidence for theoretical proposals that assume attenuated emotionality of L2 processing.

  4. Attentional biases for emotional faces in young children of mothers with chronic or recurrent depression.

    Science.gov (United States)

    Kujawa, Autumn J; Torpey, Dana; Kim, Jiyon; Hajcak, Greg; Rose, Suzanne; Gotlib, Ian H; Klein, Daniel N

    2011-01-01

    Attentional biases for negative stimuli have been observed in school-age and adolescent children of depressed mothers and may reflect a vulnerability to depression. The direction of these biases and whether they can be identified in early childhood remains unclear. The current study examined attentional biases in 5-7-year-old children of depressed and non-depressed mothers. Following a mood induction, children participated in a dot-probe task assessing biases for sad and happy faces. There was a significant interaction of group and sex: daughters of depressed mothers attended selectively to sad faces, while children of controls and sons of depressed mothers did not exhibit biases. No effects were found for happy stimuli. These findings suggest that attentional biases are discernible in early childhood and may be vulnerability markers for depression. The results also raise the possibility that sex differences in cognitive biases are evident before the emergence of sex differences in the prevalence of depression.
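    The attention-bias index computed from a dot-probe task is conventionally the difference between mean response times when the probe replaces the neutral versus the emotional face. A minimal sketch with hypothetical reaction times (the study's actual scoring details are not given in the abstract):

```python
import numpy as np

def attention_bias(rt_probe_at_neutral, rt_probe_at_emotional):
    """Dot-probe bias score: mean RT when the probe replaces the neutral
    face minus mean RT when it replaces the emotional face. Positive
    values indicate vigilance toward the emotional face."""
    return float(np.mean(rt_probe_at_neutral) - np.mean(rt_probe_at_emotional))

# Hypothetical child who responds faster when the probe replaces a sad face,
# i.e. attention was already at the sad face's location.
rt_neutral = [520, 540, 530, 550]
rt_sad = [480, 500, 490, 510]
bias = attention_bias(rt_neutral, rt_sad)
print(bias)  # 40.0 ms: selective attention toward sad faces
```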

  5. Emotions

    DEFF Research Database (Denmark)

    2017-01-01

    Observing science classroom activities presents an opportunity to observe the emotional aspect of interactions, and this chapter presents how this can be done and why. Drawing on ideas proposed by the French philosopher Maurice Merleau-Ponty, emotions are theorized as publicly embodied enactments, where differences in behavior between people shape emotional responses. Merleau-Ponty's theorization of the body and feelings is connected to embodiment while examining central concepts such as consciousness and perception. Merleau-Ponty describes what he calls the emotional atmosphere and how it shapes the ways we experience events and activities. We use our interpretation of his understanding of emotions to examine an example of a group of year 8 science students who were engaged in a physics activity, using the analytical framework of analyzing bodily stance by Goodwin, Cekaite, and Goodwin…

  6. The telltale face: possible mechanisms behind defector and cooperator recognition revealed by emotional facial expression metrics.

    Science.gov (United States)

    Kovács-Bálint, Zsófia; Bereczkei, Tamás; Hernádi, István

    2013-11-01

    In this study, we investigated the role of facial cues in cooperator and defector recognition. First, a face image database was constructed from pairs of full face portraits of target subjects taken at the moment of decision-making in a prisoner's dilemma game (PDG) and in a preceding neutral task. Image pairs with no deficiencies (n = 67) were standardized for orientation and luminance. Then, confidence in defector and cooperator recognition was tested with image rating in a different group of lay judges (n = 62). Results indicate that (1) defectors were better recognized (58% vs. 47%) and (2) they looked different from cooperators. In the facial microexpression analysis, defection was strongly linked with depressed lower lips and less opened eyes. Significant correlation was found between the intensity of micromimics and the rating of images in the cooperator-defector dimension. In summary, facial expressions can be considered as reliable indicators of momentary social dispositions in the PDG. Females may exhibit an evolutionary-based overestimation bias to detecting social visual cues of the defector face.
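    Luminance standardization of the kind mentioned above is commonly done by rescaling each grayscale image to a common mean and contrast. A minimal numpy sketch with synthetic images; the target values and the clipping step are illustrative assumptions, not the authors' stated procedure:

```python
import numpy as np

def standardize_luminance(image, target_mean=0.5, target_std=0.2):
    """Rescale a grayscale image so all images share the same mean
    luminance and contrast, then clip to the valid [0, 1] range."""
    img = image.astype(float)
    z = (img - img.mean()) / img.std()
    return np.clip(z * target_std + target_mean, 0.0, 1.0)

# Two hypothetical images with very different exposure become comparable.
dark = np.random.default_rng(1).uniform(0.0, 0.4, size=(64, 64))
bright = np.random.default_rng(2).uniform(0.6, 1.0, size=(64, 64))
a, b = standardize_luminance(dark), standardize_luminance(bright)
print(round(float(a.mean()), 2), round(float(b.mean()), 2))  # both near 0.5
```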

  7. 'It's intense, to an extent': a qualitative study of the emotional challenges faced by staff working on a treatment programme for intellectually disabled sex offenders.

    Science.gov (United States)

    Sandhu, Daljit K; Rose, John; Rostill-Brookes, Helen J; Thrift, Su

    2012-07-01

    This study explores the emotional challenges faced by staff working on a sex offender treatment programme for people with an intellectual disability. Semi-structured interviews were carried out with eight participants working on a treatment programme for sex offenders with an intellectual disability. Interviews were analysed using interpretative phenomenological analysis. Staff experienced a range of negative emotions that they dealt with in a variety of ways including through the use of humour and various emotional defences. Empathy was a challenging and complex issue with individuals taking a variety of positions. Staff awareness and understanding of the role of emotions in relation to their own well-being and in relation to therapeutic processes varied. Emotional intelligence was associated with greater therapeutic understanding. Recommendations are made in relation to personal and professional characteristics and need for clinical supervision to support staff well-being and the development of therapeutic competence and effectiveness. © 2012 Blackwell Publishing Ltd.

  8. Differential effects of face-realism and emotion on event-related brain potentials and their implications for the uncanny valley theory

    Science.gov (United States)

    Schindler, Sebastian; Zell, Eduard; Botsch, Mario; Kissler, Johanna

    2017-03-01

    Cartoon characters are omnipresent in popular media. While few studies have scientifically investigated their processing, in computer graphics, efforts are made to increase realism. Yet, close approximations of reality have been suggested to sometimes evoke a feeling of eeriness, the “uncanny valley” effect. Here, we used high-density electroencephalography to investigate brain responses to professionally stylized happy, angry, and neutral character faces. We employed six face-stylization levels varying from abstract to realistic and investigated the N170, early posterior negativity (EPN), and late positive potential (LPP) event-related components. The face-specific N170 showed a U-shaped modulation, with stronger reactions towards both the most abstract and the most realistic compared to medium-stylized faces. For abstract faces, the N170 was generated more occipitally than for real faces, implying a stronger reliance on structural processing. Although emotional faces elicited the highest amplitudes on both N170 and EPN, realism and expression interacted on the N170. Finally, the LPP increased linearly with face realism, reflecting increased activity in visual and parietal cortex for more realistic faces. Results reveal differential effects of face stylization on distinct face processing stages and suggest a perceptual basis to the uncanny valley hypothesis. They are discussed in relation to face perception, media design, and computer graphics.
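    ERP components like those named above are typically quantified as the mean amplitude of the waveform within a component-specific time window. The window boundaries below are illustrative assumptions (exact windows vary between studies), and the waveform is synthetic:

```python
import numpy as np

# Hypothetical analysis windows in seconds for the components named in
# the abstract; actual windows differ between studies.
WINDOWS = {"N170": (0.15, 0.20), "EPN": (0.20, 0.35), "LPP": (0.40, 0.60)}

def mean_amplitudes(erp, times, windows=WINDOWS):
    """Average an ERP waveform within each component's time window."""
    return {name: float(erp[(times >= lo) & (times < hi)].mean())
            for name, (lo, hi) in windows.items()}

# Synthetic ERP: a negative deflection around 170 ms on a flat baseline.
times = np.arange(-0.1, 0.8, 0.001)
erp = np.zeros_like(times)
erp[(times >= 0.15) & (times < 0.20)] = -4.0  # microvolts
amps = mean_amplitudes(erp, times)
print(amps["N170"], amps["LPP"])  # -4.0 0.0
```

    Condition effects (e.g. emotional versus neutral, or the linear realism effect on the LPP) are then tested on these per-condition mean amplitudes.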

  11. Loss of Sustained Activity in the Ventromedial Prefrontal Cortex in Response to Repeated Stress in Individuals with Early-Life Emotional Abuse: Implications for Depression Vulnerability

    Directory of Open Access Journals (Sweden)

    Lihong eWang

    2013-06-01

    Repeated psychosocial stress in early life has a significant impact on both behavior and neural function which, together, increase vulnerability to depression. However, the neural mechanisms related to repeated stress remain unclear. We hypothesize that early-life stress may result in a reduced capacity for cognitive control in response to a repeated stressor, particularly in individuals who developed maladaptive emotional processing strategies, namely trait rumination. Individuals who encountered early-life stress but have adaptive emotional processing, namely trait mindfulness, may demonstrate the opposite pattern. Using a mental arithmetic task to induce mild stress and a mindful breathing task to induce a mindful state, we tested this hypothesis by examining blood perfusion changes over time in healthy young men. We found that subjects with early-life stress, particularly emotional abuse, failed to sustain neural activation in the orbitofrontal and ventromedial prefrontal cortex (vmPFC) over time. Given that the vmPFC is known to regulate amygdala activity during emotional processing, we subsequently compared perfusion in the vmPFC and the amygdala in depression-vulnerable (early-life stress with high rumination) and resilient (early-life stress with high mindfulness) subjects. We found that depression-vulnerable subjects had increased amygdala perfusion and reduced vmPFC perfusion during the later runs compared with the earlier stressful task runs. In contrast, depression-resilient individuals showed the reverse pattern. Our results indicate that the vmPFC of depression-vulnerable subjects may have a limited capacity to inhibit amygdala activation to repeated stress over time, whereas the vmPFC in resilient individuals may adapt to stress quickly. This pilot study warrants future investigation to clarify stress-related neural activity patterns dynamically, to identify depression vulnerability at the individual level.

  12. Emotion talk in the context of young people self-harming: facing the feelings in family therapy.

    Science.gov (United States)

    Rogers, Alice; Schmidt, Petra

    2016-04-01

    This article describes the use of emotion talk in the context of using a manualised approach to family therapy where the presenting problem is self-harm. Whilst we understand that there is an internal aspect to emotion, we also consider emotions to be socially purposeful, culturally constructed and interactional. We found that within the presenting families, negative emotions were often talked about as located within the young person. Through using 'emotion talk' (Fredman, 2004) in deconstructing and tracking emotions and exploring how emotions connected to family-of-origin and cultural contexts, we developed an interactional understanding of these emotions. This led to better emotional regulation within the family and offered alternative ways of relating. The article discusses the use of relational reflexivity, and using the therapist and team's emotions to enable the therapeutic process, encouraging reflexivity on the self of the therapist in relation to work with emotions.

  13. Developing an eBook-Integrated High-Fidelity Mobile App Prototype for Promoting Child Motor Skills and Taxonomically Assessing Children's Emotional Responses Using Face and Sound Topology.

    Science.gov (United States)

    Brown, William; Liu, Connie; John, Rita Marie; Ford, Phoebe

    2014-01-01

    Developing gross and fine motor skills and expressing complex emotion are critical for child development. We introduce "StorySense", an eBook-integrated mobile app prototype that can sense face and sound topologies and identify movement and expression to promote children's motor skills and emotional development. Currently, most interactive eBooks on mobile devices only leverage "low-motor" interaction (i.e. tapping or swiping). Our app senses a greater breadth of motion (e.g. clapping, snapping, and face tracking), and dynamically alters the storyline according to physical responses in ways that encourage the performance of predetermined motor skills ideal for a child's gross and fine motor development. In addition, our app can capture changes in facial topology, which can later be mapped using the Facial Action Coding System (FACS) for later interpretation of emotion. StorySense expands the human computer interaction vocabulary for mobile devices. Potential clinical applications include child development, physical therapy, and autism.

  14. Face-to-face: Perceived personal relevance amplifies face processing.

    Science.gov (United States)

    Bublatzky, Florian; Pittig, Andre; Schupp, Harald T; Alpers, Georg W

    2017-05-01

    The human face conveys emotional and social information, but it is not well understood how these two aspects influence face perception. In order to model a group situation, two faces displaying happy, neutral or angry expressions were presented. Importantly, faces were either facing the observer, or they were presented in profile view directed towards, or looking away from each other. In Experiment 1 (n = 64), face pairs were rated regarding perceived relevance, wish-to-interact, and displayed interactivity, as well as valence and arousal. All variables revealed main effects of facial expression (emotional > neutral), face orientation (facing observer > towards > away) and interactions showed that evaluation of emotional faces strongly varies with their orientation. Experiment 2 (n = 33) examined the temporal dynamics of perceptual-attentional processing of these face constellations with event-related potentials. Processing of emotional and neutral faces differed significantly in N170 amplitudes, early posterior negativity (EPN), and sustained positive potentials. Importantly, selective emotional face processing varied as a function of face orientation, indicating early emotion-specific (N170, EPN) and late threat-specific effects (LPP, sustained positivity). Taken together, perceived personal relevance to the observer-conveyed by facial expression and face direction-amplifies emotional face processing within triadic group situations. © The Author (2017). Published by Oxford University Press.

  15. Effects of Training of Affect Recognition on the recognition and visual exploration of emotional faces in schizophrenia.

    Science.gov (United States)

    Drusch, Katharina; Stroth, Sanna; Kamp, Daniel; Frommann, Nicole; Wölwer, Wolfgang

    2014-11-01

    Schizophrenia patients have impairments in facial affect recognition and display scanpath abnormalities during the visual exploration of faces. These abnormalities are characterized by fewer fixations on salient feature areas and longer fixation durations. The present study investigated whether social-cognitive remediation not only improves performance in facial affect recognition but also normalizes patients' gaze behavior while looking at faces. Within a 2 × 2-design (group × time), 16 schizophrenia patients and 16 healthy controls performed a facial affect recognition task with concomitant infrared oculography at baseline (T0) and after six weeks (T1). Between the measurements, patients completed the Training of Affect Recognition (TAR) program. The influence of the training on facial affect recognition (percent of correct answers) and gaze behavior (number and mean duration of fixations into salient or non-salient facial areas) was assessed. In line with former studies, at baseline patients showed poorer facial affect recognition than controls and aberrant scanpaths, and after TAR facial affect recognition was improved. Concomitant with improvements in performance, the number of fixations in feature areas ('mouth') increased while fixations in non-feature areas ('white space') decreased. However, the change in fixation behavior did not correlate with the improvement in performance. After TAR, patients pay more attention to facial areas that contain information about a displayed emotion. Although this may contribute to the improved performance, the lack of a statistical correlation implies that this factor is not sufficient to explain the underlying mechanism of the treatment effect. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Emotion

    DEFF Research Database (Denmark)

    Jantzen, Christian; Vetner, Mikael

    2006-01-01

    [Translated from Danish] An emotion is an evaluative response to a significant event that has affective valence and motivates the organism in relation to the world of objects (its surroundings). Emotions lead to affect: to pain (negative affect) or pleasure (positive affect). Both positive and negative emotions influence the organism's…

  17. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing.

    Science.gov (United States)

    Balconi, Michela; Canavesio, Ylenia

    2016-01-01

    The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assume that empathy influences cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.
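    Correlations among such multi-measure variables are typically summarized in a correlation matrix. A minimal sketch with entirely hypothetical data for six participants (the real study's values are not reported in the abstract); here the toy measures are constructed to covary perfectly, so the off-diagonal correlations are at ceiling:

```python
import numpy as np

# Hypothetical per-participant measures: trait empathy score, response
# times (ms), and N200 amplitude (microvolts; more negative = larger).
empathy = np.array([10.0, 14.0, 18.0, 22.0, 26.0, 30.0])
rts = np.array([620.0, 600.0, 580.0, 560.0, 540.0, 520.0])
n200 = np.array([-2.0, -3.0, -4.0, -5.0, -6.0, -7.0])

# Rows are variables, columns are participants.
corr = np.corrcoef(np.vstack([empathy, rts, n200]))
# Higher empathy goes with faster RTs and larger (more negative) N200.
print(round(corr[0, 1], 2), round(corr[0, 2], 2))  # -1.0 -1.0
```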

  18. Does a single session of electroconvulsive therapy alter the neural response to emotional faces in depression? A randomised sham-controlled functional magnetic resonance imaging study.

    Science.gov (United States)

    Miskowiak, Kamilla W; Kessing, Lars V; Ott, Caroline V; Macoveanu, Julian; Harmer, Catherine J; Jørgensen, Anders; Revsbech, Rasmus; Jensen, Hans M; Paulson, Olaf B; Siebner, Hartwig R; Jørgensen, Martin B

    2017-09-01

    Negative neurocognitive bias is a core feature of major depressive disorder that is reversed by pharmacological and psychological treatments. This double-blind functional magnetic resonance imaging study investigated for the first time whether electroconvulsive therapy modulates negative neurocognitive bias in major depressive disorder. Patients with major depressive disorder were randomised to one active (n = 15) or sham (n = 12) electroconvulsive therapy session. The following day they underwent whole-brain functional magnetic resonance imaging at 3T while viewing emotional faces and performed facial expression recognition and dot-probe tasks. A single electroconvulsive therapy session had no effect on amygdala response to emotional faces. Whole-brain analysis revealed no effects of electroconvulsive therapy versus sham therapy after family-wise error correction at the cluster level, using a cluster-forming threshold of Z > 3.1. At a more lenient threshold (Z > 2.3), there were trend-level electroconvulsive therapy-induced changes in parahippocampal and superior frontal responses to fearful versus happy faces, as well as in fear-specific functional connectivity between the amygdala and occipito-temporal regions. Across all patients, greater fear-specific amygdala-occipital coupling correlated with lower fear vigilance. Despite no statistically significant shift in neural response to faces after a single electroconvulsive therapy session, the observed trend changes point to an early shift in emotional processing that may contribute to the antidepressant effects of electroconvulsive therapy.

  19. Face-to-Face and Online: An Investigation of Children's and Adolescents' Bullying Behavior through the Lens of Moral Emotions and Judgments

    Science.gov (United States)

    Conway, Lauryn; Gomez-Garibello, Carlos; Talwar, Victoria; Shariff, Shaheen

    2016-01-01

    The current study investigated the influence of type of aggression (cyberbullying or traditional bullying) and participant role (bystander or perpetrator) on children and adolescents' self-attribution of moral emotions and judgments, while examining the influence of chronological age. Participants (N = 122, 8-16 years) evaluated vignettes and were…

  20. Shades of Emotion: What the Addition of Sunglasses or Masks to Faces Reveals about the Development of Facial Expression Processing

    Science.gov (United States)

    Roberson, Debi; Kikutani, Mariko; Doge, Paula; Whitaker, Lydia; Majid, Asifa

    2012-01-01

    Three studies investigated developmental changes in facial expression processing, between 3 years-of-age and adulthood. For adults and older children, the addition of sunglasses to upright faces caused an equivalent decrement in performance to face inversion. However, younger children showed "better" classification of expressions of faces wearing…

  2. Event-related brain responses to emotional words, pictures and faces – A cross-domain comparison

    Directory of Open Access Journals (Sweden)

    Mareike eBayer

    2014-10-01

    Emotion effects in event-related brain potentials (ERPs) have previously been reported for a range of visual stimuli, including emotional words, pictures, and facial expressions. Still, little is known about the actual comparability of emotion effects across these stimulus classes. The present study aimed to fill this gap by investigating emotion effects in response to words, pictures, and facial expressions using a blocked within-subject design. Furthermore, ratings of stimulus arousal and valence were collected from an independent sample of participants. Modulations of early posterior negativity (EPN) and late positive complex (LPC) were visible for all stimulus domains, but showed clear differences, particularly in valence processing. While emotion effects were limited to positive stimuli for words, they were predominant for negative stimuli in pictures and facial expressions. These findings corroborate the notion of a positivity offset for words and a negativity bias for pictures and facial expressions, which was assumed to be caused by generally lower arousal levels of written language. Interestingly, however, these assumed differences were not confirmed by arousal ratings. Instead, words were rated as overall more positive than pictures and facial expressions. Taken together, the present results point towards systematic differences in the processing of written words and pictorial stimuli of emotional content, not only in terms of a valence bias evident in ERPs, but also concerning their emotional evaluation captured by ratings of stimulus valence and arousal.

  4. Association of trait emotional intelligence and individual fMRI-activation patterns during the perception of social signals from voice and face.

    Science.gov (United States)

    Kreifelts, Benjamin; Ethofer, Thomas; Huberle, Elisabeth; Grodd, Wolfgang; Wildgruber, Dirk

    2010-07-01

    Multimodal integration of nonverbal social signals is essential for successful social interaction. Previous studies have implicated the posterior superior temporal sulcus (pSTS) in the perception of social signals such as nonverbal emotional signals as well as in social cognitive functions like mentalizing/theory of mind. In the present study, we evaluated the relationships between trait emotional intelligence (EI) and fMRI activation patterns in individual subjects during the multimodal perception of nonverbal emotional signals from voice and face. Trait EI was linked to hemodynamic responses in the right pSTS, an area which also exhibits a distinct sensitivity to human voices and faces. Within all other regions known to subserve the perceptual audiovisual integration of human social signals (i.e., amygdala, fusiform gyrus, thalamus), no such linked responses were observed. This functional difference in the network for the audiovisual perception of human social signals indicates a specific contribution of the pSTS as a possible interface between the perception of social information and social cognition.

  5. The Perception and Identification of Facial Emotions in Individuals with Autism Spectrum Disorders Using the "Let's Face It!" Emotion Skills Battery

    Science.gov (United States)

    Tanaka, James W.; Wolf, Julie M.; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S.; South, Mikle; McPartland, James C.; Kaiser, Martha D.; Schultz, Robert T.

    2012-01-01

    Background: Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret…

  7. Adjunctive selective estrogen receptor modulator increases neural activity in the hippocampus and inferior frontal gyrus during emotional face recognition in schizophrenia.

    Science.gov (United States)

    Ji, E; Weickert, C S; Lenroot, R; Kindler, J; Skilleter, A J; Vercammen, A; White, C; Gur, R E; Weickert, T W

    2016-05-03

Estrogen has been implicated in the development and course of schizophrenia with most evidence suggesting a neuroprotective effect. Treatment with raloxifene, a selective estrogen receptor modulator, can reduce symptom severity, improve cognition and normalize brain activity during learning in schizophrenia. People with schizophrenia are especially impaired in the identification of negative facial emotions. The present study was designed to determine the extent to which adjunctive raloxifene treatment would alter abnormal neural activity during angry facial emotion recognition in schizophrenia. Twenty people with schizophrenia (12 men, 8 women) participated in a 13-week, randomized, double-blind, placebo-controlled, crossover trial of adjunctive raloxifene treatment (120 mg per day orally) and performed a facial emotion recognition task during functional magnetic resonance imaging after each treatment phase. Two-sample t-tests in regions of interest selected a priori were performed to assess activation differences between raloxifene and placebo conditions during the recognition of angry faces. Adjunctive raloxifene significantly increased activation in the right hippocampus and left inferior frontal gyrus compared with the placebo condition (family-wise error corrected) in schizophrenia. These findings support the hypothesis that estrogen plays a modifying role in schizophrenia and show that adjunctive raloxifene treatment may reverse abnormal neural activity during facial emotion recognition, which is relevant to impaired social functioning in men and women with schizophrenia. PMID:27138794

  9. The influence of combined cognitive plus social-cognitive training on amygdala response during face emotion recognition in schizophrenia.

    Science.gov (United States)

    Hooker, Christine I; Bruce, Lori; Fisher, Melissa; Verosky, Sara C; Miyakawa, Asako; D'Esposito, Mark; Vinogradov, Sophia

    2013-08-30

Both cognitive and social-cognitive deficits impact functional outcome in schizophrenia. Cognitive remediation studies indicate that targeted cognitive and/or social-cognitive training improves behavioral performance on trained skills. However, the neural effects of training in schizophrenia and their relation to behavioral gains are largely unknown. This study tested whether a 50-h intervention which included both cognitive and social-cognitive training would influence neural mechanisms that support social cognition. Schizophrenia participants completed a computer-based intervention of either auditory-based cognitive training (AT) plus social-cognition training (SCT) (N=11) or non-specific computer games (CG) (N=11). Assessments included a functional magnetic resonance imaging (fMRI) task of facial emotion recognition, and behavioral measures of cognition, social cognition, and functional outcome. The fMRI results showed the predicted group-by-time interaction. Results were strongest for emotion recognition of happy, surprise and fear: relative to CG participants, AT+SCT participants showed a neural activity increase in bilateral amygdala, right putamen and right medial prefrontal cortex. Across all participants, pre-to-post intervention neural activity increase in these regions predicted behavioral improvement on an independent emotion perception measure (MSCEIT: Perceiving Emotions). Among AT+SCT participants alone, neural activity increase in right amygdala predicted behavioral improvement in emotion perception. The findings indicate that combined cognition and social-cognition training improves neural systems that support social-cognition skills.

  10. Facing the challenge of teaching emotions to individuals with low- and high-functioning autism using a new Serious game: a pilot study.

    Science.gov (United States)

    Serret, Sylvie; Hun, Stephanie; Iakimova, Galina; Lozada, Jose; Anastassova, Margarita; Santos, Andreia; Vesperini, Stephanie; Askenazy, Florence

    2014-01-01

It is widely accepted that emotion processing difficulties are involved in Autism Spectrum Conditions (ASC). An increasing number of studies have focused on the development of training programs and have shown promising results. However, most of these programs are appropriate for individuals with high-functioning ASC (HFA) but exclude individuals with low-functioning ASC (LFA). We have developed a computer-based game called JeStiMulE based on logical skills to teach emotions to individuals with ASC, independently of their age, intellectual, verbal and academic level. The aim of the present study was to verify the usability of JeStiMulE (i.e., its adaptability, effectiveness and efficiency) on a heterogeneous ASC group. We hypothesized that after JeStiMulE training, a performance improvement would be found in emotion recognition tasks. A heterogeneous group of thirty-three children and adolescents with ASC received two one-hour JeStiMulE sessions per week over four weeks. In order to verify the usability of JeStiMulE, game data were collected for each participant. Furthermore, all participants were presented before and after training with five emotion recognition tasks, two including pictures of game avatars (faces and gestures) and three including pictures of real-life characters (faces, gestures and social scenes). Descriptive data showed suitable adaptability, effectiveness and efficiency of JeStiMulE. Results revealed a significant main effect of Session on avatars (ANOVA: F (1,32) = 98.48, P < .001) and on pictures of real-life characters (ANOVA: F (1,32) = 49.09, P < .001). A significant Session × Task × Emotion interaction was also found for avatars (ANOVA: F (6,192) = 2.84, P = .01). This triple interaction was close to significance for pictures of real-life characters (ANOVA: F (12,384) = 1.73, P = .057). Post-hoc analyses revealed a significant increase after training in 30 out of 35 conditions. JeStiMulE appears…
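The Session main effects reported above are repeated-measures contrasts; with only two within-subject levels (pre vs. post training), the RM-ANOVA F statistic equals the square of the paired t statistic. A minimal sketch of that relationship in plain Python, with all scores invented for illustration (not the study's data):

```python
import math

def paired_t(pre, post):
    """Paired t statistic for pre- vs. post-training scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance of diffs
    return mean / math.sqrt(var / n)

# Hypothetical emotion-recognition accuracies for 6 participants.
pre  = [0.55, 0.60, 0.48, 0.52, 0.58, 0.50]
post = [0.70, 0.72, 0.60, 0.65, 0.74, 0.63]

t = paired_t(pre, post)
F = t ** 2  # with 2 within-subject levels, RM-ANOVA F(1, n-1) equals t squared
print(round(t, 2), round(F, 2))
```

For designs with more levels or factors (such as the Session × Task × Emotion interaction), a dedicated routine like statsmodels' `AnovaRM` would be the usual tool.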

  11. Computer-Assisted Face Processing Instruction Improves Emotion Recognition, Mentalizing, and Social Skills in Students with ASD

    Science.gov (United States)

    Rice, Linda Marie; Wall, Carla Anne; Fogel, Adam; Shic, Frederick

    2015-01-01

    This study examined the extent to which a computer-based social skills intervention called "FaceSay"™ was associated with improvements in affect recognition, mentalizing, and social skills of school-aged children with Autism Spectrum Disorder (ASD). "FaceSay"™ offers students simulated practice with eye gaze, joint attention,…

  12. Cognitive aging explains age-related differences in face-based recognition of basic emotions except for anger and disgust.

    Science.gov (United States)

    Suzuki, Atsunobu; Akiyama, Hiroko

    2013-01-01

    This study aimed at a detailed understanding of the possible dissociable influences of cognitive aging on the recognition of facial expressions of basic emotions (happiness, surprise, fear, anger, disgust, and sadness). The participants were 36 older and 36 young adults. They viewed 96 pictures of facial expressions and were asked to choose one emotion that best described each. Four cognitive tasks measuring the speed of processing and fluid intelligence were also administered, the scores of which were used to compute a composite measure of general cognitive ability. A series of hierarchical regression analyses revealed that age-related deficits in identifying happiness, surprise, fear, and sadness were statistically explained by general cognitive ability, while the differences in anger and disgust were not. This provides clear evidence that age-related cognitive impairment remarkably and differentially affects the recognition of basic emotions, contrary to the common view that cognitive aging has a uniformly minor effect.

  13. Differences in neural and cognitive response to emotional faces in middle-aged dizygotic twins at familial risk of depression

    DEFF Research Database (Denmark)

    Miskowiak, K W; Svendsen, Anne-Mette Bruun; Harmer, Catherine J

    2017-01-01

    -twin history of depression (high-risk) and 20 were without co-twin history of depression (low-risk). During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task...... and questionnaires assessing mood, personality traits and coping. RESULTS: Unexpectedly, high-risk twins showed reduced fear vigilance and lower recognition of fear and happiness relative to low-risk twins. During face processing in the scanner, high-risk twins displayed distinct negative functional coupling between...

  14. Cortisol-induced enhancement of emotional face processing in social phobia depends on symptom severity and motivational context

    NARCIS (Netherlands)

    Peer, J.M. van; Spinhoven, P.; Dijk, J.G. van; Roelofs, K.

    2009-01-01

    We investigated the effects of cortisol administration on approach and avoidance tendencies in 20 patients with social anxiety disorder (SAD). Event-related brain potentials (ERPs) were measured during a reaction time task, in which patients evaluated the emotional expression of photographs of happy

15. Effects of Self-esteem on Attentional Bias for Emotional Faces

    Institute of Scientific and Technical Information of China (English)

    杨娟; 李海江; 张庆林

    2012-01-01

Previous research shows that individuals with low self-esteem are highly vigilant toward and attentive to negative information, whereas individuals with high self-esteem attend relatively more to positive information without showing a clear bias. To record the time course of attentional selection precisely and to further examine the negative attentional bias of low self-esteem individuals from an electrophysiological perspective, the present study used event-related potential (ERP) technology to investigate attentional bias in individuals with different levels of self-esteem during a dot-probe task. In Experiment 1, 129 undergraduates viewed emotional face pictures of college students (happy, sad and neutral); no difference in attentional bias emerged between high and low self-esteem individuals. In Experiment 2, 30 high and low self-esteem individuals screened from Experiment 1 were tested with angry face pictures of stronger negative emotional arousal (happy, angry and neutral). Behavioral data again showed no attentional bias in either group; instead, low self-esteem individuals responded faster than high self-esteem individuals for both happy and angry faces. Electrophysiological data showed that, in low self-esteem individuals, the P1 peak was significantly larger for valid than for invalid cues under both emotional face conditions, whereas the N2pc peak was significantly larger for invalid than for valid cues. These data indicate that both negative and positive emotional information attract more attention in low self-esteem individuals, suggesting that low self-esteem individuals are more susceptible to the influence of emotional information.
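In dot-probe designs like the one above, attentional bias is conventionally indexed as mean reaction time on invalid-cue trials minus mean reaction time on valid-cue trials, with positive scores indicating vigilance toward the emotional face. A minimal sketch with invented reaction times (not the study's data):

```python
def bias_score(valid_rts, invalid_rts):
    """Attentional bias index: mean invalid-cue RT minus mean valid-cue RT.
    Positive values indicate attention drawn toward the emotional face."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(invalid_rts) - mean(valid_rts)

# Hypothetical reaction times (ms) for one participant, angry-face condition.
valid   = [480, 455, 470, 465]   # probe appears where the emotional face was
invalid = [510, 525, 498, 507]   # probe appears where the neutral face was

print(bias_score(valid, invalid))
```

A per-participant score like this would then be compared across the high and low self-esteem groups.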

  16. Computer-Assisted Face Processing Instruction Improves Emotion Recognition, Mentalizing, and Social Skills in Students with ASD.

    Science.gov (United States)

    Rice, Linda Marie; Wall, Carla Anne; Fogel, Adam; Shic, Frederick

    2015-07-01

    This study examined the extent to which a computer-based social skills intervention called FaceSay was associated with improvements in affect recognition, mentalizing, and social skills of school-aged children with Autism Spectrum Disorder (ASD). FaceSay offers students simulated practice with eye gaze, joint attention, and facial recognition skills. This randomized control trial included school-aged children meeting educational criteria for autism (N = 31). Results demonstrated that participants who received the intervention improved their affect recognition and mentalizing skills, as well as their social skills. These findings suggest that, by targeting face-processing skills, computer-based interventions may produce changes in broader cognitive and social-skills domains in a cost- and time-efficient manner.

  17. One Size Does Not Fit All: Face Emotion Processing Impairments in Semantic Dementia, Behavioural-Variant Frontotemporal Dementia and Alzheimer’s Disease Are Mediated by Distinct Cognitive Deficits

    Directory of Open Access Journals (Sweden)

    Laurie A. Miller

    2012-01-01

Patients with frontotemporal dementia (both behavioural variant [bvFTD] and semantic dementia [SD]) as well as those with Alzheimer's disease (AD) show deficits on tests of face emotion processing, yet the mechanisms underlying these deficits have rarely been explored. We compared groups of patients with bvFTD (n = 17), SD (n = 12) or AD (n = 20) to an age- and education-matched group of healthy control subjects (n = 36) on three face emotion processing tasks (Ekman 60, Emotion Matching and Emotion Selection) and found that all three patient groups were similarly impaired. Analyses of covariance employed to partial out the influences of language and perceptual impairments, which frequently co-occur in these patients, provided evidence of different underlying cognitive mechanisms. These analyses revealed that language impairments explained the original poor scores obtained by the SD patients on the Ekman 60 and Emotion Selection tasks, which involve verbal labels. Perceptual deficits contributed to Emotion Matching performance in the bvFTD and AD patients. Importantly, all groups remained impaired on one task or more following these analyses, denoting a primary emotion processing disturbance in these dementia syndromes. These findings highlight the multifactorial nature of emotion processing deficits in patients with dementia.

  18. Visual perception and processing in children with 22q11.2 deletion syndrome: associations with social cognition measures of face identity and emotion recognition.

    Science.gov (United States)

    McCabe, Kathryn L; Marlin, Stuart; Cooper, Gavin; Morris, Robin; Schall, Ulrich; Murphy, Declan G; Murphy, Kieran C; Campbell, Linda E

    2016-01-01

People with 22q11.2 deletion syndrome (22q11DS) have difficulty processing social information including facial identity and emotion processing. However, difficulties with visual and attentional processes may play a role in difficulties observed with these social cognitive skills. A cross-sectional study investigated visual perception and processing as well as facial processing abilities in a group of 49 children and adolescents with 22q11DS and 30 age- and socio-economic-status-matched healthy sibling controls using the Birmingham Object Recognition Battery and face processing sub-tests from the MRC face processing skills battery. The 22q11DS group demonstrated poorer performance on all measures of visual perception and processing, with greatest impairment on perceptual processes relating to form perception as well as object recognition and memory. In addition, form perception was found to make a significant and unique contribution to higher order social-perceptual processing (face identity) in the 22q11DS group. The findings indicate evidence for impaired visual perception and processing capabilities in 22q11DS. In turn, these were found to influence cognitive skills needed for social processes such as facial identity recognition in the children with 22q11DS.

  19. Human Emotional State and its Relevance for Military VR Training

    Science.gov (United States)

    2005-07-01

    experience of military VR simulation training: human emotional state. Anxiety is a common emotional state in military operating environments. Real world...anecdotally be seen when repeated exposure to stressful military training leads to a gradual decline in anxiety responses as the trainee learns to “manage...Ample evidence for such a habituation process can be seen in the fledgling VR/mental health field whereby phobic patients are able to effectively face

  20. Facial Areas and Emotional Information

    Science.gov (United States)

    Boucher, Jerry D.; Ekman, Paul

    1975-01-01

    Provides strong support for the view that there is no one area of the face which best reveals emotion, but that the value of the different facial areas in distinguishing emotions depends upon the emotion being judged. (Author)

  2. Unconsciously Triggered Emotional Conflict by Emotional Facial Expressions

    OpenAIRE

    Jun Jiang; Kira Bailey; Antao Chen; Qian Cui; Qinglin Zhang

    2013-01-01

    The present study investigated whether emotional conflict and emotional conflict adaptation could be triggered by unconscious emotional information as assessed in a backward-masked affective priming task. Participants were instructed to identify the valence of a face (e.g., happy or sad) preceded by a masked happy or sad face. The results of two experiments revealed the emotional conflict effect but no emotional conflict adaptation effect. This demonstrates that emotional conflict can be trig...

3. Study of cognitive tendencies in recognizing negative emotional faces

    Institute of Scientific and Technical Information of China (English)

    马娉娉; 王异芳; 魏萍

    2011-01-01

Objective: To study cognitive tendencies in recognizing three negative emotional faces: anger, fear and sadness. Methods: Drawing on the traditional flanker paradigm, the tasks were programmed in E-Prime; the accuracy rates and reaction times of 48 university students (19 males, 29 females), selected by simple random sampling, were analysed. Results: In all conditions, the accuracy rate for angry faces was significantly higher than for fearful and sad faces (96.60%, 94.50% and 94.40%, respectively; P<0.05). For reaction times, when the flanking stimuli were fearful faces or absent, responses to angry faces were significantly faster than to fearful and sad faces (in the two flanking conditions respectively: fear (801.27±140.99) ms, anger (723.94±151.37) ms, sadness (812.21±148.85) ms; and fear (788.17±148.81) ms, anger (694.28±111.99) ms, sadness (763.57±133.91) ms; P<0.05). When the flanking stimuli were angry or sad faces, responses to angry faces were significantly faster than to fearful faces (in the two flanking conditions respectively: fear (824.09±164.42) ms, anger (721.48±124.06) ms, sadness (760.50±131.12) ms; and fear (798.95±146.40) ms, anger (702.55±136.07) ms, sadness (750.48±133.86) ms; P<0.05). Conclusion: When different negative emotional faces appeared simultaneously, no flanker-inhibition effect was observed, but an anger-superiority effect and other tendencies emerged, providing some basis for the clinical diagnosis and treatment of psychiatric disorders involving emotional dysfunction.

4. Research on coal repeated mining technology in gob of 100105 working face

    Institute of Scientific and Technical Information of China (English)

    马建荣

    2014-01-01

This paper introduces the technology used at the 100105 fully mechanized working face for repeated mining through the gob (old mined-out area), covering five aspects: preparatory work before crossing the gob, ventilation management, roof management, equipment management, and the safe extraction process and measures for crossing the gob. Effective implementation of these measures achieved remarkable results.

  5. Unconsciously triggered emotional conflict by emotional facial expressions.

    Directory of Open Access Journals (Sweden)

    Jun Jiang

The present study investigated whether emotional conflict and emotional conflict adaptation could be triggered by unconscious emotional information as assessed in a backward-masked affective priming task. Participants were instructed to identify the valence of a face (e.g., happy or sad) preceded by a masked happy or sad face. The results of two experiments revealed the emotional conflict effect but no emotional conflict adaptation effect. This demonstrates that emotional conflict can be triggered by unconsciously presented emotional information, but participants may not adjust their subsequent performance trial-by-trial to reduce this conflict.

  6. Unconsciously triggered emotional conflict by emotional facial expressions.

    Science.gov (United States)

    Jiang, Jun; Bailey, Kira; Chen, Antao; Cui, Qian; Zhang, Qinglin

    2013-01-01

The present study investigated whether emotional conflict and emotional conflict adaptation could be triggered by unconscious emotional information as assessed in a backward-masked affective priming task. Participants were instructed to identify the valence of a face (e.g., happy or sad) preceded by a masked happy or sad face. The results of two experiments revealed the emotional conflict effect but no emotional conflict adaptation effect. This demonstrates that emotional conflict can be triggered by unconsciously presented emotional information, but participants may not adjust their subsequent performance trial-by-trial to reduce this conflict.
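The two indices tested here can be computed directly from trial-level reaction times: the conflict effect is mean RT on incongruent (prime-target mismatch) trials minus mean RT on congruent trials, and the adaptation (Gratton) effect is the reduction of that conflict effect on trials following an incongruent trial. A minimal sketch on invented data (the study found a conflict effect but no adaptation effect; the sequence below merely illustrates the computation):

```python
def mean(xs):
    return sum(xs) / len(xs)

def conflict_effects(trials):
    """trials: list of (congruent: bool, rt_ms) in presentation order.
    Returns (conflict, adaptation): the overall congruency effect and the
    reduction of that effect following incongruent trials. Assumes every
    previous/current congruency combination occurs at least once."""
    inc = [rt for c, rt in trials if not c]
    con = [rt for c, rt in trials if c]
    conflict = mean(inc) - mean(con)
    # Bucket each trial (from the second onward) by previous-trial congruency.
    after = {(True, True): [], (True, False): [], (False, True): [], (False, False): []}
    for (prev, _), (cur, rt) in zip(trials, trials[1:]):
        after[(prev, cur)].append(rt)
    eff_after_con = mean(after[(True, False)]) - mean(after[(True, True)])
    eff_after_inc = mean(after[(False, False)]) - mean(after[(False, True)])
    return conflict, eff_after_con - eff_after_inc

# Hypothetical trial sequence: (congruent?, RT in ms).
trials = [(True, 500), (False, 560), (False, 540), (True, 505),
          (False, 570), (True, 495), (True, 510), (False, 555)]
conflict, adaptation = conflict_effects(trials)
```

A positive `adaptation` value would indicate the trial-by-trial adjustment that the study failed to observe with masked primes.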

  7. Outcomes for Youth with Severe Emotional Disturbance: A Repeated Measures Longitudinal Study of a Wraparound Approach of Service Delivery in Systems of Care

    Science.gov (United States)

    Painter, Kirstin

    2012-01-01

Background: Systems of care is a family-centered, strengths-based service delivery model for treating youth experiencing a serious emotional disturbance. Wraparound is the most common method of service delivery adopted by states and communities as a way to adhere to systems of care philosophy. Objective: The purpose of this study was to evaluate…

  8. Influence of single and repeated cannabidiol administration on emotional behavior and markers of cell proliferation and neurogenesis in non-stressed mice.

    Science.gov (United States)

    Schiavon, Angélica Pupin; Bonato, Jéssica Mendes; Milani, Humberto; Guimarães, Francisco Silveira; Weffort de Oliveira, Rúbia Maria

    2016-01-04

Therapeutic effects of antidepressants and atypical antipsychotics may arise partially from their ability to stimulate neurogenesis. Cannabidiol (CBD), a phytocannabinoid present in Cannabis sativa, presents anxiolytic- and antipsychotic-like effects in preclinical and clinical settings. Anxiolytic-like effects of repeated CBD were shown in chronically stressed animals and these effects were parallel with increased hippocampal neurogenesis. However, antidepressant-like effects of repeated CBD administration in non-stressed animals have been scarcely reported. Here we investigated the behavioral consequences of single or repeated CBD administration in non-stressed animals. We also determined the effects of CBD on cell proliferation and neurogenesis in the dentate gyrus (DG) and subventricular zone (SVZ). A single CBD administration (3 mg/kg) resulted in an anxiolytic-like effect in mice submitted to the elevated plus maze (EPM). In the tail suspension test (TST), single or repeated CBD administration reduced immobility time, an effect that was comparable to those of imipramine (20 mg/kg). Moreover, repeated CBD administration at a lower dose (3 mg/kg) increased cell proliferation and neurogenesis, as seen by an increased number of Ki-67-, BrdU- and doublecortin (DCX)-positive cells in both the DG and SVZ. Despite its antidepressant-like effects in the TST, repeated CBD administration at a higher dose (30 mg/kg) decreased cell proliferation and neurogenesis in the hippocampal DG and SVZ. Our findings show a dissociation between the behavioral and proliferative effects of repeated CBD and suggest that the antidepressant-like effects of CBD may occur independently of adult neurogenesis in non-stressed Swiss mice.

  9. Dystonia: Emotional and Mental Health

    Science.gov (United States)

Although dystonia is a movement disorder that impacts ... emotion as well as muscle movement. For years, mental health professionals have recognized that coping with a chronic ...

  10. Developing an eBook-Integrated High-Fidelity Mobile App Prototype for Promoting Child Motor Skills and Taxonomically Assessing Children’s Emotional Responses Using Face and Sound Topology

    Science.gov (United States)

    Brown, William; Liu, Connie; John, Rita Marie; Ford, Phoebe

    2014-01-01

Developing gross and fine motor skills and expressing complex emotion is critical for child development. We introduce “StorySense”, an eBook-integrated mobile app prototype that can sense face and sound topologies and identify movement and expression to promote children’s motor skills and emotional development. Currently, most interactive eBooks on mobile devices only leverage “low-motor” interaction (i.e. tapping or swiping). Our app senses a greater breadth of motion (e.g. clapping, snapping, and face tracking), and dynamically alters the storyline according to physical responses in ways that encourage the performance of predetermined motor skills ideal for a child’s gross and fine motor development. In addition, our app can capture changes in facial topology, which can later be mapped using the Facial Action Coding System (FACS) for later interpretation of emotion. StorySense expands the human computer interaction vocabulary for mobile devices. Potential clinical applications include child development, physical therapy, and autism. PMID:25954336

  11. The Impact of Perceptual Load on Emotional Face Processing in Attentional Blink Paradigm%注意瞬脱范式中的知觉负载对情绪面孔加工的影响

    Institute of Scientific and Technical Information of China (English)

    叶榕; 余凤琼; 蒋玉宝; 汪凯

    2011-01-01

, a scrambled face colored in green) and a second target (T2, either a fearful or a neutral face) embedded in a rapid serial visual presentation (RSVP) of 18 scrambled faces. The Eriksen flanker task was used to distinguish the level of T1 perceptual load, in which the participants were asked to determine the orientation of the central arrow flanked by 4 congruent (low-load condition) or incongruent (high-load condition) arrows. All 20 items were presented for 67 ms on the black background of a computer screen and immediately followed by sequential items. The possible intervals between T1 and T2 were Lag-2 (SOA 134 ms), Lag-3 (SOA 201 ms), Lag-5 (SOA 335 ms) and Lag-8 (SOA 536 ms). 30 undergraduate and graduate students were instructed to make the identification response to the central arrow of the only green item (left or right) and the detection response to the other target face with intact features (present or absent). T1 load conditions were presented separately in two blocks and the order of these two blocks was counterbalanced across the participants. The T1 identification accuracy and the T2 detection accuracy in all conditions were recorded respectively. The analysis of behavioral data revealed that for low-load conditions, fearful faces were detected more often than neutral faces, thereby replicating previous reports of privileged emotion processing in the AB. However, this advantage was hampered significantly in the high-load condition, and the detection of neutral faces was not affected by the increased T1 load, suggesting that the privileged access of fearful faces to awareness is more sensitive to the currently available processing resources. Most importantly, the attenuated emotional impact in the AB was observed merely in the condition of high T1-load and short T1-T2 lag, indicating that the prioritization of emotion-laden stimulus processing is restricted by both the depletion of attentional resources induced by T1 perceptual load and the ineffective modulation of…
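The attentional blink is conventionally quantified as T2 detection accuracy conditional on correct T1 identification, computed separately for each T1-T2 lag. A minimal sketch on invented trial data (the lag numbers follow the description above; none of the values are from the study):

```python
from collections import defaultdict

def t2_given_t1_accuracy(trials):
    """trials: list of (lag, t1_correct: bool, t2_correct: bool).
    Returns {lag: proportion of T2 hits on trials where T1 was correct},
    the standard conditional measure of the attentional blink."""
    hits, counts = defaultdict(int), defaultdict(int)
    for lag, t1_ok, t2_ok in trials:
        if t1_ok:  # condition on correct T1 identification
            counts[lag] += 1
            hits[lag] += t2_ok
    return {lag: hits[lag] / counts[lag] for lag in counts}

# Hypothetical trials at Lag-2 (SOA 134 ms) and Lag-8 (SOA 536 ms).
trials = [(2, True, False), (2, True, True), (2, True, False), (2, False, True),
          (8, True, True), (8, True, True), (8, True, False), (8, True, True)]
acc = t2_given_t1_accuracy(trials)
```

Lower conditional accuracy at short lags than at long lags is the blink; comparing these curves between fearful and neutral T2 faces, and between T1 load conditions, yields the effects described above.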

  12. 面向社会稳定风险治理的社会情绪共同体研究%Study on Social Emotional Community Facing Social Stability Risk Governance

    Institute of Scientific and Technical Information of China (English)

    洪宇翔; 李从东

    2015-01-01

    Emotion plays a significant role in expressing the process of social stability risk and explaining its dynamic mechanism. Firstly, we define social emotions in the context of social stability risk and propose the concept of a social emotional community. Secondly, we identify realization paths for the emergence of a social emotional community based on an analysis of its knowledge level, mental-model level, and action level. Thirdly, we construct an online realization platform for the social emotional community using ICT. Finally, we analyze the implications of the social emotional community for social stability risk governance.%情绪具有表征社会稳定风险过程以及解释其动力机制的重要作用。在社会稳定风险情境下界定社会情绪,并提出社会情绪共同体的概念;通过对社会情绪共同体知识层面、心智模式层面、行动层面的分析,发现社会情绪共同体的有效实现路径;基于ICT技术,提出面向社会稳定风险治理的社会情绪共同体的“线上”实现形式,并分析其面向社会稳定风险治理的内涵。

  13. Face it, don't Facebook it: Impacts of Social Media Addiction on Mindfulness, Coping Strategies and the Consequence on Emotional Exhaustion.

    Science.gov (United States)

    Sriwilai, Kanokporn; Charoensukmongkol, Peerayuth

    2016-10-01

    Addiction to social media has now become a problem that societies are concerned with. The aim of the present study is to investigate the impacts that social media addiction has on mindfulness and choice of coping strategy, as well as to explore the consequences for emotional exhaustion. The survey data were collected from 211 employees in 13 enterprises in Thailand. Results from partial least squares structural equation modelling revealed that people who are highly addicted to social media tended to have lower mindfulness and tended to use emotion-focused coping to deal with stress. Lack of mindfulness and the use of emotion-focused coping strategies are also subsequently associated with higher emotional exhaustion. Copyright © 2015 John Wiley & Sons, Ltd.

  14. Sex-specific mediation effect of the right fusiform face area volume on the association between variants in repeat length of AVPR1A RS3 and altruistic behavior in healthy adults.

    Science.gov (United States)

    Wang, Junping; Qin, Wen; Liu, Feng; Liu, Bing; Zhou, Yuan; Jiang, Tianzi; Yu, Chunshui

    2016-07-01

    Microsatellite variants in the arginine vasopressin receptor 1A gene (AVPR1A) RS3 have been associated with variation in normal social behaviors and with autism spectrum disorders (ASDs) in a sex-specific manner. However, neural mechanisms underlying these associations remain largely unknown. We hypothesized that AVPR1A RS3 variants affect altruistic behavior by modulating the gray matter volume (GMV) of specific brain regions in a sex-specific manner. We investigated 278 young healthy adults using the Dictator Game to assess altruistic behavior. All subjects were genotyped, and the main effect of AVPR1A RS3 repeat polymorphisms and the interaction of genotype-by-sex on the GMV were assessed in a voxel-wise manner. We observed that male subjects with relatively short repeats allocated less money to others and exhibited a significantly smaller GMV in the right fusiform face area (FFA) compared with male long homozygotes. In male subjects, the GMV of the right FFA exhibited a significant positive correlation with altruistic behavior. A mixed mediation and moderation analysis further revealed both a significant mediation effect of the GMV of the right FFA on the association between AVPR1A RS3 repeat polymorphisms and allocation sums and a significant moderation effect of sex (only in males) on the mediation effect. Post hoc analysis showed that the GMV of the right FFA was significantly smaller in male subjects carrying allele 426 than in non-426 carriers. These results suggest that the GMV of the right FFA may be a potential mediator whereby the genetic variants in AVPR1A RS3 affect altruistic behavior in healthy male subjects. Hum Brain Mapp 37:2700-2709, 2016. © 2016 Wiley Periodicals, Inc.

  15. Emotions and trait emotional intelligence among ultra-endurance runners.

    Science.gov (United States)

    Lane, Andrew M; Wilson, Mathew

    2011-07-01

    The aim of this study was to investigate relationships between trait emotional intelligence and emotional state changes over the course of an ultra-endurance foot race covering a route of approximately 175 miles (282 km) and held in set stages over six days. A repeated measures field design that sought to maintain ecological validity was used. Trait emotional intelligence was defined as a relatively stable concept that should predict adaptive emotional states experienced over the duration of the race and therefore associate with pleasant emotions during a 6-stage endurance event. Thirty-four runners completed a self-report measure of trait emotional intelligence before the event started. Participants reported emotional states before and after each of the six races. Repeated measures ANOVA results showed significant variations in emotions over time and a main effect for trait emotional intelligence. Runners high in self-report trait emotional intelligence also reported higher pleasant and lower unpleasant emotions than runners low in trait emotional intelligence. Findings lend support to the notion that trait emotional intelligence associates with adaptive psychological states, suggesting that it may be a key individual difference that explains why some athletes respond to repeated bouts of hard exercise better than others. Future research should test the effectiveness of interventions designed to enhance trait emotional intelligence and examine the attendant impact on emotional responses to intense exercise during multi-stage events. Copyright © 2011. Published by Elsevier Ltd.

  16. Audiovisual emotional processing and neurocognitive functioning in patients with depression

    Directory of Open Access Journals (Sweden)

    Sophie Doose-Grünefeld

    2015-01-01

    Alterations in the processing of emotional stimuli (e.g. facial expressions, prosody, music) have repeatedly been reported in patients with major depression. Such impairments may result from the likewise prevalent executive deficits in these patients. However, studies investigating this relationship are rare. Moreover, most studies to date have only assessed impairments in unimodal emotional processing, whereas in real life, emotions are primarily conveyed through more than just one sensory channel. The current study therefore aimed to investigate multi-modal emotional processing in patients with depression and to assess the relationship between emotional and neurocognitive impairments. 41 patients suffering from major depression and 41 never-depressed healthy controls participated in an audiovisual (faces-sounds) emotional integration paradigm as well as a neurocognitive test battery. Our results showed that depressed patients were specifically impaired in the processing of positive auditory stimuli, as they rated faces significantly more fearful when presented with happy than with neutral sounds. Such an effect was absent in controls. Findings in emotional processing in patients did not correlate with BDI scores. Furthermore, neurocognitive findings revealed significant group differences for two of the tests. The effects found in audiovisual emotional processing, however, did not correlate with performance in the neurocognitive tests. In summary, our results underline the diversity of impairments accompanying depression and indicate that deficits found for unimodal emotional processing cannot trivially be generalized to deficits in a multi-modal setting. The mechanisms of impairment may therefore be far more complex than previously thought. Our findings furthermore contradict the assumption that emotional processing deficits in major depression are associated with impaired attention or inhibitory functioning.

  17. I can't keep your face and voice out of my head: neural correlates of an attentional bias toward nonverbal emotional cues.

    Science.gov (United States)

    Jacob, Heike; Brück, Carolin; Domin, Martin; Lotze, Martin; Wildgruber, Dirk

    2014-06-01

    Emotional information can be conveyed by verbal and nonverbal cues, with the latter often suggested to exert a greater influence in shaping our perceptions of others. The present functional magnetic resonance imaging study sought to explore attentional biases toward nonverbal signals by investigating the interaction of verbal and nonverbal cues. Results obtained in this study underline previous suggestions of a "nonverbal dominance" in emotion communication by evidencing implicit effects of nonverbal cues on emotion judgments even when attention is directed away from nonverbal signals and focused on verbal cues. Attentional biases toward nonverbal signals appeared to be reflected in increasing activation of the dorsolateral prefrontal cortex (DLPFC), assumed to reflect increasing difficulties in suppressing nonverbal cues during task conditions that required shifting attention away from nonverbal signals. Aside from the DLPFC, results suggest the right amygdala plays a role in attention control mechanisms related to the processing of emotional cues. Analyses conducted to determine the cerebral correlates of the individual ability to shift attention between verbal and nonverbal sources of information indicated that higher task-switching abilities seem to be associated with the up-regulation of right amygdala activation during explicit judgments of nonverbal cues, whereas difficulties in task-switching seem to be related to a down-regulation.

  18. Free-Labeling Facial Expressions and Emotional Situations in Children Aged 3-7 Years: Developmental Trajectory and a Face Inferiority Effect

    Science.gov (United States)

    Wang, Zhenhong; Lü, Wei; Zhang, Hui; Surina, Alyssa

    2014-01-01

    Chinese children (N = 185, aged 3-7 years) were assessed on their abilities to freely label facial expressions and emotional situations. Results indicated that the overall accuracy of free-labeling facial expressions increased relatively quickly in children aged 3-5 years, but slowed down in children aged 5-7 years. In contrast, the overall…

  19. How Do Typically Developing Deaf Children and Deaf Children with Autism Spectrum Disorder Use the Face When Comprehending Emotional Facial Expressions in British Sign Language?

    Science.gov (United States)

    Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John

    2014-01-01

    Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their…

  20. Emotional Side Effects

    Science.gov (United States)

    ... Treatment Most people face some degree of depression, anxiety, fear or distress when cancer becomes part of their ... who has cancer.

  1. Sleep Quality and Emotional Correlates in Taiwanese Coronary Artery Bypass Graft Patients 1 Week and 1 Month after Hospital Discharge: A Repeated Descriptive Correlational Study.

    Science.gov (United States)

    Yang, Pei-Lin; Huang, Guey-Shiun; Tsai, Chien-Sung; Lou, Meei-Fang

    2015-01-01

    Poor sleep quality is a common health problem for coronary artery bypass graft patients; however, few studies have evaluated sleep quality during the period immediately following hospital discharge. The aim of this study was to investigate changes in sleep quality and emotional correlates in coronary artery bypass graft patients in Taiwan at 1 week and 1 month after hospital discharge. We used a descriptive correlational design for this study. One week after discharge, 87 patients who had undergone coronary artery bypass surgery completed two structured questionnaires: the Pittsburgh Sleep Quality Index and the Hospital Anxiety and Depression Scale. Three weeks later (1 month after discharge) the patients completed the surveys again. Pearson correlations, t-tests, ANOVA and linear multiple regression analysis were used to analyze the data. A majority of the participants had poor sleep quality at 1 week (82.8%) and 1 month (66.7%) post-hospitalization, based on the global score of the Pittsburgh Sleep Quality Index. Despite poor sleep quality at both time-points, sleep quality at 1 month was significantly better than at 1 week post-hospitalization. Poorer sleep quality correlated with older age, poorer heart function, anxiety and depression. The majority of participants had normal levels of anxiety at 1 week (69.0%) and 1 month (88.5%) as measured by the Hospital Anxiety and Depression Scale. However, some level of depression was seen at 1 week (78.1%) and 1 month (59.7%). Depression was a significant predictor of sleep quality at 1 week; at 1 month after hospital discharge both anxiety and depression were significant predictors of sleep quality. Sleep quality, anxiety and depression all significantly improved 1 month after hospital discharge. However, more than half of the participants continued to have poor sleep quality and some level of depression. Health care personnel should be encouraged to assess sleep and emotional status in patients after coronary artery

  2. Recursive quantum repeater networks

    CERN Document Server

    Van Meter, Rodney; Horsman, Clare

    2011-01-01

    Internet-scale quantum repeater networks will be heterogeneous in physical technology, repeater functionality, and management. The classical control necessary to use the network will therefore face similar issues as Internet data transmission. Many scalability and management problems that arose during the development of the Internet might have been solved in a more uniform fashion, improving flexibility and reducing redundant engineering effort. Quantum repeater network development is currently at the stage where we risk similar duplication when separate systems are combined. We propose a unifying framework that can be used with all existing repeater designs. We introduce the notion of a Quantum Recursive Network Architecture, developed from the emerging classical concept of 'recursive networks', extending recursive mechanisms from a focus on data forwarding to a more general distributed computing request framework. Recursion abstracts independent transit networks as single relay nodes, unifies software layer...

  3. Maternal Posttraumatic Stress Symptoms and Infant Emotional Reactivity and Emotion Regulation

    Science.gov (United States)

    Enlow, Michelle Bosquet; Kitts, Robert L.; Blood, Emily; Bizarro, Andrea; Hofmeister, Michelle; Wright, Rosalind J.

    2011-01-01

    The current study examined associations between maternal posttraumatic stress disorder (PTSD) symptoms and infant emotional reactivity and emotion regulation during the first year of life in a primarily low-income, urban, ethnic/racial minority sample of 52 mother-infant dyads. Mothers completed questionnaires assessing their own trauma exposure history and current PTSD and depressive symptoms and their infants’ temperament when the infants were 6 months old. Dyads participated in the repeated Still-Face Paradigm (SFP-R) when the infants were 6 months old, and infant affective states were coded for each SFP-R episode. Mothers completed questionnaires assessing infant trauma exposure history and infant current emotional and behavioral symptoms when the infants were 13 months old. Maternal PTSD symptoms predicted infants’ emotion regulation at 6 months as assessed by (a) infant ability to recover from distress during the SFP-R and (b) maternal report of infant rate of recovery from distress/arousal in daily life. Maternal PTSD symptoms also predicted maternal report of infant externalizing, internalizing, and dysregulation symptoms at 13 months. Maternal PTSD was not associated with measures of infant emotional reactivity. Neither maternal depressive symptoms nor infant direct exposure to trauma accounted for the associations between maternal PTSD symptoms and infant outcomes. These findings suggest that maternal PTSD is associated with offspring emotion regulation difficulties as early as infancy. Such difficulties may contribute to increased risk of mental health problems among children of mothers with PTSD. PMID:21862136

  4. Learning about faces: effects of trustworthiness on affective evaluation.

    Science.gov (United States)

    Aguado, Luis; Román, Francisco J; Fernández-Cahill, María; Diéguez-Risco, Teresa; Romero-Ferreiro, Verónica

    2011-11-01

    The results of two studies on the relationship between evaluations of trustworthiness, valence and arousal of faces are reported. In Experiment 1, valence and trustworthiness judgments of faces were positively correlated, while arousal was negatively correlated with both trustworthiness and valence. In Experiment 2, learning about faces based on their emotional expression, and the extent to which this learning is influenced by perceived trustworthiness, was investigated. Neutral faces of different models differing in trustworthiness were repeatedly associated with happy or with angry expressions, and the participants were asked to categorize each neutral face as belonging to a "friend" or to an "enemy" based on these associations. Four pairing conditions were defined in terms of the congruency between trustworthiness level and expression: trustworthy-congruent, trustworthy-incongruent, untrustworthy-congruent and untrustworthy-incongruent. Categorization accuracy during the learning phase and face evaluation after learning were measured. During learning, participants learned to categorize trustworthy and untrustworthy faces as friends or enemies with similar efficiency, and thus no effects of congruency were found. In the evaluation phase, faces of enemies were rated as more negative and arousing than those of friends, showing that learning was effective in changing the affective value of the faces. However, faces of untrustworthy models were still judged on average more negative and arousing than those of trustworthy ones. In conclusion, although face trustworthiness did not influence the learning of associations between faces and positive or negative social information, it did have a significant influence on face evaluation that was manifest even after that learning.

  5. Impacto emocional en estudiantes de pedagogía ante eventos de maltrato en la práctica profesional / Emotional impact on pedagogy students faced with abuse during professional practice

    Directory of Open Access Journals (Sweden)

    Denisse Alejandra Jaramillo Sandoval

    2015-12-01

    RESUMEN El objetivo de este estudio fue comprender la experiencia emocional de los estudiantes de pedagogía al enfrentarse a eventos de maltrato por parte de los docentes del establecimiento, en sus prácticas profesionales. Desde el enfoque cualitativo y utilizando la técnica de incidentes críticos, se entrevistó a 12 estudiantes que experimentaron maltrato durante su práctica y se recabó 15 incidentes críticos. De esta muestra, se categorizaron 5 conductas de maltrato, de las cuales, la angustia, la rabia e inseguridad fueron las principales emociones que manifestaron los estudiantes. Para ellos, dicha experiencia tuvo un intenso impacto emocional, asociándose a altos nivel de estrés. Se discute por una parte que estos eventos provocaron un significativo cuestionamiento en su identidad profesional y, por otra parte, sobre la importancia de fortalecer las habilidades socioemocionales en la formación docente, de modo que se maneje adecuadamente los conflictos y conservar el bienestar psicológico. ABSTRACT The objective of this study was to understand the emotional experience of pedagogy students faced with abusive events caused by teachers of the institution during their professional practice. Using a qualitative approach and the critical incidents technique, we interviewed 12 students who experienced abuse during their practice and gathered 15 critical incidents. From this sample, 5 abusive behaviors were categorized, and anguish, anger, and insecurity were the main emotions expressed by the students. For them, the experience had an intense emotional impact, associated with high levels of stress. We discuss, on the one hand, how these events led students to significantly question their professional identity and, on the other hand, the importance of strengthening socio-emotional skills in teacher training, so that conflicts are managed properly and psychological well-being is maintained.

  6. Priming with threatening faces modulates the self-face advantage by enhancing the other-face processing rather than suppressing the self-face processing.

    Science.gov (United States)

    Guan, Lili; Qi, Mingming; Li, Haijiang; Hitchman, Glenn; Yang, Juan; Liu, Yijun

    2015-05-22

    Social emotional information influences self-processing in everyday activities, but few researchers have investigated this process. The current ERP study adopted a priming paradigm to investigate how socially threatening faces impact the self-face processing advantage. After being primed with emotional faces (happy, angry or neutral), participants judged whether the target face (self, friend, or stranger) was familiar or unfamiliar. Results showed an interaction effect between the prime face and the target face at posterior P3: after priming with happy and neutral faces, self-faces elicited larger P3 amplitudes than friend-faces and stranger-faces; however, after priming with angry faces, the P3 amplitudes did not differ significantly between self-faces and friend-faces. Moreover, the P3 amplitudes of self-faces did not differ between priming with angry and neutral faces, whereas the P3 amplitudes of both friend-faces and stranger-faces were enhanced after priming with angry faces compared to priming with neutral faces. We suggest that the self-face processing advantage (self vs. friend) can be weakened by priming with threatening faces, through enhancement of other-face processing rather than suppression of self-face processing, when angry primes are compared with neutral primes.

  7. Visual Search for Basic Emotional Expressions in Autism; Impaired Processing of Anger, Fear and Sadness, but a Typical Happy Face Advantage

    Science.gov (United States)

    Farran, Emily K.; Branson, Amanda; King, Ben J.

    2011-01-01

    Facial expression recognition was investigated in 20 males with high functioning autism (HFA) or Asperger syndrome (AS), compared to typically developing individuals matched for chronological age (TD CA group) and verbal and non-verbal ability (TD V/NV group). This was the first study to employ a visual search, "face in the crowd" paradigm with a…

  8. Human Emotion Recognition System

    Directory of Open Access Journals (Sweden)

    Dilbag Singh

    2012-08-01

    This paper discusses the application of feature extraction of facial expressions, in combination with a neural network, for the recognition of different facial emotions (happy, sad, angry, fearful, surprised, neutral, etc.). Humans are capable of producing thousands of facial actions during communication that vary in complexity, intensity, and meaning. This paper analyses the limitations of an existing system for emotion recognition based on brain activity. Using an existing simulator, I achieved 97 percent accurate results in a way that is simpler than emotion recognition from brain activity. The proposed system relies on the human face, which, like brain activity, reflects emotions. A neural network has been used for better results. The paper concludes by comparing the proposed system with an existing Human Emotion Recognition System.

  9. Attentional-shaping as a means to improve emotion perception deficits in schizophrenia.

    Science.gov (United States)

    Combs, Dennis R; Tosheva, Aneta; Penn, David L; Basso, Michael R; Wanner, Jill L; Laib, Kristen

    2008-10-01

    Inability to recognize emotional expressions of others (emotion perception) is one of the most common impairments observed among individuals with schizophrenia. Such deficits presumably contribute much to the social dysfunction characteristic of schizophrenia. This study examined the efficacy of a novel attentional-shaping intervention to improve emotion perception abilities. Sixty participants with schizophrenia were randomly assigned to one of three intervention conditions: 1) attentional-shaping, 2) contingent monetary reinforcement, or 3) repeated practice. Participants completed the Face Emotion Identification Test (FEIT) at pre-test, intervention, post-test, and one week follow-up. Participants also completed the Bell-Lysaker Emotion Recognition Test (BLERT) and the Social Behavior Scale at pre-test and follow-up to measure generalization. The results showed that the attentional-shaping condition had significantly higher scores on the FEIT at intervention, post-test, and follow-up compared to monetary reinforcement and repeated practice. Improvement was also found on the BLERT and a trend was found for improved social behaviors at one-week follow-up. Results will be discussed in terms of face scanning and attentional deficits present in schizophrenia and potential uses of this intervention in the remediation of emotion perception deficits.

  10. Social judgments from faces.

    Science.gov (United States)

    Todorov, Alexander; Mende-Siedlecki, Peter; Dotsch, Ron

    2013-06-01

    People make rapid and consequential social judgments from minimal (non-emotional) facial cues. There has been rapid progress in identifying the perceptual basis of these judgments using data-driven, computational models. In contrast, our understanding of the neural underpinnings of these judgments is rather limited. Meta-analyses of neuroimaging studies find a wide range of seemingly inconsistent responses in the amygdala that co-vary with social judgments from faces. Guided by computational models of social judgments, these responses can be accounted for by positing that the amygdala (and posterior face-selective regions) tracks face typicality. Atypical faces, whether positively or negatively evaluated, elicit stronger responses in the amygdala. We conclude with the promise of data-driven methods for modeling neural responses to social judgments from faces.

  11. Modeling the Experience of Emotion

    OpenAIRE

    Broekens, Joost

    2009-01-01

    Affective computing has proven to be a viable field of research comprised of a large number of multidisciplinary researchers resulting in work that is widely published. The majority of this work consists of computational models of emotion recognition, computational modeling of causal factors of emotion and emotion expression through rendered and robotic faces. A smaller part is concerned with modeling the effects of emotion, formal modeling of cognitive appraisal theory and models of emergent...

  12. Modeling Social Perception of Faces

    NARCIS (Netherlands)

    Todorov, A.T.; Oosterhof, N.N.

    2011-01-01

    The face is our primary source of visual information for identifying people and reading their emotional and mental states. With the exception of prosopagnosics (who are unable to recognize faces) and those suffering from such disorders of social cognition as autism, people are extremely adept at the

  13. Audiovisual integration of emotional signals from others’ social interactions.

    Directory of Open Access Journals (Sweden)

    Lukasz Piwek

    2015-05-01

    Audiovisual perception of emotions has typically been examined using displays of a solitary character (e.g. the face-voice and/or body-sound of one actor). However, in real life humans often face more complex multisensory social situations, involving more than one person. Here we ask whether the audiovisual facilitation in emotion recognition previously found in simpler social situations extends to more complex and ecological situations. Stimuli consisting of the biological motion and voices of two interacting agents were used in two experiments. In Experiment 1, participants were presented with visual, auditory, auditory filtered/noisy, and audiovisual congruent and incongruent clips. We asked participants to judge whether the two agents were interacting happily or angrily. In Experiment 2, another group of participants repeated the same task as in Experiment 1 while trying to ignore either the visual or the auditory information. The findings from both experiments indicate that when the reliability of the auditory cue was decreased, participants gave more weight to the visual cue in their emotional judgments. This in turn translated into increased emotion recognition accuracy for the multisensory condition. Our findings thus point to a common mechanism of multisensory integration of emotional signals irrespective of social stimulus complexity.
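    The reliability-based weighting described in this record is commonly modeled as maximum-likelihood cue integration, in which each cue is weighted by its inverse variance, so a noisier auditory cue shifts the combined estimate toward the visual cue. A minimal illustrative sketch (the function name and the numeric values are assumptions, not taken from the study):

    ```python
    # Maximum-likelihood (reliability-weighted) cue integration: each cue is
    # weighted by its inverse variance (its reliability).
    def combine_cues(visual_est, visual_var, auditory_est, auditory_var):
        """Return (combined estimate, weight given to the visual cue)."""
        w_visual = (1 / visual_var) / (1 / visual_var + 1 / auditory_var)
        combined = w_visual * visual_est + (1 - w_visual) * auditory_est
        return combined, w_visual

    # Equally reliable cues: the visual weight is 0.5.
    est_eq, w_eq = combine_cues(1.0, 1.0, 0.0, 1.0)        # -> (0.5, 0.5)
    # A noisier auditory cue (variance 4): the visual weight rises to 0.8.
    est_noisy, w_noisy = combine_cues(1.0, 1.0, 0.0, 4.0)  # -> (0.8, 0.8)
    ```

    Degrading the auditory cue (increasing its variance) mechanically increases the visual weight, mirroring the behavioral pattern reported above.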

  14. [Emotional intelligence and oscillatory responses on the emotional facial expressions].

    Science.gov (United States)

    Kniazev, G G; Mitrofanova, L G; Bocharov, A V

    2013-01-01

    Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18-30 years. Participants were instructed to evaluate the emotional expression (angry, happy or neutral) of each presented face on an analog scale ranging from -100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This showed up both in their subjective evaluation of the stimuli and in a stronger EEG theta synchronization at an earlier (between 100 and 500 ms after face presentation) processing stage. Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500-870 ms), event-related theta synchronization in high emotional intelligence subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon the presentation of angry faces. This suggests the existence of a mechanism that can selectively increase positive emotions and reduce negative emotions.

  15. Touch communicates distinct emotions.

    Science.gov (United States)

    Hertenstein, Matthew J; Keltner, Dacher; App, Betsy; Bulleit, Brittany A; Jaskolka, Ariane R

    2006-08-01

    The study of emotional signaling has focused almost exclusively on the face and voice. In 2 studies, the authors investigated whether people can identify emotions from the experience of being touched by a stranger on the arm (without seeing the touch). In the 3rd study, they investigated whether observers can identify emotions from watching someone being touched on the arm. Two kinds of evidence suggest that humans can communicate numerous emotions with touch. First, participants in the United States (Study 1) and Spain (Study 2) could decode anger, fear, disgust, love, gratitude, and sympathy via touch at much-better-than-chance levels. Second, fine-grained coding documented specific touch behaviors associated with different emotions. In Study 3, the authors provide evidence that participants can accurately decode distinct emotions by merely watching others communicate via touch. The findings are discussed in terms of their contributions to affective science and the evolution of altruism and cooperation. (c) 2006 APA, all rights reserved

  16. Face Processing: Models For Recognition

    Science.gov (United States)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expression to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.
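    The eigenface-style model alluded to above can be sketched in a few lines of linear algebra: vectorize face images, project them onto the top principal components of the training set (the "eigenfaces"), and match by Euclidean distance in that subspace. The sketch below is a minimal illustration under assumed data; the random "gallery" and all sizes are synthetic stand-ins, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_faces, h, w = 12, 16, 16                 # tiny synthetic gallery of 16x16 "images"
gallery = rng.normal(size=(n_faces, h * w))

# 1. Centre the data; the right singular vectors of the centred matrix are
#    the eigenvectors of the covariance matrix, i.e. the "eigenfaces".
mean_face = gallery.mean(axis=0)
centred = gallery - mean_face
_, _, vt = np.linalg.svd(centred, full_matrices=False)
k = 8                                      # keep the top-k components
eigenfaces = vt[:k]                        # shape (k, h*w)

# 2. Represent every gallery face by its k projection weights.
gallery_weights = centred @ eigenfaces.T   # shape (n_faces, k)

def identify(probe):
    """Return the index of the gallery face nearest to `probe`
    in eigenface space (Euclidean distance)."""
    w_probe = (probe - mean_face) @ eigenfaces.T
    dists = np.linalg.norm(gallery_weights - w_probe, axis=1)
    return int(np.argmin(dists))

# A lightly corrupted copy of gallery face 3 should still match face 3.
probe = gallery[3] + 0.05 * rng.normal(size=h * w)
print(identify(probe))
```

Because the projection discards all but the leading components, small perturbations (expression, noise) move the probe only slightly in weight space, which is what makes nearest-neighbour matching there robust.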

  17. Sleep Deprivation Impairs the Accurate Recognition of Human Emotions

    Science.gov (United States)

    van der Helm, Els; Gujar, Ninad; Walker, Matthew P.

    2010-01-01

    Study Objectives: Investigate the impact of sleep deprivation on the ability to recognize the intensity of human facial emotions. Design: Randomized total sleep-deprivation or sleep-rested conditions, involving between-group and within-group repeated measures analysis. Setting: Experimental laboratory study. Participants: Thirty-seven healthy participants (21 females) aged 18–25 y were randomly assigned to the sleep control (SC: n = 17) or total sleep deprivation (TSD: n = 20) group. Interventions: Participants performed an emotional face recognition task, in which they evaluated 3 different affective face categories: Sad, Happy, and Angry, each ranging in a gradient from neutral to increasingly emotional. In the TSD group, the task was performed once under conditions of sleep deprivation, and twice under sleep-rested conditions following different durations of sleep recovery. In the SC group, the task was performed twice under sleep-rested conditions, controlling for repeatability. Measurements and Results: In the TSD group, when sleep-deprived, there was a marked and significant blunting in the recognition of Angry and Happy affective expressions in the moderate (but not extreme) emotional intensity range; differences that were most reliable and significant in female participants. No change in the recognition of Sad expressions was observed. These recognition deficits were, however, ameliorated following one night of recovery sleep. No changes in task performance were observed in the SC group. Conclusions: Sleep deprivation selectively impairs the accurate judgment of human facial emotions, especially threat-relevant (Anger) and reward-relevant (Happy) categories, an effect observed most significantly in females. Such findings suggest that sleep loss impairs discrete affective neural systems, disrupting the identification of salient affective social cues. Citation: van der Helm E; Gujar N; Walker MP. Sleep deprivation impairs the accurate recognition of human emotions.

  18. Deployment Repeatability

    Science.gov (United States)

    2016-04-01

    controlled to great precision, but in a CubeSat, there may be no attitude determination at all. Such a CubeSat might treat sun angle and tumbling rates as...could be sensitive to small differences in motor controller timing. In these cases, the analyst might choose to model the entire deployment path, with...knowledge of the material damage model or motor controller timing precision. On the other hand, if many repeated and environmentally representative

  19. Social Psychological Face Perception: Why Appearance Matters

    Science.gov (United States)

    Zebrowitz, Leslie A.; Montepare, Joann M.

    2009-01-01

    We form first impressions from faces despite warnings not to do so. Moreover, there is considerable agreement in our impressions, which carry significant social outcomes. Appearance matters because some facial qualities are so useful in guiding adaptive behavior that even a trace of those qualities can create an impression. Specifically, the qualities revealed by facial cues that characterize low fitness, babies, emotion, and identity are overgeneralized to people whose facial appearance resembles the unfit (anomalous face overgeneralization), babies (babyface overgeneralization), a particular emotion (emotion face overgeneralization), or a particular identity (familiar face overgeneralization). We review studies that support the overgeneralization hypotheses and recommend research that incorporates additional tenets of the ecological theory from which these hypotheses are derived: the contribution of dynamic and multi-modal stimulus information to face perception; bidirectional relationships between behavior and face perception; perceptual learning mechanisms and social goals that sensitize perceivers to particular information in faces. PMID:20107613

  20. Attention Bias to Negative Emotional Faces in Clinical Depression

    Institute of Scientific and Technical Information of China (English)

    许媛美; 盛利; 张英辉; 何小琼; 赵正前; 王文娟; 翟长平

    2016-01-01

    Objective: To explore the cognitive features of attention bias toward negative emotional information in clinical depression, based on behavioral performance in a cue-target paradigm. Methods: Patients with clinical depression meeting the international psychiatric diagnosis and classification standard (ICD-10) were selected, together with a 20-case control group matched to the depression group on gender, age, and years of education. Using the Chinese facial affective picture system established by the Institute of Psychology, Chinese Academy of Sciences, as stimulus material, a cue-target paradigm was employed to compare mean reaction time and accuracy between the two groups under different cue conditions. Results: Compared with controls, the patients' total mean reaction time was longer (t=-5.579, P<0.01). For the control group, the difference between invalid-cue and valid-cue reaction times was positive, whereas for the depression group it was negative. Both groups were more accurate on valid trials than on invalid trials, and on neutral trials than on negative trials (t=8.353, 2.994, 7.363, 4.499; P<0.01). Conclusion: ① Compared with controls, patients with clinical depression process the cue-target task more slowly, consistent with the retardation of thinking and hypobulia characteristic of depression. ② Whereas healthy participants show a cuing effect for negative emotional pictures, depressed patients show inhibition of return, suggesting that patients with clinical depression tend to be captured by negative information.

  1. Face pain

    Science.gov (United States)

    ... begin in other places in the body. Abscessed tooth (ongoing throbbing pain on one side of the lower face that ... face, and aggravated by eating. Call a dentist. Pain is persistent, ... by other unexplained symptoms. Call your primary provider.

  2. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia

    OpenAIRE

    Roberta eDaini; Chiara Maddalena Comparetti; Paola eRicciardelli

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP) and basic emotional expressions.

  3. Masked emotional priming beyond global valence activations.

    Science.gov (United States)

    Rohr, Michaela; Degner, Juliane; Wentura, Dirk

    2012-01-01

    An immense body of research demonstrates that emotional facial expressions can be processed unconsciously. However, it has been assumed that such processing takes place solely on a global valence-based level, allowing individuals to disentangle positive from negative emotions but not the specific emotion. In three studies, we investigated the specificity of emotion processing under conditions of limited awareness using a modified variant of an affective priming task. Faces with happy, angry, sad, fearful, and neutral expressions were presented as masked primes for 33 ms (Study 1) or 14 ms (Studies 2 and 3) followed by emotional target faces (Studies 1 and 2) or emotional adjectives (Study 3). Participants' task was to categorise the target emotion. In all three studies, discrimination of targets was significantly affected by the emotional primes beyond a simple positive versus negative distinction. Results indicate that specific aspects of emotions might be automatically disentangled in addition to valence, even under conditions of subjective unawareness.

  4. A new look at emotion perception: Concepts speed and shape facial emotion recognition.

    Science.gov (United States)

    Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil

    2015-10-01

    Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels-which correspond to discrete emotion concepts-affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia-who have difficulty labeling their own emotions-struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology.
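    The signal-detection analysis mentioned above typically separates recognition sensitivity from response bias via d′ = z(hit rate) − z(false-alarm rate). The sketch below is a generic, hedged illustration of that measure, not the authors' exact analysis; the trial counts are invented, and the log-linear correction is one common convention among several.

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' = z(hit rate) - z(false-alarm rate).
    A log-linear correction (+0.5 to each count, +1 to each total)
    keeps rates away from 0 and 1, where z would be infinite."""
    hr = (hits + 0.5) / (hits + misses + 1)
    far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf   # inverse of the standard normal CDF
    return z(hr) - z(far)

# Example: 40 "emotion present" trials and 40 "neutral" trials.
print(d_prime(hits=32, misses=8, false_alarms=10, correct_rejections=30))
```

Higher d′ means better discrimination of emotional from neutral faces regardless of how liberally a participant says "emotion", which is why such studies report it alongside raw accuracy.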

  5. Emotional intelligence and emotions associated with optimal and dysfunctional athletic performance.

    Science.gov (United States)

    Lane, Andrew M; Devonport, Tracey J; Soos, Istvan; Karsai, Istvan; Leibinger, Eva; Hamar, Pal

    2010-01-01

    This study investigated relationships between self-report measures of emotional intelligence and memories of pre-competitive emotions before optimal and dysfunctional athletic performance. Participant-athletes (n = 284) completed a self-report measure of emotional intelligence and two measures of pre-competitive emotions: a) emotions experienced before an optimal performance, and b) emotions experienced before a dysfunctional performance. Consistent with theoretical predictions, repeated MANOVA results demonstrated pleasant emotions associated with optimal performance and unpleasant emotions associated with dysfunctional performance. Emotional intelligence correlated with pleasant emotions in both performances, with individuals reporting low scores on the self-report emotional intelligence scale appearing to experience intense unpleasant emotions before dysfunctional performance. We suggest that future research should investigate relationships between emotional intelligence and emotion-regulation strategies used by athletes. Key points: Athletes reporting high scores of self-report emotional intelligence tend to experience pleasant emotions. Optimal performance is associated with pleasant emotions, and dysfunctional performance is associated with unpleasant emotions. Emotional intelligence might help athletes recognize which emotional states help performance.

  6. Psychophysics of emotion: the QUEST for emotional attention.

    Science.gov (United States)

    Roesch, Etienne B; Sander, David; Mumenthaler, Christian; Kerzel, Dirk; Scherer, Klaus R

    2010-03-24

    To investigate the mechanisms involved in automatic processing of facial expressions, we used the QUEST procedure to measure the display durations needed to make a gender decision on emotional faces portraying fearful, happy, or neutral facial expressions. In line with predictions of appraisal theories of emotion, our results showed greater processing priority of emotional stimuli regardless of their valence. Whereas all experimental conditions led to an averaged threshold of about 50 ms, fearful and happy facial expressions led to significantly less variability in the responses than neutral faces. Results suggest that attention may have been automatically drawn by the emotion portrayed by face targets, yielding more informative perceptions and less variable responses. The temporal resolution of the perceptual system (expressed by the thresholds) and the processing priority of the stimuli (expressed by the variability in the responses) may influence subjective and objective measures of awareness, respectively.

  7. (How) do medical students regulate their emotions?

    Science.gov (United States)

    Doulougeri, Karolina; Panagopoulou, Efharis; Montgomery, Anthony

    2016-12-12

    Medical training can be a challenging and emotionally intense period for medical students. However, the emotions experienced by medical students in the face of challenging situations, and the emotion regulation strategies they use, remain relatively unexplored. The aim of the present study was to explore the emotions elicited by memorable incidents reported by medical students and the associated emotion regulation strategies. Peer interviewing was used to collect medical students' memorable incidents. Medical students at both the preclinical and clinical stage of medical school were eligible to participate. In total, 104 medical students provided memorable incidents; only the 54 narratives (from 47 clinical and 7 preclinical students) that included references to emotions and emotion regulation were analyzed further. Forty-seven of the 54 incidents were negative and associated with negative emotions. The most frequently mentioned emotion was shock and surprise, followed by feelings of embarrassment, sadness, anger, and tension or anxiety. The most frequent reaction was inaction, often associated with emotion regulation strategies such as distraction, focusing on a task, suppression of emotions, and reappraisal. When students witnessed mistreatment or disrespect exhibited towards patients, the regulation strategy used involved focusing on and comforting the patient. The present study sheds light on the strategies medical students use to deal with intense negative emotions. The vast majority reported inaction in the face of a challenging situation and the use of more subtle strategies to deal with the emotional impact of the incident.

  8. Facial age affects emotional expression decoding

    OpenAIRE

    2014-01-01

    Facial expressions convey important information on emotional states of our interaction partners. However, in interactions between younger and older adults, there is evidence for a reduced ability to accurately decode emotional facial expressions. Previous studies have often followed up this phenomenon by examining the effect of the observers' age. However, decoding emotional faces is also likely to be influenced by stimulus features, and age-related changes in the face such as wrinkles and folds may render facial expressions of older adults harder to decode.

  10. Emotion as morphofunctionality.

    Science.gov (United States)

    Pérez, Carlos Herrera; Sanz, Ricardo

    2013-01-01

    We argue for a morphofunctional approach to emotion modeling that can also aid the design of adaptive embodied systems. By morphofunctionality we target the online change in both structure and function of a system, and relate it to the notion of physiology and emotion in animals. Besides the biological intuition that emotions serve the function of preparing the body, we investigate the control requirements that any morphofunctional autonomous system must face. We argue that changes in morphology modify the dynamics of the system, thus forming a variable structure system (VSS). We introduce some of the techniques of control theory to deal with VSSs and derive a twofold hypothesis: first, the loose coupling between two control systems, in charge of action and action readiness, respectively; second, the formation of patterned metacontrol. Emotional phenomena can be seen as emergent from this control setup.

  11. Learning faces: similar comparator faces do not improve performance.

    Directory of Open Access Journals (Sweden)

    Scott P Jones

    Full Text Available Recent evidence indicates that comparison of two similar faces can aid subsequent discrimination between them. However, the fact that discrimination between two faces is facilitated by comparing them directly does not demonstrate that comparison produces a general improvement in the processing of faces. It remains an open question whether the opportunity to compare a "target" face to similar faces can facilitate the discrimination of the exposed target face from other nonexposed faces. In Experiment 1, selection of a target face from an array of novel foils was not facilitated by intermixed exposure to the target and comparators of the same sex. Experiment 2 also found no advantage for similar comparators (morphed towards the target over unmorphed same sex comparators, or over repeated target exposure alone. But all repeated exposure conditions produced better performance than a single brief presentation of the target. Experiment 3 again demonstrated that repeated exposure produced equivalent learning in same sex and different sex comparator conditions, and also showed that increasing the number of same sex or different sex comparators failed to improve identification. In all three experiments, exposure to a target alongside similar comparators failed to support selection of the target from novel test stimuli to a greater degree than exposure alongside dissimilar comparators or repeated target exposure alone. The current results suggest that the facilitatory effects of comparison during exposure may be limited to improving discrimination between exposed stimuli, and thus our results do not support the idea that providing the opportunity for comparison is a practical means for improving face identification.

  12. Facial age affects emotional expression decoding

    Directory of Open Access Journals (Sweden)

    Mara eFölster

    2014-02-01

    Full Text Available Facial expressions convey important information on emotional states of our interaction partners. However, in interactions between younger and older adults, there is evidence for a reduced ability to accurately decode emotional facial expressions. Previous studies have often followed up this phenomenon by examining the effect of the observers’ age. However, decoding emotional faces is also likely to be influenced by stimulus features, and age-related changes in the face such as wrinkles and folds may render facial expressions of older adults harder to decode. In this paper, we review theoretical frameworks and empirical findings on age effects on decoding emotional expressions, with an emphasis on age-of-face effects. We conclude that the age of the face plays an important role for facial expression decoding. Lower expressivity, age-related changes in the face, less elaborated emotion schemas for older faces, negative attitudes toward older adults, and different visual scan patterns and neural processing of older than younger faces may lower decoding accuracy for older faces. Furthermore, age-related stereotypes and age-related changes in the face may bias the attribution of specific emotions such as sadness to older faces.

  13. Facial age affects emotional expression decoding.

    Science.gov (United States)

    Fölster, Mara; Hess, Ursula; Werheid, Katja

    2014-01-01

    Facial expressions convey important information on emotional states of our interaction partners. However, in interactions between younger and older adults, there is evidence for a reduced ability to accurately decode emotional facial expressions. Previous studies have often followed up this phenomenon by examining the effect of the observers' age. However, decoding emotional faces is also likely to be influenced by stimulus features, and age-related changes in the face such as wrinkles and folds may render facial expressions of older adults harder to decode. In this paper, we review theoretical frameworks and empirical findings on age effects on decoding emotional expressions, with an emphasis on age-of-face effects. We conclude that the age of the face plays an important role for facial expression decoding. Lower expressivity, age-related changes in the face, less elaborated emotion schemas for older faces, negative attitudes toward older adults, and different visual scan patterns and neural processing of older than younger faces may lower decoding accuracy for older faces. Furthermore, age-related stereotypes and age-related changes in the face may bias the attribution of specific emotions such as sadness to older faces.

  14. Impact of Emotion on Consciousness

    DEFF Research Database (Denmark)

    Thomsen, Kristine Rømer; Lou, Hans Olav Christensen; Jønsson, Morten

    2011-01-01

    showed that participants were more confident and accurate when consciously seeing happy versus sad/neutral faces and words. When stimuli were presented subliminally, we found no effect of emotion. To investigate the neural basis of this impact of emotion, we recorded local field potentials (LFPs......) directly in the ACC in a chronic pain patient. Behavioural findings were replicated: the patient was more confident and accurate when (consciously) seeing happy versus sad faces, while no effect was seen in subliminal trials. Mirroring behavioural findings, we found significant differences in the LFPs...... after around 500 ms (lasting 30 ms) in conscious trials between happy and sad faces, while no effect was found in subliminal trials. We thus demonstrate a striking impact of emotion on conscious experience, with positive emotional stimuli enhancing conscious reportability. In line with previous studies...

  15. Recognition of emotional expressions in blended faces and gender discrimination by children with autism

    Institute of Scientific and Technical Information of China (English)

    闫瑾; 姜志梅; 郭岚敏; 吕洋; 孙奇峰; 李兴洲; 王立苹

    2012-01-01

    [Objective] To test the ability of children with autism to recognize emotional expressions in blended faces and to discriminate gender from eyes and mouths. [Methods] Thirty-two male children with autism and thirty-two typically developing children matched on developmental age and gender were selected. They were tested with the Emotional Expressions Recognition Software System developed in this research, which took recognition accuracy rate and response time in different presentation manners as analysis indexes. [Results] 1) The accuracy rate for emotional expressions was significantly lower in children with autism than in typically developing children [(58.0±15.6)% vs (78.4±13.5)%, t=-5.4, P=0.000], and the response time was longer [(9948.3±3116.2) ms vs (5617.0±1362.9) ms, t=4.7, P=0.000]. 2) The accuracy rate for gender discrimination was significantly lower in children with autism than in typically developing children [eyes: (76.7±11.5)% vs (86.6±10.9)%; mouths: (66.2±12.8)% vs (73.1±10.7)%], and the response time was longer [eyes: (4138.7±542.0) ms vs (2721.9±636.6) ms; mouths: (3807.8±710.1) ms vs (2836.5±619.9) ms]. [Conclusions] Children with autism are inclined to attend to the lower face when making judgments about emotional expressions; they can use information from the eyes for gender discrimination, and do not appear to be superior to typically developing children at using mouth information to process gender information.

  16. Generational Differences of Emotional Expression

    Institute of Scientific and Technical Information of China (English)

    李学勇

    2014-01-01

    As a kind of subjective psychological activity, emotion can only be known and perceived through some expressive form. Because that form varies with the person expressing it, differences in emotional expression appear not only among individuals but also between generations. The old conceal their emotions inside, the young express their emotions boldly, and the middle-aged are rational and restrained in their expression. Facing and understanding such differences is the premise and foundation for building a harmonious relationship between generations.

  17. Faced with a dilemma

    DEFF Research Database (Denmark)

    Christensen, Anne Vinggaard; Christiansen, Anne Hjøllund; Petersson, Birgit

    2013-01-01

    's legal right to choose TOP and considerations about the foetus' right to live were suppressed. Midwives experienced a dilemma when faced with aborted foetuses that looked like newborns and when aborted foetuses showed signs of life after a termination. Furthermore, they were critical of how physicians...... counsel women/couples after prenatal diagnosis. CONCLUSIONS: The midwives' practice in relation to late TOP was characterised by an acknowledgement of the growing ethical status of the foetus and the emotional reactions of the women/couples going through late TOP. Other professions as well as structural...

  18. Emotional Intelligence (EI): A Therapy for Higher Education Students

    Science.gov (United States)

    Machera, Robert P.; Machera, Precious C.

    2017-01-01

    This study investigates the need to design and develop emotional intelligence curriculum for students in higher education. Emotional intelligence curriculum may be used as a therapy that provides skills to manage high emotions faced by generation "Y", on a day-to-day basis. Generation "Y" is emotionally challenged with: drug…

  19. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    Science.gov (United States)

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typical development individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  20. Behavioural dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia

    Directory of Open Access Journals (Sweden)

    Roberta eDaini

    2014-12-01

    Full Text Available Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP) and basic emotional expressions. To this end, we carried out a behavioural study in which we compared the performance of 6 CP individuals to that of typical development individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs’ impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expressions (task 1). Interestingly and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  1. Binocular rivalry: a window into emotional processing in aging.

    Science.gov (United States)

    Bannerman, Rachel L; Regener, Paula; Sahraie, Arash

    2011-06-01

    Previous binocular rivalry studies with younger adults have shown that emotional stimuli dominate perception over neutral stimuli. Here we investigated the effects of age on patterns of emotional dominance during binocular rivalry. Participants performed a face/house rivalry task where the emotion of the face (happy, angry, neutral) and orientation (upright, inverted) of the face and house stimuli were varied systematically. Age differences were found with younger adults showing a general emotionality effect (happy and angry faces were more dominant than neutral faces) and older adults showing inhibition of anger (neutral faces were more dominant than angry faces) and positivity effects (happy faces were more dominant than both angry and neutral faces). Age differences in dominance patterns were reflected by slower rivalry rates for both happy and angry compared to neutral face/house pairs in younger adults, and slower rivalry rates for happy compared to both angry and neutral face/house pairs in older adults. Importantly, these patterns of emotional dominance and slower rivalry rates for emotional-face/house pairs disappeared when the stimuli were inverted. This suggests that emotional valence, and not low-level image features, were responsible for the emotional bias in both age groups. Given that binocular rivalry has a limited role for voluntary control, the findings imply that anger suppression and positivity effects in older adults may extend to more automatic tasks.

  2. About Face

    Medline Plus

    Full Text Available

  3. About Face

    Medline Plus

    Full Text Available

  4. Face Forward

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Last November, surgeons in France successfully performed the world's first face transplant surgery. Ten days later, Chen Huanran in Beijing began soliciting patients who were ready to accept a face transplant, searching for China's first such patient through an advertisement on his website and other channels. Chen, chief orthopedic surgeon at the Plastic Surgery Hospital under the Chinese Academy of Medical Sciences, has conducted more than 300 transsexual operations and was considered one of the top com...

  5. Pitching Emotions: The Interpersonal Effects of Emotions in Professional Baseball

    Directory of Open Access Journals (Sweden)

    Arik Cheshin

    2016-02-01

    Full Text Available Sports games are inherently emotional situations, but surprisingly little is known about the social consequences of these emotions. We examined the interpersonal effects of emotional expressions in professional baseball. Specifically, we investigated whether pitchers' facial displays influence how pitches are assessed and responded to. Using footage from MLB World Series finals, we isolated incidents where the pitcher's face was visible before a pitch. A pre-study indicated that participants consistently perceived anger, happiness, and worry in pitchers' facial displays. An independent sample then predicted pitch characteristics and batter responses based on the same perceived emotional displays. Participants expected pitchers perceived as happy to throw more accurate balls, pitchers perceived as angry to throw faster and more difficult balls, and pitchers perceived as worried to throw slower and less accurate balls. Batters were expected to approach (swing) when faced with a pitcher perceived as happy and to avoid (no swing) when faced with a pitcher perceived as worried. Whereas previous research focused on using emotional expressions as information regarding past and current situations, our work suggests that people also use perceived emotional expressions to predict future behavior. Our results attest to the impact perceived emotional expressions can have on professional sports.

  6. Pitching Emotions: The Interpersonal Effects of Emotions in Professional Baseball.

    Science.gov (United States)

    Cheshin, Arik; Heerdink, Marc W; Kossakowski, Jolanda J; Van Kleef, Gerben A

    2016-01-01

    Sports games are inherently emotional situations, but surprisingly little is known about the social consequences of these emotions. We examined the interpersonal effects of emotional expressions in professional baseball. Specifically, we investigated whether pitchers' facial displays influence how pitches are assessed and responded to. Using footage from the Major League Baseball World Series finals, we isolated incidents where the pitcher's face was visible before a pitch. A pre-study indicated that participants consistently perceived anger, happiness, and worry in pitchers' facial displays. An independent sample then predicted pitch characteristics and batter responses based on the same perceived emotional displays. Participants expected pitchers perceived as happy to throw more accurate balls, pitchers perceived as angry to throw faster and more difficult balls, and pitchers perceived as worried to throw slower and less accurate balls. Batters were expected to approach (swing) when faced with a pitcher perceived as happy and to avoid (no swing) when faced with a pitcher perceived as worried. Whereas previous research focused on using emotional expressions as information regarding past and current situations, our work suggests that people also use perceived emotional expressions to predict future behavior. Our results attest to the impact perceived emotional expressions can have on professional sports.

  7. Selective Attention to Emotional Stimuli: What IQ and Openness Do, and Emotional Intelligence Does Not

    Science.gov (United States)

    Fiori, Marina; Antonakis, John

    2012-01-01

    We examined how general intelligence, personality, and emotional intelligence--measured as an ability using the MSCEIT--predicted performance on a selective-attention task requiring participants to ignore distracting emotion information. We used a visual prime in which participants saw a pair of faces depicting emotions; their task was to focus on…

  8. Social and Emotional Issues of Living with OI

    Science.gov (United States)

    ... Parents are faced with many emotional issues and decisions. An unexpected diagnosis affects all members of the family in terms of their emotions, everyday activities, career choices and finances. Life with OI is uncertain ...

  9. Effects of unconscious processing on implicit memory for fearful faces.

    Directory of Open Access Journals (Sweden)

    Jiongjiong Yang

    Full Text Available Emotional stimuli can be processed even when participants perceive them without conscious awareness, but the extent to which unconsciously processed emotional stimuli influence implicit memory after short and long delays is not fully understood. We addressed this issue by measuring a subliminal affective priming effect in Experiment 1 and a long-term priming effect in Experiment 2. In Experiment 1, a flashed fearful or neutral face masked by a scrambled face was presented three times; then a target face (either fearful or neutral) was presented and participants were asked to make a fearful/neutral judgment. We found that, relative to a neutral prime face (neutral-fear face), a fearful prime face speeded up participants' reaction to a fearful target (fear-fear face) when they were not aware of the masked prime face. This response pattern did not apply to the neutral target. In Experiment 2, participants were first presented with masked faces six times during encoding. Three minutes later, they were asked to make a fearful/neutral judgment for the same face with a congruent expression, the same face with an incongruent expression, or a new face. Participants showed a significant priming effect for the fearful faces but not for the neutral faces, regardless of their awareness of the masked faces during encoding. These results provided evidence that unconsciously processed stimuli can enhance emotional memory after both short and long delays. This indicates that emotion can enhance memory processing whether the stimuli are encoded consciously or unconsciously.

  10. Facial resemblance to emotions: group differences, impression effects, and race stereotypes.

    Science.gov (United States)

    Zebrowitz, Leslie A; Kikuchi, Masako; Fellous, Jean-Marc

    2010-02-01

    The authors used connectionist modeling to extend previous research on emotion overgeneralization effects. Study 1 demonstrated that neutral expression male faces objectively resemble angry expressions more than female faces do, female faces objectively resemble surprise expressions more than male faces do, White faces objectively resemble angry expressions more than Black or Korean faces do, and Black faces objectively resemble happy and surprise expressions more than White faces do. Study 2 demonstrated that objective resemblance to emotion expressions influences trait impressions even when statistically controlling possible confounding influences of attractiveness and babyfaceness. It further demonstrated that emotion overgeneralization is moderated by face race and that racial differences in emotion resemblance contribute to White perceivers' stereotypes of Blacks and Asians. These results suggest that intergroup relations may be strained not only by cultural stereotypes but also by adaptive responses to emotion expressions that are overgeneralized to groups whose faces subtly resemble particular emotions.

  11. Neural correlates of emotional intelligence in a visual emotional oddball task: an ERP study.

    Science.gov (United States)

    Raz, Sivan; Dan, Orrie; Zysberg, Leehu

    2014-11-01

    The present study was aimed at identifying potential behavioral and neural correlates of Emotional Intelligence (EI) by using scalp-recorded Event-Related Potentials (ERPs). EI levels were defined according to both self-report questionnaire and a performance-based ability test. We identified ERP correlates of emotional processing by using a visual-emotional oddball paradigm, in which subjects were confronted with one frequent standard stimulus (a neutral face) and two deviant stimuli (a happy and an angry face). The effects of these faces were then compared across groups with low and high EI levels. The ERP results indicate that participants with high EI exhibited significantly greater mean amplitudes of the P1, P2, N2, and P3 ERP components in response to emotional and neutral faces, at frontal, posterior-parietal and occipital scalp locations. P1, P2 and N2 are considered indexes of attention-related processes and have been associated with early attention to emotional stimuli. The later P3 component has been thought to reflect more elaborative, top-down, emotional information processing including emotional evaluation and memory encoding and formation. These results may suggest greater recruitment of resources to process all emotional and non-emotional faces at early and late processing stages among individuals with higher EI. The present study underscores the usefulness of ERP methodology as a sensitive measure for the study of emotional stimuli processing in the research field of EI.

  12. The interplay between emotion and cognition in Autism Spectrum Disorder: Implications for developmental theory

    Directory of Open Access Journals (Sweden)

    Sebastian B Gaigg

    2012-12-01

    Full Text Available Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder that is clinically defined by abnormalities in reciprocal social and communicative behaviours and an inflexible adherence to routinised patterns of thought and behaviour. Laboratory studies repeatedly demonstrate that autistic individuals experience difficulties in recognising and understanding the emotional expressions of others and naturalistic observations show that they use such expressions infrequently and inappropriately to regulate social exchanges. Dominant theories attribute this facet of the ASD phenotype to abnormalities in a social brain network that mediates social-motivational and social-cognitive processes such as face processing, mental state understanding and empathy. Such theories imply that only emotion related processes relevant to social cognition are compromised in ASD but accumulating evidence suggests that the disorder may be characterised by more widespread anomalies in the domain of emotions. In this review I summarise the relevant literature and argue that the social-emotional characteristics of ASD may be better understood in terms of a disruption in the domain-general interplay between emotion and cognition. More specifically I will suggest that ASD is the developmental consequence of early-emerging anomalies in how emotional responses to the environment modulate a wide range of cognitive processes including those that are relevant to navigating the social world.

  13. The Interplay between Emotion and Cognition in Autism Spectrum Disorder: Implications for Developmental Theory.

    Science.gov (United States)

    Gaigg, Sebastian B

    2012-01-01

    Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder that is clinically defined by abnormalities in reciprocal social and communicative behaviors and an inflexible adherence to routinised patterns of thought and behavior. Laboratory studies repeatedly demonstrate that autistic individuals experience difficulties in recognizing and understanding the emotional expressions of others and naturalistic observations show that they use such expressions infrequently and inappropriately to regulate social exchanges. Dominant theories attribute this facet of the ASD phenotype to abnormalities in a social brain network that mediates social-motivational and social-cognitive processes such as face processing, mental state understanding, and empathy. Such theories imply that only emotion related processes relevant to social cognition are compromised in ASD but accumulating evidence suggests that the disorder may be characterized by more widespread anomalies in the domain of emotions. In this review I summarize the relevant literature and argue that the social-emotional characteristics of ASD may be better understood in terms of a disruption in the domain-general interplay between emotion and cognition. More specifically I will suggest that ASD is the developmental consequence of early emerging anomalies in how emotional responses to the environment modulate a wide range of cognitive processes including those that are relevant to navigating the social world.

  14. Facial Asymmetry and Emotional Expression

    CERN Document Server

    Pickin, Andrew

    2011-01-01

    This report is about facial asymmetry, its connection to emotional expression, and methods of measuring facial asymmetry in videos of faces. The research was motivated by two factors: firstly, there was a real opportunity to develop a novel measure of asymmetry that required minimal human involvement and that improved on earlier measures in the literature; and secondly, the study of the relationship between facial asymmetry and emotional expression is both interesting in its own right, and important because it can inform neuropsychological theory and answer open questions concerning emotional processing in the brain. The two aims of the research were: first, to develop an automatic frame-by-frame measure of facial asymmetry in videos of faces that improved on previous measures; and second, to use the measure to analyse the relationship between facial asymmetry and emotional expression, and to relate our findings to previous research on the relationship.

  15. Human Wagering Behavior Depends on Opponents' Faces

    Science.gov (United States)

    Schlicht, Erik J.; Shimojo, Shinsuke; Camerer, Colin F.; Battaglia, Peter; Nakayama, Ken

    2010-01-01

    Research in competitive games has exclusively focused on how opponent models are developed through previous outcomes and how people's decisions relate to normative predictions. Little is known about how rapid impressions of opponents operate and influence behavior in competitive economic situations, although such subjective impressions have been shown to influence cooperative decision-making. This study investigates whether an opponent's face influences players' wagering decisions in a zero-sum game with hidden information. Participants made risky choices in a simplified poker task while being presented with opponents whose faces differentially correlated with subjective impressions of trust. Surprisingly, we find that threatening face information has little influence on wagering behavior, but faces relaying positive emotional characteristics impact people's decisions. Thus, people took significantly longer and made more mistakes against emotionally positive opponents. Differences in reaction times and percent correct were greatest around the optimal decision boundary, indicating that face information is predominantly used when making decisions during medium-value gambles. Mistakes against emotionally positive opponents resulted from increased folding rates, suggesting that participants may have believed that these opponents were betting with hands of greater value than other opponents. According to these results, the best “poker face” for bluffing may not be a neutral face, but rather a face that contains emotional correlates of trustworthiness. Moreover, it suggests that rapid impressions of an opponent play an important role in competitive games, especially when people have little or no experience with an opponent. PMID:20657772

  16. Emotion Recognition

    Science.gov (United States)

    Neiberg, Daniel; Elenius, Kjell; Burger, Susanne

    Studies of expressive speech have shown that discrete emotions such as anger, fear, joy, and sadness can be communicated accurately, even cross-culturally, and that each emotion is associated with reasonably specific acoustic characteristics [8]. However, most previous research has been conducted on acted emotions. These certainly have something in common with naturally occurring emotions but may also be more intense and prototypical than authentic, everyday expressions [6, 13]. Authentic emotions are, on the other hand, often a combination of different affective states and occur rather infrequently in everyday life.

  17. About Face

    Medline Plus

    Full Text Available ... PTSD (posttraumatic stress disorder). Watch the intro This is AboutFace In these videos, Veterans, family members, and ... to hear what they have to say.

  18. About Face

    Medline Plus

    Full Text Available ... traumatic event — like combat, an assault, or a disaster — it's normal to feel scared, keyed up, or sad at first. But if it's been months or years since the trauma and you're not feeling better, you may have PTSD (posttraumatic stress disorder).

  19. Positive affective interactions: The role of repeated exposure and copresence

    NARCIS (Netherlands)

    Shahid, S.; Krahmer, E.; Neerincx, M.; Swerts, M.

    2013-01-01

    We describe and evaluate a new interface to induce positive emotions in users: a digital, interactive adaptive mirror. We study whether the induced affect is repeatable after a fixed interval (Study 1) and how copresence influences the emotion induction (Study 2). Results show that participants syst

  20. Coping as a mediator of emotion.

    Science.gov (United States)

    Folkman, S; Lazarus, R S

    1988-03-01

    There is widespread conviction among health care professionals that coping affects emotion. Yet theory and research have traditionally emphasized the effects of emotion on coping. The present research addresses this imbalance by evaluating the extent to which coping mediated emotions during stressful encounters in two Caucasian, community-residing samples. Subjects' recently experienced stressful encounters, the ways they coped with the demands of those encounters, and the emotions they experienced during two stages of those encounters were assessed repeatedly. The extent to which eight forms of coping mediated each of four sets of emotions was evaluated with a series of hierarchical regression analyses (of residuals). Coping was associated with changes in all four sets of emotions, with some forms of coping associated with increases in positive emotions and other forms associated with increases in negative emotions.

  1. Perception of face and body expressions using electromyography, pupillometry and gaze measures

    NARCIS (Netherlands)

    Kret, M.E.; Stekelenburg, J.J.; Roelofs, K.; de Gelder, B.

    2013-01-01

    Traditional emotion theories stress the importance of the face in the expression of emotions but bodily expressions are becoming increasingly important as well. In these experiments we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signa

  2. Face to Face : The Perception of Automotive Designs.

    Science.gov (United States)

    Windhager, Sonja; Slice, Dennis E; Schaefer, Katrin; Oberzaucher, Elisabeth; Thorstensen, Truls; Grammer, Karl

    2008-12-01

    Over evolutionary time, humans have developed a selective sensitivity to features in the human face that convey information on sex, age, emotions, and intentions. This ability might not only be applied to our conspecifics nowadays, but also to other living objects (i.e., animals) and even to artificial structures, such as cars. To investigate this possibility, we asked people to report the characteristics, emotions, personality traits, and attitudes they attribute to car fronts, and we used geometric morphometrics (GM) and multivariate statistical methods to determine and visualize the corresponding shape information. Automotive features and proportions are found to covary with trait perception in a manner similar to that found with human faces. Emerging analogies are discussed. This study should have implications for both our understanding of our prehistoric psyche and its interrelation with the modern world.

  3. Facial, vocal and musical emotion recognition is altered in paranoid schizophrenic patients.

    Science.gov (United States)

    Weisgerber, Anne; Vermeulen, Nicolas; Peretz, Isabelle; Samson, Séverine; Philippot, Pierre; Maurage, Pierre; De Graeuwe D'Aoust, Catherine; De Jaegere, Aline; Delatte, Benoît; Gillain, Benoît; De Longueville, Xavier; Constant, Eric

    2015-09-30

    Disturbed processing of emotional faces and voices is typically observed in schizophrenia. This deficit leads to impaired social cognition and interactions. In this study, we investigated whether impaired processing of emotions also affects musical stimuli, which are widely present in daily life and known for their emotional impact. Thirty schizophrenic patients and 30 matched healthy controls evaluated the emotional content of musical, vocal and facial stimuli. Schizophrenic patients are less accurate than healthy controls in recognizing emotion in music, voices and faces. Our results confirm impaired recognition of emotion in voice and face stimuli in schizophrenic patients and extend this observation to the recognition of emotion in musical stimuli.

  4. Advances in face detection and facial image analysis

    CERN Document Server

    Celebi, M; Smolka, Bogdan

    2016-01-01

    This book presents the state-of-the-art in face detection and analysis. It outlines new research directions, including in particular psychology-based facial dynamics recognition, aimed at various applications such as behavior analysis, deception detection, and diagnosis of various psychological disorders. Topics of interest include face and facial landmark detection, face recognition, facial expression and emotion analysis, facial dynamics analysis, face classification, identification, and clustering, and gaze direction and head pose estimation, as well as applications of face analysis.

  5. Reading faces and Facing words

    DEFF Research Database (Denmark)

    Robotham, Julia Emma; Lindegaard, Martin Weis; Delfi, Tzvetelina Shentova

    It has long been argued that perceptual processing of faces and words is largely independent, highly specialised and strongly lateralised. Studies of patients with either pure alexia or prosopagnosia have strongly contributed to this view. The aim of our study was to investigate how visual...

  6. Reading faces and Facing words

    DEFF Research Database (Denmark)

    Robotham, Julia Emma; Lindegaard, Martin Weis; Delfi, Tzvetelina Shentova

    performed within normal range on at least one test of visual categorisation, strongly suggesting that their abnormal performance with words and faces does not represent a generalised visuo-perceptual deficit. Our results suggest that posterior areas in both hemispheres may be critical for both reading...

  7. Evidence for unintentional emotional contagion beyond dyads.

    Directory of Open Access Journals (Sweden)

    Guillaume Dezecache

    Full Text Available Little is known about the spread of emotions beyond dyads. Yet, it is of importance for explaining the emergence of crowd behaviors. Here, we experimentally addressed whether emotional homogeneity within a crowd might result from a cascade of local emotional transmissions where the perception of another's emotional expression produces, in the observer's face and body, sufficient information to allow for the transmission of the emotion to a third party. We reproduced a minimal element of a crowd situation and recorded the facial electromyographic activity and the skin conductance response of an individual C observing the face of an individual B watching an individual A displaying either joyful or fearful full-body expressions. Critically, individual B did not know that she was being watched. We show that emotions of joy and fear displayed by A were spontaneously transmitted to C through B, even when the emotional information available in B's face could not be explicitly recognized. These findings demonstrate that one is tuned to react to others' emotional signals and to unintentionally produce subtle but sufficient emotional cues to induce emotional states in others. This phenomenon could be the mark of a spontaneous cooperative behavior whose function is to communicate survival-value information to conspecifics.

  8. Mental imagery of emotions: Electrophysiological evidence.

    Science.gov (United States)

    Suess, Franziska; Abdel Rahman, Rasha

    2015-07-01

    Affective stimuli such as emotional words, scenes or facial expressions elicit well-investigated emotional responses. For instance, two distinct event-related brain potentials (ERPs) have been reported in response to emotional facial expressions, the early posterior negativity (EPN), associated with enhanced attention and perception of affective stimuli, and a later centro-parietal positivity (LPP) that is taken to reflect evaluations of the intrinsic relevance of emotional stimuli. However, other rich sources of emotions that have as yet received little attention are internal mental events such as thoughts, memories and imagination. Here we investigated mental imagery of emotional facial expressions and its time course using ERPs. Participants viewed neutral familiar and unfamiliar faces, and were subsequently asked to imagine the faces with an emotional or neutral expression. Imagery was compared to visually perceiving the same faces with the different expressions. Early ERP modulations during imagery resemble the effects frequently reported for perceived emotional facial expressions, suggesting that common early processes are associated with emotion perception and imagination. A later posterior positivity was also found in the imagery condition, but with a different distribution than for perception. These findings underscore the similarity of the brain's responses to internally generated and external sources of emotions.

  9. Quantified Faces

    DEFF Research Database (Denmark)

    Sørensen, Mette-Marie Zacher

    2016-01-01

    Abstract: The article presents three contemporary art projects that, in various ways, thematise questions regarding numerical representation of the human face in relation to the identification of faces, for example through the use of biometric video analysis software, or DNA technology. The Dutch...... and critically examine bias in surveillance technologies, as well as scientific investigations, regarding the stereotyping mode of the human gaze. The American artist Heather Dewey-Hagborg creates three-dimensional portraits of persons she has “identified” from their garbage. Her project from 2013 entitled....... The three works are analysed with perspectives to historical physiognomy and Francis Galton's composite portraits from the 1800s. It is argued that, rather than being a statistical compression like the historical composites, contemporary statistical visual portraits (composites) are irreversible...

  10. Repeat-until-success quantum repeaters

    Science.gov (United States)

    Bruschi, David Edward; Barlow, Thomas M.; Razavi, Mohsen; Beige, Almut

    2014-09-01

    We propose a repeat-until-success protocol to improve the performance of probabilistic quantum repeaters. Conventionally, these rely on passive static linear-optics elements and photodetectors to perform Bell-state measurements (BSMs) with a maximum success rate of 50%. This is a strong impediment for entanglement swapping between distant quantum memories. Every time a BSM fails, entanglement needs to be redistributed between the corresponding memories in the repeater link. The key ingredients of our scheme are repeatable BSMs. Under ideal conditions, these turn probabilistic quantum repeaters into deterministic ones. Under realistic conditions, our protocol too might fail. However, using additional threshold detectors now allows us to improve the entanglement generation rate by almost an order of magnitude, at a nominal distance of 1000 km, compared to schemes that rely on conventional BSMs. This improvement is sufficient to make the performance of our scheme comparable to the expected performance of some deterministic quantum repeaters.

  11. Emotional engineering

    CERN Document Server

    In an age of increasing complexity, diversification and change, customers expect services that cater to their needs and to their tastes. Emotional Engineering vol 2 describes how their expectations can be satisfied and managed throughout the product life cycle, if producers focus their attention more on emotion. Emotional engineering provides the means to integrate products to create a new social framework and develops services beyond product realization to create value across a full lifetime.  14 chapters cover a wide range of topics that can be applied to product, process and industry development, with special attention paid to the increasing importance of sensing in the age of extensive and frequent changes, including: • Multisensory stimulation and user experience  • Physiological measurement • Tactile sensation • Emotional quality management • Mental model • Kansei engineering.   Emotional Engineering vol 2 builds on Dr Fukuda’s previous book, Emotional Engineering, and provides read...

  12. Extended Emotions

    DEFF Research Database (Denmark)

    Krueger, Joel; Szanto, Thomas

    2016-01-01

    Until recently, philosophers and psychologists conceived of emotions as brain- and body-bound affairs. But researchers have started to challenge this internalist and individualist orthodoxy. A rapidly growing body of work suggests that some emotions incorporate external resources and thus extend beyond the neurophysiological confines of organisms; some even argue that emotions can be socially extended and shared by multiple agents. Call this the extended emotions thesis (ExE). In this article, we consider different ways of understanding ExE in philosophy, psychology, and the cognitive sciences. First, we outline the background of the debate and discuss different argumentative strategies for ExE. In particular, we distinguish ExE from cognate but more moderate claims about the embodied and situated nature of cognition and emotion (Section 1). We then dwell upon two dimensions of ExE: emotions...

  13. Emotional expression modulates perceived gaze direction.

    Science.gov (United States)

    Lobmaier, Janek S; Tiddeman, Bernard P; Perrett, David I

    2008-08-01

    Gaze perception is an important social skill, as it portrays information about what another person is attending to. Gaze direction has been shown to affect interpretation of emotional expression. Here the authors investigate whether the emotional facial expression has a reciprocal influence on interpretation of gaze direction. In a forced-choice yes-no task, participants were asked to judge whether three faces expressing different emotions (anger, fear, happiness, and neutral) in different viewing angles were looking at them or not. Happy faces were more likely to be judged as looking at the observer than were angry, fearful, or neutral faces. Angry faces were more often judged as looking at the observer than were fearful and neutral expressions. These findings are discussed on the background of approach and avoidance orientation of emotions and of the self-referential positivity bias.

  14. Wordsworthian Emotion

    Institute of Scientific and Technical Information of China (English)

    张敏

    2010-01-01

    As a great poet of British Romanticism, Wordsworth is not the practitioner of an artistic craft designed to satisfy the "taste" of a literary connoisseur. He is, instead, "a man speaking to men" with his uniqueness in emotion. This paper attempts to demonstrate how Wordsworth conveys emotion with poetic language. Wordsworthian "emotion recollected in tranquility" is simple, pure and genuine, which is the true art of Wordsworth's poems.

  15. Emotional speech processing: disentangling the effects of prosody and semantic cues.

    Science.gov (United States)

    Pell, Marc D; Jaywant, Abhishek; Monetta, Laura; Kotz, Sonja A

    2011-08-01

    To inform how emotions in speech are implicitly processed and registered in memory, we compared how emotional prosody, emotional semantics, and both cues in tandem prime decisions about conjoined emotional faces. Fifty-two participants rendered facial affect decisions (Pell, 2005a), indicating whether a target face represented an emotion (happiness or sadness) or not (a facial grimace), after passively listening to happy, sad, or neutral prime utterances. Emotional information from primes was conveyed by: (1) prosody only; (2) semantic cues only; or (3) combined prosody and semantic cues. Results indicated that prosody, semantics, and combined prosody-semantic cues facilitate emotional decisions about target faces in an emotion-congruent manner. However, the magnitude of priming did not vary across tasks. Our findings highlight that emotional meanings of prosody and semantic cues are systematically registered during speech processing, but with similar effects on associative knowledge about emotions, which is presumably shared by prosody, semantics, and faces.

  16. Neural basis of implicit memory for socio-emotional information in schizophrenia.

    Science.gov (United States)

    Schwartz, Barbara L; Vaidya, Chandan J; Shook, Devon; Deutsch, Stephen I

    2013-04-30

    Individuals with schizophrenia are impaired in processing social signals such as facial expressions of emotion. Perceiving facial expressions is a complex process that depends on a distributed neural network of regions involved in affective, cognitive, and visual processing. We examined repetition priming, a non-conscious form of perceptual learning, to explore the visual-perceptual processes associated with perceiving facial expression in people with schizophrenia. Functional magnetic resonance imaging (fMRI) was also employed to probe the sensitivity of face-responsive regions in the ventral pathway to the repetition of stimuli. Subjects viewed blocks of novel and repeated faces displaying fear expressions and neutral expressions and identified each face as male or female. Gender decisions were faster for repeated encoding relative to initial encoding of faces, indicating significant priming for facial expressions. Priming was normal in schizophrenia patients, but, as expected, recognition memory for the expressions was impaired. Neuroimaging findings showed that priming-related activation for patients was reduced in the left fusiform gyrus, relative to controls, regardless of facial expression. The findings suggest that schizophrenia patients have altered neural sensitivity in regions of the ventral visual processing stream that underlie early perceptual learning of objects and faces.

  17. Anger Superiority in Single-Face Judgements

    Directory of Open Access Journals (Sweden)

    Hiroshi Ashida

    2011-05-01

    Full Text Available We investigated “anger superiority” in single-face judgements. Angry, or threatening, faces are easier to find than smiling ones (Hansen & Hansen, 1988), but it remains controversial whether this reflects emotional effects on the basis of the whole face or rather perceptual effects on the basis of parts. We approached this question differently from most previous studies, which used the visual search paradigm. We presented a picture of an angry, smiling, or neutral face (extracted from the ATR DB99 database, which has been validated for emotional strength) either to the left or to the right of the fixation mark, followed by a mask, and participants were asked to make a forced-choice judgement of anger or smile. The results showed that neutral faces were significantly biased towards anger with upright presentation but not with inverted presentation. Angry and smiling faces were judged equally well with upright presentation, while there was a notable reduction of correct responses only for angry faces with inverted presentation. Differences between hemifields were not clear. The results suggest that angry faces are judged on the basis of configural processing of the whole face, while smiling faces may be judged more locally on the basis of parts.

  18. Induction of depressed and elated mood by music influences the perception of facial emotional expressions in healthy subjects.

    Science.gov (United States)

    Bouhuys, A L; Bloem, G M; Groothuis, T G

    1995-04-04

    The judgements of healthy subjects rating the emotional expressions of a set of schematically drawn faces were validated (study 1) in order to examine the relationship between mood (depressed/elated) and judgement of the emotional expressions of these faces (study 2). Study 1: 30 healthy subjects judged 12 faces with respect to the emotions they express (fear, happiness, anger, sadness, disgust, surprise, rejection and invitation). It was found that a particular face could reflect various emotions. All eight emotions were reflected in the set of faces, and the emotions were consensually judged. Moreover, gender differences in judgement could be established. Study 2: In a cross-over design, 24 healthy subjects judged the faces after listening to depressing or elating music. The faces were subdivided into six 'ambiguous' faces (i.e., expressing similar amounts of positive and negative emotions) and six 'clear' faces (i.e., faces showing a preponderance of positive or negative emotions). In addition, these two types of faces were distinguished with respect to the intensity of the emotions they express. Eleven subjects who showed substantial differences in experienced depression after listening to the music were selected for further analysis. It was found that, when feeling more depressed, the subjects perceived more rejection/sadness in ambiguous faces (displaying less intensive emotions) and less invitation/happiness in clear faces. In addition, subjects saw more fear in clear faces that express less intensive emotions. Hence, the results show a depression-related negative bias in the perception of facial displays.

  19. Emotional priming of pop-out in visual search.

    Science.gov (United States)

    Lamy, Dominique; Amunts, Liana; Bar-Haim, Yair

    2008-04-01

    When searching for a discrepant target along a simple dimension such as color or shape, repetition of the target feature substantially speeds search, an effect known as feature priming of pop-out (V. Maljkovic and K. Nakayama, 1994). The authors present the first report of emotional priming of pop-out. Participants had to detect the face displaying a discrepant expression of emotion in an array of four face photographs. On each trial, the target when present was either a neutral face among emotional faces (angry in Experiment 1 or happy in Experiment 2), or an emotional face among neutral faces. Target detection was faster when the target displayed the same emotion on successive trials. This effect occurred for angry and for happy faces, not for neutral faces. It was completely abolished when faces were inverted instead of upright, suggesting that emotional categories rather than physical feature properties drive emotional priming of pop-out. The implications of the present findings for theoretical accounts of intertrial priming and for the face-in-the-crowd phenomenon are discussed.

  20. FACE RECOGNITION FROM FRONT-VIEW FACE

    Institute of Scientific and Technical Information of China (English)

    Wu Lifang; Shen Lansun

    2003-01-01

    This letter presents a face normalization algorithm based on a 2-D face model to recognize faces with variant postures from a front-view face. A 2-D face mesh model can be extracted from faces rotated to the left or right, and the corresponding front-view mesh model can be estimated according to facial symmetry. Then, based on the relationship between the two mesh models, the normalized front-view face is formed by gray-level mapping. Finally, face recognition is performed based on Principal Component Analysis (PCA). Experiments show that better face recognition performance is achieved in this way.

  1. FACE RECOGNITION FROM FRONT-VIEW FACE

    Institute of Scientific and Technical Information of China (English)

    Wu Lifang; Shen Lansun

    2003-01-01

    This letter presents a face normalization algorithm based on a 2-D face model to recognize faces with variant postures from a front-view face. A 2-D face mesh model can be extracted from faces rotated to the left or right, and the corresponding front-view mesh model can be estimated according to facial symmetry. Then, based on the inner relationship between the two mesh models, the normalized front-view face is formed by gray-level mapping. Finally, face recognition is performed based on Principal Component Analysis (PCA). Experiments show that better face recognition performance is achieved in this way.
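The PCA-based recognition step named in this record can be sketched as follows. This is a generic eigenface-style illustration, not the authors' implementation: it assumes the mesh-model normalization has already produced aligned, flattened face vectors, and all function names and data shapes are hypothetical.

```python
import numpy as np

def pca_train(faces: np.ndarray, n_components: int):
    """Learn a PCA subspace (eigenfaces) from flattened, normalized face images.

    faces: (n_samples, n_pixels) matrix, one row per face.
    """
    mean = faces.mean(axis=0)
    # SVD of the mean-centered data; rows of vt are the principal components
    _, _, vt = np.linalg.svd(faces - mean, full_matrices=False)
    return mean, vt[:n_components]

def project(face, mean, components):
    """Project a face into the PCA subspace."""
    return components @ (face - mean)

def recognize(probe, gallery_feats, labels, mean, components):
    """Nearest-neighbour match in PCA space using Euclidean distance."""
    q = project(probe, mean, components)
    dists = np.linalg.norm(gallery_feats - q, axis=1)
    return labels[int(np.argmin(dists))]
```

In use, each gallery face is projected once, and a probe is classified with the label of its nearest projection, matching the recognition stage the letter describes.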

  2. Emotional Intelligence in medical practice

    Directory of Open Access Journals (Sweden)

    Abu Hasan Sarkar

    2016-08-01

    Full Text Available Emotional Intelligence is the ability to perceive, express, understand and regulate one’s inner emotions and the emotions of others. It is considered to be a ‘must have’ competence in the workplace. Several scientific studies have proven that the application of emotional intelligence is effective in improving the teaching-learning process and that it leads to organizational growth; however, only limited work has been carried out to assess its effectiveness in the practice of medicine, especially in India. Various scales have been developed to measure emotional intelligence but they are not universally applicable because emotional intelligence depends upon culture and personal background among other factors. In recent years in India, conflicts between patients and doctors have had serious, sometimes fatal, consequences for the physician. Behavior, when faced with a potential conflict-like situation, depends to a great extent on the emotional intelligence of the physician. Emotional intelligence of medical students and medical professionals can be honed through exposure to the medical humanities which are known to promote patient-centered care. Building better physician-patient relationships might help in averting doctor-patient conflict.

  3. Does cortisol modulate emotion recognition and empathy?

    Science.gov (United States)

    Duesenberg, Moritz; Weber, Juliane; Schulze, Lars; Schaeuffele, Carmen; Roepke, Stefan; Hellmann-Regen, Julian; Otte, Christian; Wingenfeld, Katja

    2016-04-01

    Emotion recognition and empathy are important aspects of interacting with and understanding other people's behaviors and feelings. The human environment comprises stressful situations that impact social interactions on a daily basis. The aim of the study was to examine the effects of the stress hormone cortisol on emotion recognition and empathy. In this placebo-controlled study, 40 healthy men and 40 healthy women (mean age 24.5 years) received either 10 mg of hydrocortisone or placebo. We used the Multifaceted Empathy Test to measure emotional and cognitive empathy. Furthermore, we examined emotion recognition from facial expressions, which contained two emotions (anger and sadness) and two emotion intensities (40% and 80%). We did not find a main effect of treatment or sex on either empathy or emotion recognition, but we did find a sex × emotion interaction on emotion recognition. The main result was a four-way interaction on emotion recognition including treatment, sex, emotion and task difficulty. At 40% task difficulty, women recognized angry faces better than men in the placebo condition. Furthermore, in the placebo condition, men recognized sadness better than anger. At 80% task difficulty, men and women performed equally well in recognizing sad faces, but men performed worse than women with regard to angry faces. Thus, our results did not support the hypothesis that increases in cortisol concentration alone influence empathy and emotion recognition in healthy young individuals. However, sex and task difficulty appear to be important variables in emotion recognition from facial expressions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    Science.gov (United States)

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains the emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e., approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.

  5. Facial EMG responses to emotional expressions are related to emotion perception ability.

    Science.gov (United States)

    Künecke, Janina; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Wilhelm, Oliver

    2014-01-01

    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses, recorded with electromyogram (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii activity in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

  6. Facial EMG responses to emotional expressions are related to emotion perception ability.

    Directory of Open Access Journals (Sweden)

    Janina Künecke

    Full Text Available Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses, recorded with electromyogram (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii activity in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

  7. Higher resting heart rate variability predicts skill in expressing some emotions.

    Science.gov (United States)

    Tuck, Natalie L; Grant, Rosemary C I; Sollers, John J; Booth, Roger J; Consedine, Nathan S

    2016-12-01

    Vagally mediated heart rate variability (vmHRV) is a measure of cardiac vagal tone, and is widely viewed as a physiological index of the capacity to regulate emotions. However, studies have not directly tested whether vmHRV is associated with the ability to facially express emotions. Extending prior work, the current report tested links between resting vmHRV and the objectively assessed ability to facially express emotions, hypothesizing that higher vmHRV would predict greater expressive skill. Eighty healthy women completed self-report measures before attending a laboratory session in which vmHRV and the ability to express six emotions in the face were assessed. A repeated-measures analysis of variance revealed a marginal main effect of vmHRV on overall skill; individuals with higher resting vmHRV were better able to deliberately facially express only anger and interest. Findings suggest that differences in resting vmHRV are associated with the objectively assessed ability to facially express some, but not all, emotions, with potential implications for health and well-being. © 2016 Society for Psychophysiological Research.

  8. Emotion and Interhemispheric Interactions in Binocular Rivalry

    Directory of Open Access Journals (Sweden)

    K L Ritchie

    2013-10-01

    Full Text Available Previous research has shown that fear-related stimuli presented in peripheral vision are preferentially processed over stimuli depicting other emotions. Furthermore, emotional content can influence dominance duration in binocular rivalry, with the period of dominance for an emotional image (e.g., a fearful face) being significantly longer than for a neutral image (e.g., a neutral face or a house). Experiment 1 of the current study combined these two ideas to investigate the role of emotion in binocular rivalry with face/house pairs viewed in the periphery. The results showed that faces were perceived as more dominant than houses, and fearful faces more so than neutral faces, even when viewed in the periphery. Experiment 2 extended this paradigm to present a rival pair in the periphery in each hemifield, with each eye either viewing the same stimulus in each location (traditional condition) or a different stimulus in each location (Diaz-Caneja condition). The results showed that the two pairs tended to rival in synchrony only in the traditional condition. Taken together, the results show that face dominance and emotion dominance in binocular rivalry persist in the periphery, and that interhemispheric interactions in binocular rivalry depend on an eye- as opposed to an object-based mechanism.

  9. Is facial emotion recognition impairment in schizophrenia identical for different emotions? A signal detection analysis.

    Science.gov (United States)

    Tsoi, Daniel T; Lee, Kwang-Hyuk; Khokhar, Waqqas A; Mir, Nusrat U; Swalli, Jaspal S; Gee, Kate A; Pluck, Graham; Woodruff, Peter W R

    2008-02-01

    Patients with schizophrenia have difficulty recognising the emotion that corresponds to a given facial expression. According to signal detection theory, two separate processes are involved in facial emotion perception: a sensory process (measured by sensitivity, the ability to distinguish one facial emotion from another) and a cognitive decision process (measured by response criterion, the tendency to judge a facial expression as showing a particular emotion). It is uncertain whether facial emotion recognition deficits in schizophrenia are primarily due to impaired sensitivity or to response bias. In this study, we hypothesised that individuals with schizophrenia would have both diminished sensitivity and different response criteria in facial emotion recognition across different emotions compared with healthy controls. Twenty-five individuals with a DSM-IV diagnosis of schizophrenia were compared with age- and IQ-matched healthy controls. Participants performed a "yes-no" task by indicating whether the 88 Ekman faces shown briefly expressed one of the target emotions in three randomly ordered runs (happy, sad and fear). Sensitivity and response criterion for facial emotion recognition were calculated as d-prime and ln(beta), respectively, using signal detection theory. Patients with schizophrenia showed diminished sensitivity (d-prime) in recognising happy faces, but not faces that expressed fear or sadness. By contrast, patients exhibited a significantly less strict response criterion (ln(beta)) in recognising fearful and sad faces. Our results suggest that patients with schizophrenia have a specific deficit in recognising happy faces, whereas they were more inclined to label any facial emotion as fearful or sad.
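The two signal detection indices named in this record follow the standard formulas: sensitivity d' = z(H) - z(FA) and response criterion ln(beta) = (z(FA)^2 - z(H)^2) / 2, where H and FA are the hit and false-alarm rates and z is the inverse standard normal CDF. A minimal sketch with illustrative rates (not the study's data):

```python
from statistics import NormalDist

def sdt_indices(hit_rate: float, fa_rate: float):
    """Return (d-prime, ln(beta)) from hit and false-alarm rates.

    d-prime  = z(H) - z(FA)            -- sensitivity
    ln(beta) = (z(FA)^2 - z(H)^2) / 2  -- response criterion
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    zh, zfa = z(hit_rate), z(fa_rate)
    return zh - zfa, (zfa ** 2 - zh ** 2) / 2.0

# Illustrative: 80% hits, 20% false alarms gives d' of about 1.68
# and ln(beta) of 0 (an unbiased criterion)
d_prime, ln_beta = sdt_indices(0.80, 0.20)
```

A lower (less strict) ln(beta) reflects a greater tendency to answer "yes", which is how the patients' bias toward fearful and sad labels is expressed here.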

  10. Serotonergic neurotransmission in emotional processing

    DEFF Research Database (Denmark)

    Laursen, Helle Ruff; Henningsson, Susanne; Macoveanu, Julian;

    2016-01-01

    ,4-methylene-dioxymethamphetamine [MDMA]) induces alterations in serotonergic neurotransmission that are comparable to those observed in a depleted state. In this functional magnetic resonance imaging (fMRI) study, we investigated the responsiveness of the amygdala to emotional face stimuli in recreational...

  11. An audiovisual emotion recognition system

    Science.gov (United States)

    Han, Yi; Wang, Guoyin; Yang, Yong; He, Kun

    2007-12-01

    Human emotions can be expressed through many biological signals; speech and facial expression are two of them. Both are regarded as emotional information, which plays an important role in human-computer interaction. Based on our previous studies on emotion recognition, an audiovisual emotion recognition system is developed and presented in this paper. The system is designed for real-time practice and is supported by several integrated modules. These modules include speech enhancement for eliminating noise, rapid face detection for locating the face in a background image, example-based shape learning for facial feature alignment, and an optical-flow-based tracking algorithm for facial feature tracking. It is known that irrelevant features and high dimensionality of the data can hurt the performance of a classifier, and rough-set-based feature selection is a good method for dimension reduction. So 13 of 37 speech features and 10 of 33 facial features are selected to represent emotional information, and 52 audiovisual features are selected owing to the synchronization achieved when speech and video are fused. The experimental results demonstrate that this system performs well in real-time practice and has a high recognition rate. Our results also suggest that multimodule fused recognition will become the trend of emotion recognition in the future.

  12. Facial Emotion Recognition in Bipolar Disorder and Healthy Aging.

    Science.gov (United States)

    Altamura, Mario; Padalino, Flavia A; Stella, Eleonora; Balzotti, Angela; Bellomo, Antonello; Palumbo, Rocco; Di Domenico, Alberto; Mammarella, Nicola; Fairfield, Beth

    2016-03-01

    Emotional face recognition is impaired in bipolar disorder, but it is not clear whether this is specific for the illness. Here, we investigated how aging and bipolar disorder influence dynamic emotional face recognition. Twenty older adults, 16 bipolar patients, and 20 control subjects performed a dynamic affective facial recognition task and a subsequent rating task. Participants pressed a key as soon as they were able to discriminate whether the neutral face was assuming a happy or angry facial expression and then rated the intensity of each facial expression. Results showed that older adults recognized happy expressions faster, whereas bipolar patients recognized angry expressions faster. Furthermore, both groups rated emotional faces more intensely than did the control subjects. This study is one of the first to compare how aging and clinical conditions influence emotional facial recognition and underlines the need to consider the role of specific and common factors in emotional face recognition.

  13. Famous face recognition, face matching, and extraversion.

    Science.gov (United States)

    Lander, Karen; Poyarekar, Siddhi

    2015-01-01

    It has been previously established that extraverts who are skilled at interpersonal interaction perform significantly better than introverts on a face-specific recognition memory task. In our experiment we further investigate the relationship between extraversion and face recognition, focusing on famous face recognition and face matching. Results indicate that more extraverted individuals perform significantly better on an upright famous face recognition task and show significantly larger face inversion effects. However, our results did not find an effect of extraversion on face matching or inverted famous face recognition.

  14. Emotional Responses

    DEFF Research Database (Denmark)

    Hansen, Flemming; Christensen, Sverre Riis; Lundsteen, Steen

    2007-01-01

    Recent neurological research has pointed to the importance of fundamental emotional processes for most kinds of human behaviour. Measures of emotional response tendencies towards brands seem to reveal intangible aspects of brand equity, particularly in a marketing context. In this paper a procedure for estimating such emotional brand equity is presented and findings from two successive studies of more than 100 brands are reported. It demonstrates how changes that occur between two years are explainable in terms of factors identifiable in the markets, and that the measures otherwise are stable over time...

  15. Prospective and concurrent correlates of emotion perception in psychotic disorders: a naturalistic, longitudinal study of neurocognition, affective blunting and avolition.

    Science.gov (United States)

    Vaskinn, Anja; Johnsen, Erik; Jørgensen, Hugo A; Kroken, Rune A; Løberg, Else-Marie

    2013-06-01

    This naturalistic study investigated longitudinal and cross-sectional symptomatic and neurocognitive correlates of social cognition indexed by emotion perception. Participants were 31 persons admitted to a psychiatric emergency ward due to acute psychosis. Positive and negative (i.e., affective blunting and avolition) symptoms were assessed at baseline and 12-month follow-up using the Positive and Negative Syndrome Scale. Participants completed neuropsychological assessments with alternative versions of the Repeatable Battery for the Assessment of Neuropsychological Status at baseline and at 12-month follow-up. Emotion perception was measured using the Face/Voice Emotion Test at 12-month follow-up. Correlational analyses (Spearman's rho) revealed strong and statistically significant associations between neurocognition and emotion perception (baseline r = 0.58, follow-up r = 0.43). Associations between positive symptoms and emotion perception were weak or non-existent (baseline r = 0.13, follow-up r = -0.01). Emotion perception was moderately, but not significantly, associated with affective blunting at follow-up (r = 0.33), but not at baseline (r = 0.21). The association with avolition was non-existent (baseline r = -0.05, follow-up r = 0.01). This study supports the notion that emotion perception has neurocognitive correlates. The cross-sectional trend-level association with affective blunting suggests that the ability to perceive emotions might be related to, but dissociable from, the ability to express emotions.
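Spearman's rho, the correlation statistic used throughout this study, is simply the Pearson correlation of the rank-transformed scores. A self-contained sketch (the data below are invented for illustration, not the study's values):

```python
def rankdata(xs):
    """Average ranks (1-based), with tied values sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[j]]:
            j += 1  # extend the run of tied values
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks of x and y."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical neurocognition scores and emotion-perception accuracies
neurocognition = [85, 92, 78, 60, 99, 70, 88]
emotion_perception = [0.72, 0.80, 0.65, 0.55, 0.90, 0.60, 0.75]
rho = spearman_rho(neurocognition, emotion_perception)
```

Because it works on ranks, rho captures any monotone association, which is why it suits small clinical samples like the one reported here.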

  16. Preattentive processing of audio-visual emotional signals

    DEFF Research Database (Denmark)

    Föcker, J.; Gondan, Matthias; Röder, B.

    2011-01-01

    ...to a response conflict rather than interference at earlier, e.g. perceptual, processing stages. In Experiment 1, participants had to categorize the valence and rate the intensity of happy, sad, angry and neutral unimodal or bimodal face-voice stimuli. They were asked to rate either the facial or vocal expression and ignore the emotion expressed in the other modality. Participants responded faster and more precisely to emotionally congruent compared to incongruent face-voice pairs in both the Attend Face and the Attend Voice condition. Moreover, when attending to faces, emotionally congruent bimodal stimuli were...

  17. Emotional intelligence as a cognitive-emotional ability

    Directory of Open Access Journals (Sweden)

    Andreja Avsec

    2003-06-01

    Full Text Available In this article we analyse Mayer and Salovey's model of emotional intelligence. The authors first defined it in the 1990s, delimited its relation to social intelligence, and constructed two tests for its measurement, which are the only published tests of their kind. The authors try to separate their approach to the measurement of emotional intelligence from self-report measures and from defining emotional intelligence as a set of personality traits. Besides measuring emotional intelligence with tests of maximum performance, the authors try to show that the correlations between emotional abilities indicate a hierarchical structure similar to that characteristic of other kinds of intelligence. Since the first test for measuring emotional intelligence was published in 1997 and there have been no other published tests of this kind yet, it is very difficult to evaluate its metric characteristics and the validity of the model. In any case, in defining and measuring emotional intelligence, researchers face problems similar to those encountered in social intelligence research.

  18. Emotion recognition during cocaine intoxication.

    Science.gov (United States)

    Kuypers, K P C; Steenbergen, L; Theunissen, E L; Toennes, S W; Ramaekers, J G

    2015-11-01

    Chronic or repeated cocaine use has been linked to impairments in social skills. It is not clear whether cocaine is responsible for this impairment or whether other factors, like polydrug use, distort the observed relation. We aimed to investigate this relation by means of a placebo-controlled experimental study. Additionally, associations between stressor-related activity (cortisol, cardiovascular parameters) induced by the biological stressor cocaine and potential cocaine effects on emotion recognition were studied. Twenty-four healthy recreational cocaine users participated in this placebo-controlled within-subject study. Participants were tested between 1 and 2 h after treatment with oral cocaine (300 mg) or placebo. Emotion recognition of low- and high-intensity expressions of basic emotions (fear, anger, disgust, sadness, and happiness) was tested. Findings show that cocaine impaired recognition of negative emotions; this was mediated by the intensity of the presented emotions. When high-intensity expressions of anger and disgust were shown, performance under the influence of cocaine 'normalized' to placebo-like levels, while identification of sadness became more difficult. The normalization of performance was most notable for participants with the largest cortisol responses in the cocaine condition compared to placebo. It was demonstrated that cocaine impairs recognition of negative emotions, depending on the intensity of emotion expression and the cortisol response.

  19. Emotional collectives: How groups shape emotions and emotions shape groups.

    Science.gov (United States)

    van Kleef, Gerben A; Fischer, Agneta H

    2016-01-01

    Group settings are epicentres of emotional activity. Yet, the role of emotions in groups is poorly understood. How do group-level phenomena shape group members' emotional experience and expression? How are emotional expressions recognised, interpreted and shared in group settings? And how do such expressions influence the emotions, cognitions and behaviours of fellow group members and outside observers? To answer these and other questions, we draw on relevant theoretical perspectives (e.g., intergroup emotions theory, social appraisal theory and emotions as social information theory) and recent empirical findings regarding the role of emotions in groups. We organise our review according to two overarching themes: how groups shape emotions and how emotions shape groups. We show how novel empirical approaches break important new ground in uncovering the role of emotions in groups. Research on emotional collectives is thriving and constitutes a key to understanding the social nature of emotions.

  20. Working memory of emotional stimuli: Electrophysiological characterization.

    Science.gov (United States)

    Kessel, Dominique; García-Rubio, María J; González, E Kirstin; Tapia, Manuel; López-Martín, Sara; Román, Francisco J; Capilla, Almudena; Martínez, Kenia; Colom, Roberto; Carretié, Luis

    2016-09-01

    Memorizing emotional stimuli in a preferential way seems to be one of the adaptive strategies brought on by evolution for supporting survival. However, there is a lack of electrophysiological evidence on this bias in working memory. The present study analyzed the influence of emotion on the updating component of working memory. Behavioral and electrophysiological indices were measured from a 3-back task using negative, neutral, and positive faces. Electrophysiological data evidenced an emotional influence on the working memory sensitive P3 component, which presented larger amplitudes for negative matching faces compared to neutral ones. This effect originated in the superior parietal cortex, previously reported to be involved in N-back tasks. Additionally, P3 results showed a correlation with reaction times, where higher amplitudes were associated with faster responses for negative matching faces. These findings indicate that electrophysiological measures seem to be very suitable indices of the emotional influence on working memory.
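    In the 3-back task described above, a trial counts as a target when the current face matches the one presented three trials earlier. A minimal sketch of that scoring logic (the stimulus labels, helper names, and hit-rate rule are illustrative assumptions, not the study's materials):

    ```python
    # Illustrative N-back scoring sketch; labels and helpers are assumed,
    # not taken from the study.

    def nback_targets(stimuli, n=3):
        """Flag each trial whose stimulus matches the one shown n trials earlier."""
        return [i >= n and stimuli[i] == stimuli[i - n] for i in range(len(stimuli))]

    def hit_rate(stimuli, responses, n=3):
        """Proportion of n-back targets that received a response."""
        targets = nback_targets(stimuli, n)
        n_targets = sum(targets)
        hits = sum(1 for t, r in zip(targets, responses) if t and r)
        return hits / n_targets if n_targets else 0.0

    # Negative/neutral/positive face categories, as in the study's conditions
    stims = ["pos", "neg", "neu", "pos", "pos", "neg", "neu", "pos"]
    print(nback_targets(stims))  # → [False, False, False, True, False, False, False, True]
    print(hit_rate(stims, [0, 0, 0, 1, 0, 1, 0, 0]))  # → 0.5
    ```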

  1. Familiarity is not notoriety: Phenomenological accounts of face recognition

    Directory of Open Access Journals (Sweden)

    Davide eLiccione

    2014-09-01

    Full Text Available From a phenomenological perspective, faces are perceived differently from objects, as their perception always involves the possibility of a relational engagement (Bredlau, 2011). This is especially true for familiar faces, i.e. faces of people with a history of real relational engagements. Similarly, the valence of emotional expressions assumes a key role, as it defines the sense and direction of this engagement. Following these premises, the aim of the present study is to demonstrate that face recognition is facilitated by at least two variables, familiarity and emotional expression, and that perception of familiar faces is not influenced by orientation. In order to verify this hypothesis, we implemented a 3x3x2 factorial design, showing seventeen healthy subjects three types of faces (unfamiliar, personally familiar, famous) characterized by three different emotional expressions (happy, angry/sad, neutral) in two different orientations (upright vs. inverted). We showed every subject a total of 180 faces with the instruction to give a familiarity judgment. Reaction times were recorded, and we found that the recognition of a face is facilitated by personal familiarity and emotional expression, that this process is otherwise independent of the cognitive elaboration of stimuli, and that it remains stable despite orientation. These results highlight the need to distinguish between famous and personally familiar faces when studying face perception and to consider its historical aspects from a phenomenological point of view.

  2. The effect of intranasal oxytocin on perceiving and understanding emotion on the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT).

    Science.gov (United States)

    Cardoso, Christopher; Ellenbogen, Mark A; Linnen, Anne-Marie

    2014-02-01

    Evidence suggests that intranasal oxytocin enhances the perception of emotion in facial expressions during standard emotion identification tasks. However, it is not clear whether this effect is desirable in people who do not show deficits in emotion perception. That is, a heightened perception of emotion in faces could lead to "oversensitivity" to the emotions of others in nonclinical participants. The goal of this study was to assess the effects of intranasal oxytocin on emotion perception using ecologically valid social and nonsocial visual tasks. Eighty-two participants (42 women) self-administered a 24 IU dose of intranasal oxytocin or a placebo in a double-blind, randomized experiment and then completed the perceiving and understanding emotion components of the Mayer-Salovey-Caruso Emotional Intelligence Test. In this test, emotion identification accuracy is based on agreement with a normative sample. As expected, participants administered intranasal oxytocin rated emotion in facial stimuli as expressing greater emotional intensity than those given a placebo. Consequently, accurate identification of emotion in faces, based on agreement with a normative sample, was impaired in the oxytocin group relative to placebo. No such effect was observed for tests using nonsocial stimuli. The results are consistent with the hypothesis that intranasal oxytocin enhances the salience of social stimuli in the environment, but not nonsocial stimuli. The present findings support a growing literature showing that the effects of intranasal oxytocin on social cognition can be negative under certain circumstances, in this case promoting "oversensitivity" to emotion in faces in healthy people.

  3. Crossmodal transfer of emotion by music.

    Science.gov (United States)

    Logeswaran, Nidhya; Bhattacharya, Joydeep

    2009-05-15

    Music is one of the most powerful elicitors of subjective emotion, yet it is not clear whether emotions elicited by music are similar to emotions elicited by visual stimuli. This leads to an open question: can music-elicited emotion be transferred to and/or influence subsequent vision-elicited emotional processing? Here we addressed this question by investigating the processing of emotional faces (neutral, happy and sad) primed by short excerpts of musical stimuli (happy and sad). Our behavioural experiment showed a significant effect of musical priming: prior listening to happy (sad) music enhanced the perceived happiness (sadness) of a face irrespective of facial emotion. Further, this priming-induced effect was largest for neutral faces. Our electrophysiological experiment showed that these crossmodal priming effects were manifested in event-related brain potential components at a very early stage (within 100 ms post-stimulus) of neuronal information processing. Altogether, these results offer new insight into the crossmodal nature of music and its ability to transfer emotion to the visual modality.

  4. Brain Structural Correlates of Emotion Recognition in Psychopaths

    Science.gov (United States)

    Batalla, Iolanda; Kosson, David; Menchón, José M; Pifarré, Josep; Bosque, Javier; Cardoner, Narcís; Soriano-Mas, Carles

    2016-01-01

    Individuals with psychopathy present deficits in the recognition of facial emotional expressions. However, the nature and extent of these alterations are not fully understood. Furthermore, available data on the functional neural correlates of emotional face recognition deficits in adult psychopaths have provided mixed results. In this context, emotional face morphing tasks may be suitable for clarifying mild and emotion-specific impairments in psychopaths. Likewise, studies exploring corresponding anatomical correlates may be useful for disentangling available neurofunctional evidence based on the alleged neurodevelopmental roots of psychopathic traits. We used Voxel-Based Morphometry and a morphed emotional face expression recognition task to evaluate the relationship between regional gray matter (GM) volumes and facial emotion recognition deficits in male psychopaths. In comparison to male healthy controls, psychopaths showed deficits in the recognition of sad, happy and fear emotional expressions. In subsequent brain imaging analyses psychopaths with better recognition of facial emotional expressions showed higher volume in the prefrontal cortex (orbitofrontal, inferior frontal and dorsomedial prefrontal cortices), somatosensory cortex, anterior insula, cingulate cortex and the posterior lobe of the cerebellum. Amygdala and temporal lobe volumes contributed to better emotional face recognition in controls only. These findings provide evidence suggesting that variability in brain morphometry plays a role in accounting for psychopaths’ impaired ability to recognize emotional face expressions, and may have implications for comprehensively characterizing the empathy and social cognition dysfunctions typically observed in this population of subjects. PMID:27175777

  5. Brain Structural Correlates of Emotion Recognition in Psychopaths.

    Directory of Open Access Journals (Sweden)

    Vanessa Pera-Guardiola

    Full Text Available Individuals with psychopathy present deficits in the recognition of facial emotional expressions. However, the nature and extent of these alterations are not fully understood. Furthermore, available data on the functional neural correlates of emotional face recognition deficits in adult psychopaths have provided mixed results. In this context, emotional face morphing tasks may be suitable for clarifying mild and emotion-specific impairments in psychopaths. Likewise, studies exploring corresponding anatomical correlates may be useful for disentangling available neurofunctional evidence based on the alleged neurodevelopmental roots of psychopathic traits. We used Voxel-Based Morphometry and a morphed emotional face expression recognition task to evaluate the relationship between regional gray matter (GM) volumes and facial emotion recognition deficits in male psychopaths. In comparison to male healthy controls, psychopaths showed deficits in the recognition of sad, happy and fear emotional expressions. In subsequent brain imaging analyses psychopaths with better recognition of facial emotional expressions showed higher volume in the prefrontal cortex (orbitofrontal, inferior frontal and dorsomedial prefrontal cortices), somatosensory cortex, anterior insula, cingulate cortex and the posterior lobe of the cerebellum. Amygdala and temporal lobe volumes contributed to better emotional face recognition in controls only. These findings provide evidence suggesting that variability in brain morphometry plays a role in accounting for psychopaths' impaired ability to recognize emotional face expressions, and may have implications for comprehensively characterizing the empathy and social cognition dysfunctions typically observed in this population of subjects.

  6. Intellectual emotions

    Directory of Open Access Journals (Sweden)

    Vasilyev, Igor A.

    2013-12-01

    Full Text Available In the laboratory of O.K. Tikhomirov, the phenomenon of the acute emotional regulation of productive thinking was established. This regulation is realized by means of the elaboration of the axiological profile of cognition. The following definition of intellectual emotions can be given: intellectual emotions are the appraisals of specific cognitive objects — contradictions, assumptions, probabilities, and the intermediate and final results of operations. The main aspect of the method used in the research consisted of the synchronous registration of the external (tactile) elaboration of problems, the galvanic skin response, and verbal utterances regarding tasks to be completed in a game of chess. The principal position of Tikhomirov's group is the following: intellectual emotions represent not only an energetic resource or catalysts for the thinking process, but also determinants of its structure.

  7. Emotional Disturbance

    Science.gov (United States)

    ... skills, and increase self-awareness, self-control, and self-esteem. A large body of research exists regarding methods ... Children and adolescents with an emotional disturbance should receive services based ...

  8. Autism and the development of face processing.

    Science.gov (United States)

    Golarai, Golijeh; Grill-Spector, Kalanit; Reiss, Allan L

    2006-10-01

    Autism is a pervasive developmental condition, characterized by impairments in non-verbal communication, social relationships and stereotypical patterns of behavior. A large body of evidence suggests that several aspects of face processing are impaired in autism, including anomalies in gaze processing, memory for facial identity and recognition of facial expressions of emotion. In search of neural markers of anomalous face processing in autism, much interest has focused on a network of brain regions that are implicated in social cognition and face processing. In this review, we will focus on three such regions, namely the STS for its role in processing gaze and facial movements, the FFA in face detection and identification and the amygdala in processing facial expressions of emotion. Much evidence suggests that a better understanding of the normal development of these specialized regions is essential for discovering the neural bases of face processing anomalies in autism. Thus, we will also examine the available literature on the normal development of face processing. Key unknowns in this research area are the neuro-developmental processes, the role of experience and the interactions among components of the face processing system in shaping each of the specialized regions for processing faces during normal development and in autism.

  9. Compound facial expressions of emotion.

    Science.gov (United States)

    Du, Shichuan; Tao, Yong; Martinez, Aleix M

    2014-04-15

    Understanding the different categories of facial expressions of emotion regularly used by us is essential to gain insights into human cognition and affect as well as for the design of computational models and perceptual interfaces. Past research on facial expressions of emotion has focused on the study of six basic categories--happiness, surprise, anger, sadness, fear, and disgust. However, many more facial expressions of emotion exist and are used regularly by humans. This paper describes an important group of expressions, which we call compound emotion categories. Compound emotions are those that can be constructed by combining basic component categories to create new ones. For instance, happily surprised and angrily surprised are two distinct compound emotion categories. The present work defines 21 distinct emotion categories. Sample images of their facial expressions were collected from 230 human subjects. A Facial Action Coding System analysis shows the production of these 21 categories is different but consistent with the subordinate categories they represent (e.g., a happily surprised expression combines muscle movements observed in happiness and surprise). We show that these differences are sufficient to distinguish between the 21 defined categories. We then use a computational model of face perception to demonstrate that most of these categories are also visually discriminable from one another.

  10. Vocal expressions of emotion and positive and negative basic emotions [Abstract]

    OpenAIRE

    Scott, S.; Sauter, D.

    2004-01-01

    Previous studies have indicated that vocal and facial expressions of the ‘basic’ emotions share aspects of processing. Thus amygdala damage compromises the perception of fear and anger from the face and from the voice. In the current study we tested the hypothesis that there exist positive basic emotions, expressed mainly in the voice (Ekman, 1992). Vocal stimuli were produced to express the specific positive emotions of amusement, achievement, pleasure, contentment and relief.

  11. From Facial Emotional Recognition Abilities to Emotional Attribution: A Study in Down Syndrome

    Science.gov (United States)

    Hippolyte, Loyse; Barisnikov, Koviljka; Van der Linden, Martial; Detraux, Jean-Jacques

    2009-01-01

    Facial expression processing and the attribution of facial emotions to a context were investigated in adults with Down syndrome (DS) in two experiments. Their performances were compared with those of a child control group matched for receptive vocabulary. The ability to process faces without emotional content was controlled for, and no differences…


  13. The hierarchical brain network for face recognition.

    Science.gov (United States)

    Zhen, Zonglei; Fang, Huizhen; Liu, Jia

    2013-01-01

    Numerous functional magnetic resonance imaging (fMRI) studies have identified multiple cortical regions that are involved in face processing in the human brain. However, few studies have characterized the face-processing network as a functioning whole. In this study, we used fMRI to identify face-selective regions in the entire brain and then explore the hierarchical structure of the face-processing network by analyzing functional connectivity among these regions. We identified twenty-five regions mainly in the occipital, temporal and frontal cortex that showed a reliable response selective to faces (versus objects) across participants and across scan sessions. Furthermore, these regions were clustered into three relatively independent sub-networks in a face-recognition task on the basis of the strength of functional connectivity among them. The functionality of the sub-networks likely corresponds to the recognition of individual identity, retrieval of semantic knowledge and representation of emotional information. Interestingly, when the task was switched to object recognition from face recognition, the functional connectivity between the inferior occipital gyrus and the rest of the face-selective regions were significantly reduced, suggesting that this region may serve as an entry node in the face-processing network. In sum, our study provides empirical evidence for cognitive and neural models of face recognition and helps elucidate the neural mechanisms underlying face recognition at the network level.

  14. Impaired face recognition is associated with social inhibition.

    Science.gov (United States)

    Avery, Suzanne N; VanDerKlok, Ross M; Heckers, Stephan; Blackford, Jennifer U

    2016-02-28

    Face recognition is fundamental to successful social interaction. Individuals with deficits in face recognition are likely to have social functioning impairments that may lead to heightened risk for social anxiety. A critical component of social interaction is how quickly a face is learned during initial exposure to a new individual. Here, we used a novel Repeated Faces task to assess how quickly memory for faces is established. Face recognition was measured over multiple exposures in 52 young adults ranging from low to high in social inhibition, a core dimension of social anxiety. High social inhibition was associated with a smaller slope of change in recognition memory over repeated face exposure, indicating participants with higher social inhibition showed smaller improvements in recognition memory after seeing faces multiple times. We propose that impaired face learning is an important mechanism underlying social inhibition and may contribute to, or maintain, social anxiety.

  15. Emotional language processing in Autism Spectrum Disorders: A systematic review

    OpenAIRE

    Alina eLartseva; Ton eDijkstra; Jan eBuitelaar

    2015-01-01

    In his first description of Autism Spectrum Disorders (ASD), Kanner emphasized emotional impairments by characterizing children with ASD as indifferent to other people, self-absorbed, emotionally cold, distanced, and retracted. Thereafter, emotional impairments became regarded as part of the social impairments of ASD, and research mostly focused on understanding how individuals with ASD recognize visual expressions of emotions from faces and body postures. However, it still remains unclear ho...

  16. Emotional Labour, Burnout and Job Satisfaction in UK Teachers: The Role of Workplace Social Support

    Science.gov (United States)

    Kinman, Gail; Wray, Siobhan; Strange, Calista

    2011-01-01

    Although teaching has been described as a profoundly emotional activity, little is known about the emotional demands faced by teachers or how this impacts on their well-being. This study examined relationships between "emotional labour", burnout (emotional exhaustion, depersonalisation and personal accomplishment) and job satisfaction in…


  18. Attentional modulation of emotional conflict processing with flanker tasks.

    Directory of Open Access Journals (Sweden)

    Pingyan Zhou

    Full Text Available Emotion processing has been shown to acquire priority by biasing allocation of attentional resources. Aversive images or fearful expressions are processed quickly and automatically. Many existing findings suggested that processing of emotional information was pre-attentive, largely immune from attentional control. Other studies argued that attention gated the processing of emotion. To tackle this controversy, the current study examined whether and to what degrees attention modulated processing of emotion using a stimulus-response-compatibility (SRC) paradigm. We conducted two flanker experiments using color scale faces in neutral expressions or gray scale faces in emotional expressions. We found SRC effects for all three dimensions (color, gender, and emotion) and SRC effects were larger when the conflicts were task relevant than when they were task irrelevant, suggesting that conflict processing of emotion was modulated by attention, similar to those of color and face identity (gender). However, task modulation on color SRC effect was significantly greater than that on gender or emotion SRC effect, indicating that processing of salient information was modulated by attention to a lesser degree than processing of non-emotional stimuli. We proposed that emotion processing can be influenced by attentional control, but at the same time salience of emotional information may bias toward bottom-up processing, rendering less top-down modulation than that on non-emotional stimuli.

  19. Attentional modulation of emotional conflict processing with flanker tasks.

    Science.gov (United States)

    Zhou, Pingyan; Liu, Xun

    2013-01-01

    Emotion processing has been shown to acquire priority by biasing allocation of attentional resources. Aversive images or fearful expressions are processed quickly and automatically. Many existing findings suggested that processing of emotional information was pre-attentive, largely immune from attentional control. Other studies argued that attention gated the processing of emotion. To tackle this controversy, the current study examined whether and to what degrees attention modulated processing of emotion using a stimulus-response-compatibility (SRC) paradigm. We conducted two flanker experiments using color scale faces in neutral expressions or gray scale faces in emotional expressions. We found SRC effects for all three dimensions (color, gender, and emotion) and SRC effects were larger when the conflicts were task relevant than when they were task irrelevant, suggesting that conflict processing of emotion was modulated by attention, similar to those of color and face identity (gender). However, task modulation on color SRC effect was significantly greater than that on gender or emotion SRC effect, indicating that processing of salient information was modulated by attention to a lesser degree than processing of non-emotional stimuli. We proposed that emotion processing can be influenced by attentional control, but at the same time salience of emotional information may bias toward bottom-up processing, rendering less top-down modulation than that on non-emotional stimuli.

  20. Holistic person processing: faces with bodies tell the whole story.

    Science.gov (United States)

    Aviezer, Hillel; Trope, Yaacov; Todorov, Alexander

    2012-07-01

    Faces and bodies are typically encountered simultaneously, yet little research has explored the visual processing of the full person. Specifically, it is unknown whether the face and body are perceived as distinct components or as an integrated, gestalt-like unit. To examine this question, we investigated whether emotional face-body composites are processed in a holistic-like manner by using a variant of the composite face task, a measure of holistic processing. Participants judged facial expressions combined with emotionally congruent or incongruent bodies that have been shown to influence the recognition of emotion from the face. Critically, the faces were either aligned with the body in a natural position or misaligned in a manner that breaks the ecological person form. Converging data from 3 experiments confirm that breaking the person form reduces the facilitating influence of congruent body context as well as the impeding influence of incongruent body context on the recognition of emotion from the face. These results show that faces and bodies are processed as a single unit and support the notion of a composite person effect analogous to the classic effect described for faces.

  1. Quantum repeated games revisited

    CERN Document Server

    Frackiewicz, Piotr

    2011-01-01

    We present a scheme for playing quantum repeated 2x2 games based on Marinatto and Weber's approach to quantum games. As a potential application, we study the twice-repeated Prisoner's Dilemma game. We show that results not available in the classical game can be obtained when the game is played in the quantum way. Before we present our idea, we comment on the previous scheme for playing quantum repeated games.
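    As a point of reference for the twice-repeated game, the classical payoffs are obtained simply by summing the stage-game payoff matrix over both rounds. A sketch under the conventional Prisoner's Dilemma payoffs (T=5, R=3, P=1, S=0; these values are an assumption, not taken from the paper):

    ```python
    # Classical twice-repeated Prisoner's Dilemma, the baseline such quantum
    # schemes are compared against. Payoff values are a conventional assumption.

    PAYOFF = {  # (row_move, col_move) -> (row_payoff, col_payoff)
        ("C", "C"): (3, 3),
        ("C", "D"): (0, 5),
        ("D", "C"): (5, 0),
        ("D", "D"): (1, 1),
    }

    def repeated_payoff(row_moves, col_moves):
        """Total payoffs for both players over a sequence of stage games."""
        totals = [0, 0]
        for r, c in zip(row_moves, col_moves):
            pr, pc = PAYOFF[(r, c)]
            totals[0] += pr
            totals[1] += pc
        return tuple(totals)

    # Mutual defection in both rounds: the classical subgame-perfect outcome
    print(repeated_payoff("DD", "DD"))  # → (2, 2)
    # Mutual cooperation pays more for both, but is not a classical equilibrium
    print(repeated_payoff("CC", "CC"))  # → (6, 6)
    ```

    The point of comparison is that mutual cooperation dominates mutual defection in total payoff, yet backward induction rules it out classically; the quantum scheme aims to make such otherwise unavailable outcomes attainable.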

  2. Salient cues from faces, bodies and scenes influence observers’ face expressions, fixations and pupil size

    Directory of Open Access Journals (Sweden)

    Mariska Esther Kret

    2013-12-01

    Full Text Available We receive emotional signals from different sources, including the face, the whole body and the natural scene. Previous research has shown the importance of the context provided by the whole body and the scene for the recognition of facial expressions. This study measured physiological responses to face-body-scene combinations. Participants viewed emotionally congruent or incongruent face-body and body-scene pairs whilst eye fixations, pupil size and electromyography (EMG) responses were recorded. Participants focused more on angry and fearful vs. happy or neutral cues, independent of the source and relatively independent of emotional incongruence. Moreover, angry faces combined with angry bodies, and angry bodies viewed in an aggressive social scene context, elicited the greatest pupil dilation. Participants' facial expressions matched the valence of the stimuli, but when face-body compounds were shown, the observed facial expression influenced EMG responses more than the postures did. Our results show that threatening signals from faces, bodies and scenes attract attention, induce arousal, and evoke congruent facial reactions.

  3. From emotion perception to emotion experience: emotions evoked by pictures and classical music.

    Science.gov (United States)

    Baumgartner, Thomas; Esslen, Michaela; Jäncke, Lutz

    2006-04-01

    Most previous neurophysiological studies evoked emotions by presenting visual stimuli. Models of the emotion circuits in the brain have for the most part ignored emotions arising from musical stimuli. To our knowledge, this is the first emotion brain study which examined the influence of visual and musical stimuli on brain processing. Highly arousing pictures of the International Affective Picture System and classical musical excerpts were chosen to evoke the three basic emotions of happiness, sadness and fear. The emotional stimuli modalities were presented for 70 s either alone or combined (congruent) in a counterbalanced and random order. Electroencephalogram (EEG) Alpha-Power-Density, which is inversely related to neural electrical activity, in 30 scalp electrodes from 24 right-handed healthy female subjects, was recorded. In addition, heart rate (HR), skin conductance responses (SCR), respiration, temperature and psychometrical ratings were collected. Results showed that the experienced quality of the presented emotions was most accurate in the combined conditions, intermediate in the picture conditions and lowest in the sound conditions. Furthermore, both the psychometrical ratings and the physiological involvement measurements (SCR, HR, Respiration) were significantly increased in the combined and sound conditions compared to the picture conditions. Finally, repeated measures ANOVA revealed the largest Alpha-Power-Density for the sound conditions, intermediate for the picture conditions, and lowest for the combined conditions, indicating the strongest activation in the combined conditions in a distributed emotion and arousal network comprising frontal, temporal, parietal and occipital neural structures. Summing up, these findings demonstrate that music can markedly enhance the emotional experience evoked by affective pictures.

  4. Selecting fillers on emotional appearance improves lineup identification accuracy.

    Science.gov (United States)

    Flowe, Heather D; Klatt, Thimna; Colloff, Melissa F

    2014-12-01

    Mock witnesses sometimes report using criminal stereotypes to identify a face from a lineup, a tendency known as criminal face bias. Faces are perceived as criminal-looking if they appear angry. We tested whether matching the emotional appearance of the fillers to an angry suspect can reduce criminal face bias. In Study 1, mock witnesses (n = 226) viewed lineups in which the suspect had an angry, happy, or neutral expression, and we varied whether the fillers matched the expression. An additional group of participants (n = 59) rated the faces on criminal and emotional appearance. As predicted, mock witnesses tended to identify suspects who appeared angrier and more criminal-looking than the fillers. This tendency was reduced when the lineup fillers matched the emotional appearance of the suspect. Study 2 extended the results, testing whether the emotional appearance of the suspect and fillers affects recognition memory. Participants (n = 1,983) studied faces and took a lineup test in which the emotional appearance of the target and fillers was varied between subjects. Discrimination accuracy was enhanced when the fillers matched an angry target's emotional appearance. We conclude that lineup member emotional appearance plays a critical role in the psychology of lineup identification. The fillers should match an angry suspect's emotional appearance to improve lineup identification accuracy.

  5. Emotionally Speaking

    DEFF Research Database (Denmark)

    Lippke, Lena; Murphy, Kristin Marie

    In this presentation, we demonstrate how qualitative methods help us to explore barriers and facilitators to teacher emotional well-being in school settings serving students identified as being at-risk in the US and Denmark. Through a hermeneutical lens (Gadamer, 2004) and collective narrative (Richardson, 1987), we begin by identifying each country's educational model, teacher preparation and teacher support for serving at-risk students. Next, we explore the experience of teachers working with students defined as being at-risk. Inspired by the concept of emotional labor (Hochschild, 1983), we … and their relationships with colleagues and students? Finally, what can the two groups of teachers learn from each other's experiences?

  6. Towards Designing Android Faces after Actual Humans

    DEFF Research Database (Denmark)

    Vlachos, Evgenios; Schärfe, Henrik

    2015-01-01

    Using their face as their prior affective interface, android robots and other agents embody emotional facial expressions, and convey messages on their identity, gender, age, race, and attractiveness. We are examining whether androids can convey emotionally relevant information via their static facial signals, just as humans do. Based on the fact that social information can be accurately identified from still images of nonexpressive unknown faces, a judgment paradigm was employed to discover and compare the style of facial expressions of the Geminoid-DK android (modeled after an actual …) … initially made for the Original, suggesting that androids inherit the same style of facial expression as their originals. Our findings support the case of designing android faces after specific actual persons who portray facial features that are familiar to the users, and also relevant to the notion …

  7. The influence of indirect and direct emotional processing on memory for facial expressions.

    Science.gov (United States)

    Patel, Ronak; Girard, Todd A; Green, Robin E A

    2012-01-01

    We used the remember-know procedure (Tulving, 1985) to test the behavioural expression of memory following indirect and direct forms of emotional processing at encoding. Participants (N=32) viewed a series of facial expressions (happy, fearful, angry, and neutral) while performing tasks involving either indirect (gender discrimination) or direct (emotion discrimination) emotion processing. After a delay, participants completed a surprise recognition memory test. Our results revealed that indirect encoding of emotion produced enhanced memory for fearful faces whereas direct encoding of emotion produced enhanced memory for angry faces. In contrast, happy faces were better remembered than neutral faces after both indirect and direct encoding tasks. These findings suggest that fearful and angry faces benefit from a recollective advantage when they are encoded in a way that is consistent with the predictive nature of their threat. We propose that the broad memory advantage for happy faces may reflect a form of cognitive flexibility that is specific to positive emotions.

  8. Emotional modulation of touch in alexithymia.

    Science.gov (United States)

    Scarpazza, Cristina; di Pellegrino, Giuseppe; Làdavas, Elisabetta

    2014-06-01

    Alexithymia refers to difficulties in recognizing one's own emotions, but difficulties have also been found in the recognition of others' emotions, particularly when the task is not easy. Previous research has demonstrated that, in order to understand other peoples' feelings, observers remap the observed emotion onto their own sensory systems. The aim of the present study was to investigate the ability of high and low alexithymic subjects to remap the emotional expressions of others onto their own somatosensory systems using an indirect task. We used the emotional Visual Remapping of Touch (eVRT) paradigm, in which seeing a face being touched improves detection of near-threshold tactile stimulation concurrently delivered to one's own face. In eVRT, subjects' performance is influenced by the emotional content of the stimuli while they are required to distinguish between unilateral and bilateral tactile stimulation on their own cheeks. The results show that tactile perception was enhanced when viewing touch on a fearful face compared with viewing touch on other expressions in low but not in high alexithymic participants. A negative correlation between the TAS-20 alexithymia subscale ("difficulty identifying feelings") and the magnitude of the eVRT effect was also found. Conversely, arousal and valence ratings of emotional faces did not vary as a function of the degree of alexithymia. The results provide evidence that alexithymia is associated with difficulties in remapping seen emotions, particularly fear, onto one's own sensory system. This impairment could be due to an inability to modulate somatosensory system activity according to the observed emotional expression.

  9. Adolescents’ emotional competence is associated with parents’ neural sensitivity to emotions

    Directory of Open Access Journals (Sweden)

    Eva H Telzer

    2014-07-01

    Full Text Available An essential component of youths’ successful development is learning to appropriately respond to emotions, including the ability to recognize, identify, and describe one’s feelings. Such emotional competence is thought to arise through the parent-child relationship. Yet, the mechanisms by which parents transmit emotional competence to their children are difficult to measure because they are often implicit, idiosyncratic, and not easily articulated by parents or children. In the current study, we used a multifaceted approach that went beyond self-report measures and examined whether parental neural sensitivity to emotions predicted their child’s emotional competence. Twenty-two adolescent-parent dyads completed an fMRI scan during which they labeled the emotional expressions of negatively valenced faces. Results indicate that parents who recruited the amygdala, VLPFC, and brain regions involved in mentalizing (i.e., inferring others’ emotional states) had adolescent children with greater emotional competence. These results held after controlling for parents’ self-reports of emotional expressivity and adolescents’ self-reports of the warmth and support of their parent relationships. In addition, adolescents recruited neural regions involved in mentalizing during affect labeling, which significantly mediated the association between parental neural sensitivity and adolescents’ emotional competence, suggesting that youth are modeling or referencing their parents’ emotional profiles, thereby contributing to better emotional competence.

  10. Adolescents' emotional competence is associated with parents' neural sensitivity to emotions.

    Science.gov (United States)

    Telzer, Eva H; Qu, Yang; Goldenberg, Diane; Fuligni, Andrew J; Galván, Adriana; Lieberman, Matthew D

    2014-01-01

    An essential component of youths' successful development is learning to appropriately respond to emotions, including the ability to recognize, identify, and describe one's feelings. Such emotional competence is thought to arise through the parent-child relationship. Yet, the mechanisms by which parents transmit emotional competence to their children are difficult to measure because they are often implicit, idiosyncratic, and not easily articulated by parents or children. In the current study, we used a multifaceted approach that went beyond self-report measures and examined whether parental neural sensitivity to emotions predicted their child's emotional competence. Twenty-two adolescent-parent dyads completed an fMRI scan during which they labeled the emotional expressions of negatively valenced faces. Results indicate that parents who recruited the amygdala, VLPFC, and brain regions involved in mentalizing (i.e., inferring others' emotional states) had adolescent children with greater emotional competence. These results held after controlling for parents' self-reports of emotional expressivity and adolescents' self-reports of the warmth and support of their parent relationships. In addition, adolescents recruited neural regions involved in mentalizing during affect labeling, which significantly mediated the association between parental neural sensitivity and adolescents' emotional competence, suggesting that youth are modeling or referencing their parents' emotional profiles, thereby contributing to better emotional competence.

  11. Task relevance regulates the interaction between reward expectation and emotion.

    Science.gov (United States)

    Wei, Ping; Kang, Guanlan

    2014-06-01

    In the present study, we investigated the impact of reward expectation on the processing of emotional facial expression using a cue-target paradigm. A cue indicating the reward condition of each trial (incentive vs. non-incentive) was followed by the presentation of a picture of an emotional face, the target. Participants were asked to discriminate the emotional expression of the target face in Experiment 1, to discriminate the gender of the target face in Experiment 2, and to judge a number superimposed on the center of the target face as even or odd in Experiment 3, rendering the emotional expression of the target face as task relevant in Experiment 1 but task irrelevant in Experiments 2 and 3. Faster reaction times (RTs) were observed in the monetary incentive condition than in the non-incentive condition, demonstrating the effect of reward on facilitating task concentration. Moreover, the reward effect (i.e., RTs in non-incentive conditions versus incentive conditions) was larger for emotional faces than for neutral faces when emotional expression was task relevant but not when it was task irrelevant. The findings suggest that top-down incentive motivation biased attentional processing toward task-relevant stimuli, and that task relevance played an important role in regulating the influence of reward expectation on the processing of emotional stimuli.

  12. Time perception and dynamics of facial expressions of emotions.

    Directory of Open Access Journals (Sweden)

    Sophie L Fayolle

    Full Text Available Two experiments were run to examine the effects of dynamic displays of facial expressions of emotions on time judgments. The participants were given a temporal bisection task with emotional facial expressions presented in a dynamic or a static display. Two emotional facial expressions and a neutral expression were tested and compared. Each of the emotional expressions had the same affective valence (unpleasant), but one was high-arousing (expressing anger) and the other low-arousing (expressing sadness). Our results showed that time judgments are highly sensitive to movements in facial expressions and the emotions expressed. Indeed, longer perceived durations were found in response to the dynamic faces and the high-arousing emotional expressions compared to the static faces and low-arousing expressions. In addition, the facial movements amplified the effect of emotions on time perception. Dynamic facial expressions are thus interesting tools for examining variations in temporal judgments in different social contexts.
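In a temporal bisection task, the key summary statistic is the bisection (indifference) point: the probe duration judged "long" half the time. Longer perceived durations shift the psychometric curve left, lowering that point. A sketch with hypothetical response proportions (not the study's data):

```python
import numpy as np

def bisection_point(durations, p_long):
    """Duration at which p('long') crosses 0.5, by linear interpolation."""
    return float(np.interp(0.5, p_long, durations))

# Hypothetical psychometric data: p('long') for each probe duration (ms).
durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])
p_static = np.array([0.02, 0.10, 0.30, 0.55, 0.80, 0.93, 0.98])
# If dynamic angry faces are judged longer, the curve shifts left,
# so the bisection point decreases.
p_dynamic = np.array([0.05, 0.20, 0.45, 0.70, 0.90, 0.97, 0.99])

print(bisection_point(durations, p_dynamic) < bisection_point(durations, p_static))
```

With these illustrative numbers the static bisection point is 960 ms and the dynamic one is 840 ms, i.e., durations "feel longer" with dynamic displays.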

  13. Emotion Talk: Helping Caregivers Facilitate Emotion Understanding and Emotion Regulation

    Science.gov (United States)

    Brinton, Bonnie; Fujiki, Martin

    2011-01-01

    This article focuses on two aspects of emotional intelligence, emotion understanding and emotion regulation. These abilities are important because of their impact on social communication and the way in which they influence a child's access to knowledge. Caregivers who engage their children in emotion talk may strengthen the ability of their…

  14. Emotional collectives : How groups shape emotions and emotions shape groups

    NARCIS (Netherlands)

    van Kleef, G.A.; Fischer, A.H.

    2016-01-01

    Group settings are epicentres of emotional activity. Yet, the role of emotions in groups is poorly understood. How do group-level phenomena shape group members’ emotional experience and expression? How are emotional expressions recognised, interpreted and shared in group settings? And how do such ex…

  15. Early Environmental Correlates of Maternal Emotion Talk.

    Science.gov (United States)

    Garrett-Peters, Patricia; Mills-Koonce, Roger; Adkins, Daniel; Vernon-Feagans, Lynne; Cox, Martha

    2008-04-01

    OBJECTIVE: The primary goal of this study was to examine contextual, child, and maternal factors that are associated with mothers' early emotion talk in an ethnically diverse, low-income sample. DESIGN: Emotion talk (positive and negative labels) was coded for 1111 mothers while engaged with their 7-month-olds in viewing an emotion-faces picture book. Infant attention during the interaction was also coded. Mothers' parenting style (positive engagement and negative intrusiveness) was coded during a dyadic free-play interaction. Demographic information was obtained, as well as maternal ratings of child temperament and mother's knowledge of infant development. RESULTS: Hierarchical regression analyses revealed that social context and maternal qualities are significant predictors of mothers' early positive and negative emotion talk. In particular, mothers who were African American, had higher income, and who showed more positive engagement when interacting with their infants demonstrated increased rates of positive and negative emotion talk with their infants. For negative emotion talk, social context variables moderated other predictors. Specifically, infant attention was positively associated with negative emotion talk only for African American mothers, and knowledge of infant development was positively associated with negative emotion talk only for non-African American mothers. The positive association between maternal positive engagement and negative emotion talk was greater for lower-income families than for higher-income families. CONCLUSIONS: Mothers' emotion language with infants is not sensitive to child factors but is associated with social contextual factors and characteristics of the mothers themselves.

  16. Facial Emotion and Identity Processing Development in 5- to 15-Year-Old Children

    OpenAIRE

    Johnston, Patrick J.; Kaufman, Jordy; Bajic, Julie; Sercombe, Alicia; Michie, Patricia T.; Karayanidis, Frini

    2011-01-01

    Most developmental studies of emotional face processing to date have focused on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. The three matching tasks were devel...

  17. Facial Emotion and Identity Processing Development in 5- to 15-Year-Old Children

    OpenAIRE

    Patrick eJohnston; Jordy eKaufman; Julie eBajic; Alicia eSercombe; Patricia eMichie; Frini eKarayanidis

    2011-01-01

    Most developmental studies of emotional face processing to date have focussed on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. The three matching tasks were de...

  18. European cinema: face to face with Hollywood

    NARCIS (Netherlands)

    T. Elsaesser

    2005-01-01

    In the face of renewed competition from Hollywood since the early 1980s and the challenges posed to Europe's national cinemas by the fall of the Wall in 1989, independent filmmaking in Europe has begun to re-invent itself. European Cinema: Face to Face with Hollywood re-assesses the different debate…

  19. The perception of emotion in body expressions.

    Science.gov (United States)

    de Gelder, B; de Borst, A W; Watson, R

    2015-01-01

    During communication, we perceive and express emotional information through many different channels, including facial expressions, prosody, body motion, and posture. Although historically the human body has been perceived primarily as a tool for actions, there is now increased understanding that the body is also an important medium for emotional expression. Indeed, research on emotional body language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-body signals are processed and understood, at the behavioral and neural levels, with specific reference to their role in emotional communication. The first part of this review outlines brain regions and spectrotemporal dynamics underlying perception of isolated neutral and affective bodies, the second part details the contextual effects on body emotion recognition, and the final part discusses body processing at a subconscious level. More specifically, research has shown that body expressions as compared with neutral bodies draw upon a larger network of regions responsible for action observation and preparation, emotion processing, body processing, and integrative processes. Results from neurotypical populations and masking paradigms suggest that subconscious processing of affective bodies relies on a specific subset of these regions. Moreover, recent evidence has shown that emotional information from the face, voice, and body all interact, with body motion and posture often highlighting and intensifying the emotion expressed in the face and voice.

  20. Mapping Teacher-Faces

    Science.gov (United States)

    Thompson, Greg; Cook, Ian

    2013-01-01

    This paper uses Deleuze and Guattari's concept of faciality to analyse the teacher's face. According to Deleuze and Guattari, the teacher-face is a special type of face because it is an "overcoded" face produced in specific landscapes. This paper suggests four limit-faces for teacher faciality that actualise different mixes of significance and…

  1. Aging and emotional expressions: is there a positivity bias during dynamic emotion recognition?

    Directory of Open Access Journals (Sweden)

    Alberto eDi Domenico

    2015-08-01

    Full Text Available In this study, we investigated whether age-related differences in emotion regulation priorities influence online dynamic emotional facial discrimination. A group of 40 younger and a group of 40 older adults were invited to recognize a positive or negative expression as soon as the expression slowly emerged and subsequently rate it in terms of intensity. Our findings show that older adults recognized happy expressions faster than angry ones, while the direction of emotional expression does not seem to affect younger adults’ performance. Furthermore, older adults rated both negative and positive emotional faces as more intense compared to younger controls. This study detects age-related differences with a dynamic online paradigm and suggests that different regulation strategies may shape emotional face recognition.

  2. Emotional Matters: Innovative software brings emotional intelligence to our digital devices.

    Science.gov (United States)

    Morsy, Ahmed

    2016-01-01

    In 1872, Charles Darwin published The Expression of the Emotions in Man and Animals, in which he argued that mammals show emotion reliably in their faces. Since then, thousands of studies have confirmed the robustness of Darwin's argument in many fields, including linguistics, semiotics, social psychology, and computer science. More interestingly, several studies, including those of renowned psychologist Paul Ekman, demonstrated that basic emotions are, indeed, universal. Affectiva, a Massachusetts Institute of Technology spinoff located in Waltham, Massachusetts, builds a variety of products that harness the two main characteristics of facial expressions (robustness and universality) to measure and analyze emotional responses.

  3. Implementing soul studio activities for oncology nurses with negative emotions when facing patients' death

    Institute of Scientific and Technical Information of China (English)

    肖柳红; 黄敏清; 宋文强

    2013-01-01

    Objective: To analyze the negative emotions of oncology nurses when facing patients' death, and to explore the effect of a soul studio in reducing nurses' negative emotions. Methods: A soul studio was established for 62 oncology nurses, and 16 mental health promotion activities were organized over four months, providing a platform for the nurses to release the negative emotions caused by patients' deaths. The Symptom Checklist-90 (SCL-90) was used to assess their mental status before and after the activities. Results: Before the activities, nurses felt depressed, fearful and helpless, and regarded themselves as unlucky; after the activities, they felt at ease, opened up and released their emotions. Scores on the SCL-90 factors of somatization, obsessive-compulsive symptoms, depression, anxiety and phobic anxiety were lower after the activities than before, and the differences were statistically significant (P<0.05 or P<0.01). Conclusion: The mental health promotion activities of the soul studio can reduce oncology nurses' negative emotions when facing patients' death and strengthen their capacity for psychological adjustment, thereby improving their mental health.

  4. Reconfigurable multiport EPON repeater

    Science.gov (United States)

    Oishi, Masayuki; Inohara, Ryo; Agata, Akira; Horiuchi, Yukio

    2009-11-01

    An extended-reach EPON repeater is one solution for effectively expanding FTTH service areas. In this paper, we propose a reconfigurable multi-port EPON repeater for effective accommodation of multiple ODNs with a single OLT line card. The proposed repeater, which has multiple ports on both the OLT and ODN sides and consists of TRs, BTRs with the CDR function, and a reconfigurable electrical matrix switch, can accommodate multiple ODNs on a single OLT line card by controlling the connections of the matrix switch. Although conventional EPON repeaters require full OLT line cards to accommodate subscribers from the initial installation stage, the proposed repeater can dramatically reduce the number of required line cards, especially when the number of subscribers is less than half of the maximum registerable users per OLT. Numerical calculation results show that an extended-reach EPON system with the proposed repeater can save 17.5% of the initial installation cost compared with a conventional repeater, and can remain less expensive than conventional systems up to the maximum number of subscribers, especially when the percentage of ODNs in lightly-populated areas is high.
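The line-card saving can be illustrated with a toy counting model: conventionally each ODN needs its own OLT line card, whereas the matrix switch lets several lightly-loaded ODNs share one card up to the OLT's registration limit. The per-OLT capacity of 64 users and the subscriber counts below are assumptions for illustration, not figures from the paper:

```python
import math

def line_cards_conventional(odns):
    """One OLT line card per ODN, regardless of how full each ODN is."""
    return len(odns)

def line_cards_reconfigurable(odns, max_users_per_olt=64):
    """Matrix switch lets lightly-loaded ODNs share line cards,
    bounded only by the assumed per-OLT registration limit."""
    total_subscribers = sum(odns)
    return max(1, math.ceil(total_subscribers / max_users_per_olt))

# Hypothetical: eight lightly-populated ODNs with 16 subscribers each.
odns = [16] * 8
print(line_cards_conventional(odns))    # 8 cards needed conventionally
print(line_cards_reconfigurable(odns))  # 2 cards with the proposed repeater
```

The gap between the two counts is largest exactly in the regime the abstract describes: many ODNs, each well under half full.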

  5. Revisiting the TALE repeat.

    Science.gov (United States)

    Deng, Dong; Yan, Chuangye; Wu, Jianping; Pan, Xiaojing; Yan, Nieng

    2014-04-01

    Transcription activator-like (TAL) effectors specifically bind to double-stranded (ds) DNA through a central domain of tandem repeats. Each TAL effector (TALE) repeat comprises 33-35 amino acids and recognizes one specific DNA base through a highly variable residue at a fixed position in the repeat. Structural studies have revealed the molecular basis of DNA recognition by TALE repeats. Examination of the overall structure reveals that the basic building block of the TALE protein, namely a helical hairpin, is one helix shifted from the previously defined TALE motif. Here we suggest a structure-based re-demarcation of the TALE repeat, which starts with the residues that bind to the DNA backbone phosphate and concludes with the base-recognition hyper-variable residue. This new numbering system is consistent with the α-solenoid superfamily to which TALE belongs, and reflects the structural integrity of TAL effectors. In addition, it yields an integral number of TALE repeats that matches the number of bound DNA bases. We then present fifteen crystal structures of engineered dHax3 variants in complex with target DNA molecules, which elucidate the structural basis for the recognition of the bases adenine (A) and guanine (G) by reported or uncharacterized TALE codes. Finally, we analyzed the sequence-structure correlation of the amino acid residues within a TALE repeat. The structural analyses reported here may advance the mechanistic understanding of TALE proteins and facilitate the design of TALENs with improved affinity and specificity.
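The one-repeat-per-base correspondence can be sketched with the canonical TALE cipher (NI→A, HD→C, NN→G, NG→T; note that NN also tolerates A), mapping an array of repeat hyper-variable residues (RVDs) to its predicted DNA target:

```python
# Canonical TALE cipher: the hyper-variable residues (RVDs) of each
# repeat determine which DNA base that repeat binds.  NN is shown as G
# here, although it also recognizes A with lower specificity.
RVD_CODE = {"NI": "A", "HD": "C", "NN": "G", "NG": "T"}

def target_sequence(rvds):
    """Predict the DNA target of a TALE repeat array, one base per repeat."""
    return "".join(RVD_CODE[rvd] for rvd in rvds)

repeats = ["NI", "HD", "NG", "NN", "NG", "HD"]
print(target_sequence(repeats))  # ACTGTC
```

The re-demarcation proposed in the paper does not change this cipher; it only shifts where each repeat is considered to begin and end, so that the repeat count equals the number of bound bases.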

  6. Females excel at basic face perception.

    Science.gov (United States)

    McBain, Ryan; Norton, Dan; Chen, Yue

    2009-02-01

    Females are generally better than males at recognizing facial emotions. However, it is not entirely clear whether and in what way females may also excel at non-affective face recognition. Here, we tested males and females on two perceptual face recognition tasks that involved only neutral expressions: detection and identity discrimination. On face detection (Experiment 1), females were significantly more accurate than males in detecting upright faces. This gender difference was reduced during inverted face detection, and not present during tree detection, suggesting that the magnitude of the gender difference for performance co-varies with the extent to which face processing mechanisms are involved. On facial identity discrimination (Experiment 2), females again outperformed males, particularly when face images were masked by visual noise, or the delay between comparison face images was extended from 0.5 to 3 s. These results reveal a female advantage in processing face-specific information and underscore the role of perceptual factors in socially relevant gender differences.

  7. Nonverbal channel use in communication of emotion: how may depend on why.

    Science.gov (United States)

    App, Betsy; McIntosh, Daniel N; Reed, Catherine L; Hertenstein, Matthew J

    2011-06-01

    This study investigated the hypothesis that different emotions are most effectively conveyed through specific, nonverbal channels of communication: body, face, and touch. Experiment 1 assessed the production of emotion displays. Participants generated nonverbal displays of 11 emotions, with and without channel restrictions. For both actual production and stated preferences, participants favored the body for embarrassment, guilt, pride, and shame; the face for anger, disgust, fear, happiness, and sadness; and touch for love and sympathy. When restricted to a single channel, participants were most confident about their communication when production was limited to the emotion's preferred channel. Experiment 2 examined the reception or identification of emotion displays. Participants viewed videos of emotions communicated in unrestricted and restricted conditions and identified the communicated emotions. Emotion identification in restricted conditions was most accurate when participants viewed emotions displayed via the emotion's preferred channel. This study provides converging evidence that some emotions are communicated predominantly through different nonverbal channels. Further analysis of these channel-emotion correspondences suggests that the social function of an emotion predicts its primary channel: The body channel promotes social-status emotions, the face channel supports survival emotions, and touch supports intimate emotions.

  8. Race, emotion and trust: an ERP study.

    Science.gov (United States)

    Tortosa, María I; Lupiáñez, Juan; Ruz, María

    2013-02-04

    Faces contain certain cues that can be used to infer the intentions of other people and to formulate beliefs about them. The present study explored the extent to which the race of the partners and their emotional facial expressions influenced participants' decision-making in a Trust Game where race and emotional expression had no actual predictive value regarding the partners' reciprocation rate. Behaviourally, participants shared more money with happy than with angry partners. In two separate experiments, electrophysiological results showed an early interaction between race and emotion in the N170 potential and also in the subsequent P200, which suggests inter-dependent processing of those cues in a social context. Overall, our results indicate that racial and emotional cues exert both independent and also interacting effects in the processing of faces in an interpersonal context. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Age-related emotional bias in processing two emotionally valenced tasks.

    Science.gov (United States)

    Allen, Philip A; Lien, Mei-Ching; Jardin, Elliott

    2017-01-01

    Previous studies suggest that older adults process positive emotions more efficiently than negative emotions, whereas younger adults show the reverse effect. We examined whether this age-related difference in emotional bias still occurs when attention is engaged in two emotional tasks. We used a psychological refractory period paradigm and varied the emotional valence of Task 1 and Task 2. In both experiments, Task 1 was emotional face discrimination (happy vs. angry faces) and Task 2 was sound discrimination (laugh vs. punch vs. cork pop in Experiment 1, and laugh vs. scream in Experiment 2). The backward emotional correspondence effect of a positively or negatively valenced Task 2 on Task 1 was measured. In both experiments, younger adults showed a backward correspondence effect from a negatively valenced Task 2, suggesting parallel processing of negatively valenced stimuli. Older adults showed a similar negativity bias in Experiment 2 with a more salient negative sound ("scream" relative to "punch"). These results are consistent with an arousal-biased competition model [Mather and Sutherland, Perspectives on Psychological Science 6:114-133, 2011], suggesting that emotional arousal modulates top-down attentional control settings (emotional regulation) with age.
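A backward correspondence effect is simply the Task 1 reaction-time cost when the Task 2 stimulus does not correspond in emotional valence; a positive value indicates that Task 2 valence was processed in parallel with Task 1. A sketch with hypothetical reaction times (illustrative only, not the study's data):

```python
from statistics import mean

def correspondence_effect(rt_corresponding, rt_noncorresponding):
    """Backward correspondence effect on Task 1 RTs (ms): mean RT on
    valence-non-corresponding trials minus valence-corresponding trials."""
    return mean(rt_noncorresponding) - mean(rt_corresponding)

# Hypothetical Task 1 RTs (ms) for angry-face trials, split by whether
# the Task 2 sound's valence corresponded to the face's valence.
younger = correspondence_effect([612, 598, 605], [648, 655, 641])
print(younger > 0)  # positive effect: Task 2 valence leaked into Task 1
```

In the actual design this difference would be computed per participant and condition and then compared across age groups.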

  10. Positive and negative emotional contexts unevenly predict episodic memory.

    Science.gov (United States)

    Martínez-Galindo, Joyce Graciela; Cansino, Selene

    2015-09-15

    The aim of this study was to investigate whether the recognition of faces with neutral expressions differs when they are encoded under different emotional contexts (positive, negative or non-emotional). The effects of the emotional valence context on the subsequent memory effect (SME) and the autonomic responses were also examined. Twenty-eight participants performed a betting-game task in which the faces of their virtual opponents were presented in each trial. The probability of winning or losing was manipulated to generate positive or negative contexts, respectively. Additionally, the participants performed the same task without betting as a non-emotional condition. After the encoding phase, an old/new paradigm was performed for the faces of the virtual opponents. The recognition was superior for the faces encoded in the positive contexts than for the faces encoded in the non-emotional contexts. The skin conductance response amplitude was equivalent for both of the emotional contexts. The N170 and P300 components at occipital sites and the frontal slow wave manifested SMEs that were modulated by positive contexts; neither negative nor non-emotional contexts influenced these effects. The behavioral and neurophysiological data demonstrated that positive contexts are stronger predictors of episodic memory than negative or non-emotional contexts.

  11. Susceptibility to emotional contagion for negative emotions improves detection of smile authenticity

    Directory of Open Access Journals (Sweden)

    Valeria eManera

    2013-03-01

    Full Text Available A smile is a context-dependent emotional expression. A smiling face can signal the experience of enjoyable emotions, but people can also smile to convince another person that enjoyment is occurring when it is not. For this reason, the ability to discriminate between felt and faked enjoyment expressions is a crucial social skill. Despite its importance, adults show remarkable individual variation in this ability. Revealing the factors responsible for these large individual differences is a key challenge in this domain. Here we investigated, on a large sample of participants, whether individual differences in smile authenticity recognition are accounted for by differences in the predisposition to experience other people’s emotions, i.e., by susceptibility to emotional contagion. Results showed that susceptibility to emotional contagion for negative emotions increased smile authenticity detection, while susceptibility to emotional contagion for positive emotions worsened detection performance, because it led participants to categorize most of the faked smiles as sincere. These findings suggest that susceptibility to